[Binary content: ustar archive (owner core:core) containing the directories `var/home/core/zuul-output/` and `var/home/core/zuul-output/logs/`, and the gzip-compressed file `var/home/core/zuul-output/logs/kubelet.log.gz`. The compressed kubelet.log payload is not recoverable as text.]
jۢ+L(2GYo2Ŗս ?eb_T6`#jF`'[9>%[pO`V)=xy?Wbqp鯰$ӯ0+: R #?u䰣]׍R 3`jd0PN$r*u15UçIÅ Å@lYPRIt&ni봷@I^N VRGIN jaO_旪t`ίxB-~:.`/?Dl7nɭB^}5Nˡ_ 6߿gu?E ]?mD7^HIB!D!\*}T9a{pR5RR¼d7՛vlq`cq :If7ěmvX %VndI*ȴDVT"@1)`#I>vylltn^V;|SuR>|K݅6%Xf=7/.NLt; lj6Kɏej Pn-A) ,92h̜u`٬f.[e8vwPЦͳQ/4MNפ,s=*kT0n`lhf/<~wM^gUWᬍy &2due6s]~}Q>jgzgƗΜ13cͺe9ޘ2S)VhyS1ګTAH va 'mۇK<]7z@݆Yr%]''n; l8#)k9 }o6fЧKwLL Txs!c}{cD:A%{frGm LT7S1G T|TGEdQ0)X+\$J!RȂ^hp4kRc#AY^"^ewS<5[6cK,+1gWWۗ>Pgр-RPC "ᯠϨ5֠-w:y8x}|9ࣣ{rɻMh aBqG›Tk͹Udž%dM':^}7+)679޾Eˤ,.alRm0Pf?J*}l _2|<'s &伆OUOlՊ/߃CB3o#fB@"J`o اVaEifUaX¨jRQeMXh-9LUT:a>W-R{9V]\2x -}j٣w,&ryQ5m^Wo|*iJA%q_NF͋̕m#I_n~w`ws/ t%ER`UDz&%[6qV]u`Ja~c`bҹVpb#VzƦme,:u~?&6`&.^u6xCo>( ~k& 6 ޫ?o7ZTK?p?${&^n_,@ R,ך/ErAJ=`uX3n:߷`śtRvӅQӅ±ff&5KZ xiG?VY\QǛkzuB{HGmLjB%]S+2 0R+Qxꣂ.~/)a)D|MW4 ,Fbw]fǩ9tf$%R?aT a 쟿nIK ƀR~R.#?^xTaIMohB?@W韑HtO]8K n2EG X`h,$f]Q'.0A<㗪8AG]]$pLb[CQIaX q]6&iJKW KVj48= %k^ &+?fnz _g~:Y )g]0K}jm.A!Z1=i]`)l%N$9IKUWwtgoPt6_$YKwMgOF{D "$rG\:8m"YYx?-Q86%>̧XӝRT8`1waʽ3ø)agB"Iu2/T tOɩq܌>j_iJúPBF)^CɍW<,7C0p?t`z sZD)O 5(O+O6&m.VO"⋅@-Oe&#QHYu2LwZ D佖豉hj4BZ":_W56:Mƃ]vТ{aÖ_YQoV̎kdkLxDsYv }`d.TɼsQzz+Jr/he]>10nQ. 
b2WWy9ͪi潢q[vzͼEj힯zz<ȖDƞLM6/AmEiV˯|.Mkw~>Ie%#Rc;Ɨ<4R*6Ȝ9rȅp4سȍo')` ʹ,ijYpbxU!=Hy>UO1 rzyч2U%'}b?ϺG,SX: -,@Q;7 qf%*Ű[)r]N5G*]ᱼJv*R+o'GkQ!bl\:+}Ԗx93~)z\eʇ g_p8n29ƁK|nӻvsttߏ!>㟧*O?Z mnlFbs`2(@ޒRMz]塇 ͋2{f'|_H+L!ٶycKV"y7U2MS٥Ru/?*иY9O^PmT4%hVkxAڹU R]SsÃPcv7<]ZT򰆅en]~D}'UO(Ξ GxK:'&&VQR1qV{tHbaJ -/露ߵ`A7U:_˟E3`M]t,I-cMUG]uY+M7^\I f\{%s`sh1fv8kObk}iuؼcg-"kY"Mwl5…sc̱|> c}mvMj`(G+Q]X'rض±>,IՖrФ؄&-\ MjB5_h) ÏU}tYlW(60aɽbLPVbklT[aB7~G]z,7,V`(UJ6JMԀXo,,l%7x Ųi i4c&\n ښ3Ln J+̻)>cp@~*D]\%(k+{qzĕ `*=o83*چyp|GfxErX ;jX:֜0֥k-n!q0EXfw#"y>& I Yw_(wy mjmvYr]n"^(F \mC6@ jI YEDٝ˥,j|,J=0Fs?鹸 Zw_n=DR(\\=`[k$a4aPR⽸zcI٩J r6*+Ź]\^\BqE愝J z6*+鹈]\ zq Ř{8xF$O_x CnN]m42VEрmP9  ߂v&fn 6JJqp(f9fH*D"*l 7(;(p;r&^Jhp4Q2Hn$q(h umjnx(Zh!JhvU x%Ls6G$kJMθ uL05FqQ`*8U8eoePQ$'\jb@ MÊau qL#b"j+\roqT=0sm#C*A$f4F rEiI ,zcR(jp!` z[=T翆xl5_-&[8NTڭ6$Ԥ EtdHQ9$Rr̙{ ,i%9L2;"!I8V`@o e(mAKJVzLI9n 4gЁ?%7i7X?pȲoUt&o.Vr$JqTQܦ$H4,bJc1rbZ H-:o9Oo{>;h[π?]mgӕP͇އ}"6ンv[IF7 Hdח9Ԗ'z<UX-zgMu}&ye4zl5ͭ.754ɛYNY̙υJ?0B7+ 6Ky.dt:Q. b'w0:z@hv`np`cF0$|}7jQ3FTy|՚<ȖDƞL8yW;Aڪ.1IZ~5t+u0&_sJ`i#u'."O_1 ?`*\Mjt.jZ%/Ծ iumMN>^y(@6$e(W1p3-29`i=(9ig12A:@8T( *efq rz`4,;dƫ> N~. Dܦt=Sx`t -,@Q;7 qf%*Ű[)r]N5g$*#]Dṍ:EөHy[T"GkQ!bl\&&}ԖxZD`N衞rZq9Radv7sl7_s{2*@,US K(Ir yI St$LEt!"'U2`@e_E.qVq"eX͎Y"BN@k c}904 ycZad`(QHLd$&F%;ϩZNmaA.[`P-$%sٔHMRVqʀe٬>JԮ3! 堅 BiuVCMӚ5MۧA)*tub6[qBJ;\3Ic vb ޒ$?H A:TtbhiM B. *,K1N$d Xʔ-rl9o̩%=V |l%w=~Vl,ILT$6+ߠޑFᢿToz 㵸wV:'c>x'0-ǃYJ0Hhq)̸ f|LʥwSdL*P\l_<-mG"?DWsnFIֱ=[tkXN 5 ypz`xgU)ksI3' ֜ˠCmˁVqS]̚FQ%Q=@=;[I9̱01<JpAr8 OuglD+tRbGV!6M$gRXXޜ Vnp3R㕧0/XX Q8Z/KܬBBhapz.Zt 9[Jҧ#}cZak !ŋDP[|_I\O\OFmH!Na~KKY*{/%œ'%yZXMs}\fLi feC]s!W'QbP!F)") [E .QWqTϴgR g^CVV;PN(F-\}0P̓p'gDa:i\H\VKE/:^ x3^ R^NsFu*O@I =旫"s;ozsQy1 >Km̱,|୍(bV";o|Kԩ*+?4Uk1oq`fɜ$s5$t0r^ s+hHj0KWE Z" dr{? 
_dLw ̓' h$VK rh~6>t?!φ<"9{uA҄K^FƭAcA*c9u+6lO1ԻQ.Hk>N1uCLS-5 ^!'"LO8n]U [Os5RHb8 G FqͶC-Up|s4UtiH1Ht,ۨ2>8 \܃_iZ5KBr.=p)uaDFe1^ =Tˌ+xabR~>bŻ䖏κ\|FW~s($6W{]E:bU8STMӛԛST*|SU d&—풌n|zQ-bmtY%?57\M1b5wY7 ЫGVVgPӉ1 Wml-6l +5ָE/o'6\i[NG[6m!q1ՍZFr37L˞o #xd=ߓ`cG2=o><0AU{KS駛5q7X"b&311];n{h_ղS*-le5IC_>PYQԤy 'l@/_nj=š-ת$^7vc[bԅUVQ7 EvD0QkЎYG4 #>"AU|T"D/vSpj!%BHu),/l2d yV7̽kUnpswYXCCM=VB EY5  J.ko5`mDb ;KVkU#U_$h+r+>k!X %&q BBL%`Fd'-Sy8ہla[ Ϟ]AgU%6SE[ܪ8?;Ɇ==4:럕mRpt['ji褥ȭCN&ŝq)o{]=toKЋjK(v~|͑\[2qYL?fvȷo7g*REy,q"d$iעkK}Gc̖gML@psN*5 l4ĐuUh62R:Yb>2dUf 5ʁSVr2P6]\u׻9&ZxзӛbI}ɦC#˛k gdlGܳ*W| ExJCo- *h+}! qv<ϋCs3#5\h<h4jS4~ϳ_nnj4 Z!˄#(Ja&t6mTm(,ye ҆5,(S+V! EeUۦNɥdoM)z`༷ 1iVg&rd.{g+l۬6kw2l?}7.^,m2kN;pKEYDA_a= "-wo}{7WzX;[@ rfȝ+@h{2!ǼVD̮X)dFֹ]}qkT˙y0pEzAXn%oѥ7a-+ԝeh~/AYkd6<;PX"skHZ垹3F1:!ӯ4eFf'Z;M @ & E$m썲J+y8 |1e!̓Vˉbǽd:zcZ`+˂>7 ɯ*fkvՇ`*Im2'6g,sT6*eLIٱ("WTUP%`A u͉ ZB6$%+x`!x4RLyԄjٮr9˖;^58B"'Kq t]TV b8)pz\^uUWwScS#&f8 )k<Mt)˸~;{ʛvQsz_cɋ;Zq'?&a: OFÇ Ȁ#,`EN\. jnΈMW2 XV{[.4,K]F2@)A$ 'n½84 &Oڹ t;1kJ$A0f"j(>H*)E2! 堅 j؀iZӴiԴְ=(EX0PYT&{ 8N {2Ją?3q,o =.|~ ?>G&uq*oBtȾ-jߋ~6\#8ۿmv0w i.$)E72E|˅OWNϺqi8O'ot_z5\xM%ϭel_z<{Q6s~P?-bv3_o˶8)vZu0!}A_끩U-ں96tt$tk{CG]?$ܷ5ޏC—@@2ߍj$Uԓ' h$VK rpaR]|_l^{Cc浛kEށv'}trmt NmJ: yHzy1Zjm3%BOYEb.ZO\OSs5P֮kkto@fې֑N[9069}4 H)g!yZVDs`k΍b O4k|qpCfQ,*SB\*kp&PUe2#sV%˘18 8=l799qݑAH|́\m85x-zz\ t灨sg_6,b8+<4Tiw3߸Q4ڊ ]I*eF 3hLrOLro6K%kUb~˧0/sV塇Q_xc/K[۫KG ӫ^B:*QrD HVaC %b 9^8J-̲ 6 s$ c\Q>,449dlԸB ;>5:("tA=,pEu7Xs{LQS;f9;. <uu\^jG8M׊g/Tܼޗ{s-7+Ϝ q>я2gXLjK!(GiI%y͆Q*'pWU,Tt4X"p}h%*J::LBA]Ko#Ir+œs:ߏ|0v'c8ͦ,a;XHJŇEAdEfia%.&~%~B]oMnӡk8#XÔi`sОCcOg#Zz7Vz#R݌YڈƮ]fBf h豰KgRRUD? 
"cz֥1"Y 0hS!F| ½AU ESOhz{Hf/^ wf悻i-L9nkuN}'ո >{ԛJ˖DYeUo\ ~L )?6 ,_l;}noG"xu_0z滋hfU̪S̭Ⱥjh^}:bU8R{p;ٕ(Z6O~a`#,mKظݠ7Ps?Erd嶶{MoDm"G j3:t+7ѡKV"tc{D;A?v_͚=3ܝN,P7p`rl7LP7j1b0&sF\&UߺǩVGPiYg!Jފ>PqEQ(')o{@/&ShHZRE8KIL4 %'h<8i%)FjP}h>.=PA4_ߛ"yC%C^ lx[5jZt b8Sì>~S$oU\Yb,Yąy Vx,u[Qy+."I¤`Ĝ9cʄjNEp:E,6y9DBXa ZB3ccIĉL O'O9 Tg?~ 6,0Wluru۩!jnYKO6aY/^ 11?XƍW3&Y$i,52HL||<90{XrI1F6; J{ ̤138Hl0XQsLaB+V7W{?O4ySc~uBFh 6e/Z **a8O(uZ*Pc!+V[]vJn*M!;{[n fWmo@,@VXѐkeq% VVK'RJɵ ֆ+ j4*F5o@Je5WWc䁏^ /*󣧜xs~po7 fow:o@0~ yv3{3f8i5>=JJeYy̾*4e#2WY`9s=7CQ.\e)s0 ܩbD*{VGcZqXU [,%\}@sŘ\eј,֗n@JI5W\ B(y}M=Ͽ?nqj.;ZlX ;m%Ӡŗ?tX Ͽ9-!Mj=ҁI$VPop(\\y څͫ#rL.YLoV<XI0"%p6S',\* !ML֧$LV$9"\3i,**{/F#(MYWd0/qt& @נHDjC:pub*t=W [:5+h}O[w2>ָ B7knOq.+xY_#`wߋ4giO› nݎ˃ WСŊÇ 9D;'rƠSV drFe,&~X=i~ j?6}!2_y>ui?Nߧ'`5j8H(fQF`RyɝAQr%$M4|L+]"ڠuR5T;-\'fx(SK<+(!`]H"`QGT֖а[~UǪ]M03bTiʊG%JbV:ֳ^5jZմ!5m6R̦Fvet3p#L c:fi:f)֑s~[\z?&l~V(06|D<:HxBcp%*2nNE]p_#fޅ5^4(%W %25$2gt:%"xbAsX ᝄ+t}R'kջFdc z ǺOKz-].=ݴ7!>Li/\?oGfۘE!3֡,kưVeZ(*5+]PY @]œ1<%6A igi%62nu4[mMr=)Z/AǶg@3OA|~oUKwuS,>Y)Du.\Bz6'D}n~7  FZ9$Q*PHrHr % -lx0/;,-&C/v mej, ? o//4)"<*% P0͌NADj! 3/3uK2|+hPl.@[C303Rc)zZd7A5tJI,f ye&sha=mw5cxB$֙@YJRƃwQOl[vB'M:%irLx캎>O#(q^X!NxBf5IehNE\X 1gX!"Zg$6ISnhn%>b @PQ),qD9"x d9VbW'5d]hL>A{g=ճGoFmGaqK] ÷$~ ϭ =vJ4RVD2Vk9EmU*Fh \՝,A[pKc6sx1 8%IDLw(124TYHL㉻ 6KsQq ygL2OZ#RHӮ4%j)qll_Ȫ[ȉcސq Mf\q 7:NJ΄( `a?$3j}/9`d9Z%@nVԡD#Eဌy)'CyݿZrg5-k.p#cH bÊ0I ߧWJF7NWZ!ՙ ƈ3@*۸@?f\c!nDƆQMSk"\HO=[L(C)cz]][N#]ζ"3!-g峉jDlY\&DV6&*'*u)L1 UI Yf yy?hƼ*൤)}S4b+U/u'`;\\p7M.6:Ē⫛ZF-d|;eb5n7Wqu= /QalY՛77*ȶgz*[rd~ʏ 0%˗!Ϊ#v_+EȦe^j+Y*s+ڣWάXU)NCU'^MԳ^%Nv:ʥ_@! y6.t7% qQ%Y-j^.Q`G5^-{F]}3:4;f*X}qs'hJtaجs1jϑ,Rn&U F_{:?] ƄÔbhD[8?w|w_^S*9,D tR[(E5a8-qR`o^ٻEdWPٝH;cVڛdXyn WOdQ@ASP EeY %23._DdFLNhHZRE8KI)+iK,0OHy %q2JSu[9'1joQgፕHxi;0mGjN^EtR .^w'Nt2HJ]9rn۹@q ȡ;9H!S. 
q,q䥣#?޾t=;tK%=i?$Ux?9jK8'hJUK5V,seE:J$ ֥Wuٔ0\QJVG*+舗D zyFi21| PpTcu؊ O])tA?r W:9]'rVҴtX7>LRI}@y(K xTYą]@.X %ҝju}δXI42!pSB shN81r0x- 뙱1apm'2G\x=qh}ysnVHyE\a0WvnZdFzb.~{{Dbx1SVz~u`ǽ z]XOunyѿS9Lo1۲C')Tyq gZ gfGb?KS.?]]\\yUV2@{B5'4}>8ÜAsh^(t_K߶>?&G堊`fgNT1sjnD\ KP$ݍFL\˃v"zjo]!˩|\l=$໕=`ePUnU6t|\ X.t`Bh/iˉfr%bg*i|.d]G%eN=,_Q>\fݕ u_ 3m\*@6,jÙ/:?Wc6o~".?ةqK'rY4Msa~#۵Y FDTk9( >qar"@"!)pi%w):#(g޼Eɩd)DKqꠉT/Cd`L$!2;Tyc^,Ki|x );i3A!ˬ!;o1o&gL/ ӓVȔV$2X4BIƢ67,_A ^<0MVa S9:8bU`If},7P[6ƜGM5̰Tkκ ۋʰ1Uej}"vpøF%q4o[ɕb٠iqL? R޿j.GU&Cj![&}pz#غ͏OàWnUq:qV\Q+ktWݰ+܅+z¨=v[:󺨧eV{/; /X?x*;:eVըt2ezbˬP/+ۆzHh}@a*dE3rQ[$bv |ڞc,3o"iBI$=V%b7oh$Z5!s&bbIb."D1Ab s F8Bir[Vt]!ˠfc!4ά7,e(P;>w?qxOOtvVƱCG'BG \Jz4t)ARHzqFZ%ļƺWzR#]m7]5-T1+pjGRRqaI R9<Ԥw^R>qw]|Rqu(CYM35lׁMs/ tUmN҄)AM e_OV\ؼ3B"gC1Yn׌a p㡋 }~qMSmGorEQ[S~3z)fNQ g6Ġ4I'> !(鉡ɢBVӞSǂi)hjBe1ZN,X[i99Ӟz4B`oF/%] vwZqA4CJ<]SڼNYx[۝l (htbB(S;Cf |(K3_*)4:9ߪ{CDg !Kld#jkG/iOu(;KąmY[#goY/@sˡTkaz?] ׄ,^5j%1rܩW&˷'{ hjp,gy"RNIeZoh~$I-r]6[ ˭y9]ho~薠?>K^ھn_ |il1.*bʃRyށ{E(6L3#SZp d >ެI.ϜJ`v%5 v.̉\EPT*lŔr2Zre.3+sKTqn JMjdתxϥY&xƄ$"(ªH0x -uFb4!SJNmBLOL A !>Q),qD9"x d9,ĭ[yB%A:*MW.ڱGgeCMtO.Zt[k>K`ktӄre-w])|e;?Em@>O?1+5OG_Jg< 7kJMra}Nշ.&~ӿsہ|t~ Os }^|vKTtY'7ܜ-0]F`;Y& \U2ows2 bY)]5|:wA2;tYwZ#W7'N(Y(N&aa֫ M.dRۆwn7pa'ْmCe./zMSvZZt侥Mkz5m2s3I53豏o pI(G5ŽҜ۟ &;zF:gE|2AG&p72AU{qL`Lx'])Il-/#;BDN:/zE:U&]xRअ+oM[ShHZRE8KI\N/iK,0OHy %q2JSu{[ك9'joS gፕBxS;0mkJ~&EtR/^w[O՞bL[`rS,ha& _ٻ߶#K[h (ܵ9zܡ40i)G8gDI)Krh$~wܙٙ66ɤR3Bn;L wFn#nS!EoePQ$I4+cFphJqTQܦ*:2,bʸ#aREbD hl&f :_d+x"dk|e2<5e)O6aX3}>:y I9ho^ʣm9\!r -%& }>Sqa"6JJqp(f9fH*1y0 #;(p!lC6q8x&E35γ%7OJr S121ҡA%Fv덱['=Ȁ1ʨNy[Q=RT?M˯)>ӭ~ƒ (v!3 |&FOV5uyG4L|:̙&7 Ki7u,*+Y ^|Nnkw1O>~6h<h ־Џ\^E r] j!*ט\+VXE-;wG0B+c.c#AiPa0BYqK0P-LG eZ30{-#chnDZlM-}um5>zM0b)a3,'o!b7ܐdJ(R'LpτivíYx[nG=,+tcΖ%!׊6oMB_ Ā.-MN3쒮WgT3YF};uٝgj{xk799P=@,7ouج%{D,6 O.]mDIl>YWSS v !yqCqsV"u'. 
kpOo |H5,T.yYcʷw~6|x.P_w8qz'-cvj8)L2@/* Yj% L/Wo^I7T'p Q7 }pn֗.kj8q'Oۉ `lCmf"y|o>KWL:  rl|_@nheio.s,ј= SQvìg_=ӯ|t_[0ZţR3\<0Ny;%^갤*KlI'Ҳ$X L-?/ /\>sΫk /cPNbXXHW55lJbI,'X޺.z0{s&y0%ԩc s&m[5QT.݉$C+]W]|;8"QJ+m0X}D^J+|PFˀYW  c |BHXG"\JlB Oc* ҖR@2µzqyPkl2.yrZ(wC״erzMDXa'2: e1L[nK&X{Ƙ"1w=K4/P1+-V('7@h|nÁdP'"0R")}Ǥ"9̰4N+!ʓܜl(R} PqwZ ~ ʺ4b"@ABS9RP#c$!p%hZT&"APn?Դ"iBc8B98R2҈rsN,E;FY*xH*.7Å7D5jN\:ސpfz}?vIK lp_m 1 ؒϱYEI[_^$/LgNəA}%i,*(b%:<`eΰ:S`)"gf.;3enl9 }&?@@`lE-6`MH\:$UX). ?,萞\EG%1尿3՝_T%ٗUŋ:+|>gnW=Ba1}axsUZ)^̠T5n|-` rց﹩nK짻 ]t) aXē+~fI+Tq3?H}4ތ7limmMR35<{tzW$Ƿ߿W7{xo߼O^FLM"hh~}|?^uᶚ&MSŶiZuoP[ߗ޲ Q vM~im#kpiK.lS ƌd&c8YyїZ/%AF=iRY&iNMkʫۿ&uQ猊ʃ% ȸ(S`t-!BHS{G&0T1zH}fbIT38 64`b6Y-1BR2=St"QimL~NCO,b.0VDs[f>&Ç8_O-Z>!;x9ise(Q]͛.PXSrDReN) IHI:U0Lye`QGC*my 4aʽ:aFhNPƖbkl]~̧K.=2;CSf8gpSS_ךuI}`"@*#}˜@Þ+6djB`Ɯ r) B T{ͰpM׍[\q*eKo/K<_u68;o Z@[\:5+?k e & ¼&BPl0A*1j"Pݴ((@k}x%9|#hݝmGaDa8W{?P̕:?+M3۾)B._ mQ;DJos詤QQoxGj:_|?R_@*%N5X!c42,aS%FY. j:0L$Z*bEF,NJ5,!CԇbiZS!.ȟ./1 G$`F`LՆ #Q 3Ӫc"{&CD{`C)1I EDpC@G,s$#-+qklV)kT 9idmXk+>4,>,Inʶ.rn[:z *CzKă/ȹUv9痣qX̵Q*g er 'RRL:⺂NwbONvip,sjfRO''s<*̬RkΑ(ҮdN;C!fH YmiwVkSJxR@]RHm}b =Tü*>)dl~SbġaFٰw7N:::RH}BA`ɔO⎷B$^zd]P#72{沀f)O7ks.PzX>tK3|%gSӄut{ HGxO76c*Wm'..߮,vaC$|S"u ߃i|eN|vm [쫒,ɉDUc:|M oT11@-TaQ˂Z(ڜ-rm|ʩdZ9:]Xsu(CYQaׁ%]m$REchI4 HX:^x4ܒHkm#9K#h7Njn CĘ"ei[>?sHCRԐ"1ve.3b\dVhNQD:·H}S j%!J3KzGQl? E SR8 Z`|4"ᴋnVТ_N6[y /DK'I YMt("0Q*]*nFEZ`1-Xhm TQetE刊` r;X2-spkl8%ϏFؔzdr_Z(KQ7fL%̘f9qP{Bʭ;>+1xt tM]!<ϣIEڝZ?*Q b+Zd3kP1s:zʆ5:i ^B=0}xY9\zӏ/81֋[.OzF fe˓z v}lJn^\}q~t\#ɲo:9J[12w2S >/S&t?{I?kѫ;%SPGנHe췄?cjU (BRHcv  ڞI\OWg~ٳ&)]Ϥt1Y~/hɏ.o6iR1Bg05FE5w\^G+ L x2O((.SЁAZ 1m6 8M(z*=Ȱ/ -J)MZAklF  By%gcP0*]2Xa[eb>0O /|٧׏m9(f `hf)1oAs߁ϮִFI)Ō8g IM6&FT#oAag2Zq8`բU}>.~nor+?\`0F&Zm8. ($78ÄXoU:6 8A꘷ QU<R/؅J+m<~#1uzi]iϖ1ķˬl)(#+B  $4u#52FWX)L^.mmn~iF҄%5pɩ>8%+%2҈rsN,E2t 1Ronkp7pu&QY {9 Qgr7?g/-D%;[\Io9)dIY>F[Ÿ^'+LwCK>E@_2y <ޒ:,`9=v<F"rvή䢘3 0Y[ J! 
H ˥CRrF@:L( 'E),VQ*3a\?ޭ)DIJoUGlu"5W8e?s4WMLSMl>Nᬸ#$/To|[QպW6OfFga|xtr]}p1BxF̍A\YWvPmtRԔ6 4#1zaH0yB[Q!`*|4 ߏٿOMQ Z?j=ɦQ BF͓?}64{"~poT^T/`n&on)}y廡trW/6d$+Q6|^Y!U]*ELUNg9|sk`?ӻ.C~xw廷Yx/R=&4~2 = ?o=Dazh6C6g]-b򸂠n|!B `@]tl8:"x\#G˘-lc$N0ok0fk ~t1, *0wHs*5S~lFۿV8 K05i @$q/"Q^it-!B锺C[GjI-$G(DN5hFGbԆLL&06 F(|TJZưg.Z:-|uK./byoUͬTwe3ǏCW+"$2G`::m"0Jw#]r|qˠ9lGM"sIZ[ SIHI$.:|0]E +A U"bK(F[!% 3Bs4 [#gu\~ 泄T΅dXWu}^0Q5sUG Vɞ.b^Y07§Dled-w«Wj/[FX>|sF`{'p_AZqxZp}'QKjQI齎&eQ,ҳp0/%>+Nrfxs >HSAxCmi>/y,// +dC%KQ7Wvz`z/~ڻ-w&0/kwˏ2mZ:x hl9?nݍ)и*Wk[Y$yf 8c!n#IrÜQ 4mlf% ,4H A,IPAUѳT`YA3b 41>P\䷛kY ^)w+vU)/-ؾ:v_=\LB:Շ`=0{Hz3Cas4#VeSi% es BH/F`r cq$( #l,{%LG eZ30{-#chnD[#gKR6L]]IX5jn3+ [k̳|7. %U~7G}&DXNN>b+tc!%"ג&5 mAg@\ɕur˩uVnvWAk;}}6 ί*v?G5O7{~4۹˯n؛pSƦڏ45.гYOnB?To_YkwzJ:$-}^stgIRgRe4(mlr[P٭Qk3)̘c.SL 'NTNeߔ?73YXM(u6:yaVIX|fK)kj OMV`2_L/Gm>ѰDO\z g\)Τ]̔xu:Il,ƑP!rH3 һ ORlؠ 'H @ !@D=rA@mpec  #$$h8XG!1z "A9@0I `cmXÅ@iuON;iS x '+sy&w-J׍hrU#}TLqXtW-XHf1tfY$< bbeP{SwB\oJ)hX7Vt &&VQR1qV{tHbaJ xL*[mf~SoӁ/ (>yH {hp&Pzo"wh_ q. xӴ.)>!uG:[χ0֗Y7g [-=˭!Z[{&qUTP+0>vq$W/F\-_:C0qQ\=\W\=ZG\=Jq+]o#Wc!Mrp9dVFSXrFo#}{nܴF>6B|p05A C$YÁ(5"ƭ}/a G %Hunn]ІH \Tw$L=I`*Dfv;e3|[ .m0pVZi ZL L*8i|rNrei5168Mޗu^UjevZ3`|aFf[-YZcah|vte{sKŅ1%"duML[5ܪ j3bL6Ġ4@?yWJzjFBtRHRh-k  ir$ԍ Z5TQ4T$y$$y4%[3q?^sX šoS罌?ƎŎ^\_ϯ_A09bHI !^QF h0:" do$gALJRe2h+9ф0RŒ@uq* yG"U/M{Ʈq@@5rS]cF9zBcxl$ѹװ@J2;꘧vKqBY4pR'M9%Yr xm&w#e_߹UYMB~H.01g ILҌDx­bxC*&hȑ\3CP 0),CqT9*x XZ q1r ܷh96UP:l_{uОC͓fXhѭo;*f"pKŮ'®F˥Sh*)!'%ґAY9OZ7ѸmpU*פ,!7u/y'֫HЂS{4Nk! n$bBܡ4$jdi/j-l_ćxn4E.ljzJ[! 
,o#÷r%utH%G Bڙ7ZSZv3P iS "[ʿT$'OR 91I8Tcʽ>衣xWK"f0ڱT̞\̿˨Ɠzmq9FeE!P-K3ݧ.&q w6;ݿ/Hv]I~(0KME<-{{u]Sڢos2Z`y-ht [(»MKSr].w Y s YW "WW'*o,*Oxyuc҇^%#Y[ji?Bm et Ѵ#9N;'-nv;GG_NmLR7=v;E.92 |Ľn;iVvώÆ9WM a&xdr?j6?ޱ@ݨ3 o7chgly{؇oSA#w< Pti?S[Q\-ze|Tiu9%Uϼ0-b ~*HA0J*1UYqq?Z~4Ȼ_s]A׉@/A}3"5oKPDTB и``KV1U̙IiO  iD eNhXby9DJp|,ez06"&q*SgSG'I:F!J;拑RD ݬhwgGC䒉oVu+k77`B< Zj>f8,eˆ\!1hU6FKB:Uj^VLKN a4:=GSrkYFS/%VZ',Q4д*l^mReKh¢)srFp"!(@mrր5I,l^#{ۚFm^,j9a,콝j[4V0TYz47}3g6mv.|ԣt.vM]#r'iRfChg~B2O6gSϓtf6`aY[WIr޵]v>qp2\;a<o~wJwh3xv볾u: Ko>֜|=>=$G<>17w!CeVa‘ 3h bޤm&֡dwZ$)kF[AB 5t}aؙ{{n"sC*̰pFxAN&q4/[V 7_/$~Mb$(M L){di$r2H)n.ʺ #'0b"EҀ 2O}Dz9W hͩt"8"g+jrsg,'^77~ՋjRT+OesdrjV*n%MSA0(- Bxm [q*#ZXtH@Bd 6*I%Xi*Қ95n+$gTºPV]x 7>{,dPnWoZߵn߼ Fv56! s^%)tgz;>Őz#05% ӟp7fhalax.{O:NzoqLqQ:/sš>i{e~g}J(aJ3tV[f< 7D5" }#q mx Oq`_- $䍣9PZ(OEjAnxߦ.opɧTXlg{m꫻zwvJU•iS7TjRΔ^M' ||M/f@j*/6_q'M)pko卺ѽf?On ?/gO ~Ӡ}3[Y0+(BF>+7vXA3n{['n{ Y(!GS+W1a3ɮzSKn{wEOm)CKH~@h-j+Fc}7i~~s<&iϮgZ8B&@FGˀqupI.Og9TlyUσdđDZc 53uUuuIV(38 B+!Rxf4_3Tzqz($wo^~&߼;LwoĽ/.q:\E[v-뿿e]u]s t-u3KIC^߮+Y? h  ɧqw ll`#]%vͷ8Ͼee4qj4a{*/:zW}1ԟQGODkYa߯tOmydrWRzFՉEtrKƒP)21L,T>0Naި{lHE%_vC7l#% ZF&S2Gl6ɡ0+5!i EM!NU':ȽTȵA+\x'R㭚|y2p+G?LlVAN?yy+>ykw b~gv:y=,?V Ov=m0vTdqrQQ& Gr/=j%g&xyW9/|Pჼ徍. 
M|z5nhvnPsAŸ^ ~Uںu/7UmBou+b:.k} ^ӠLyQzy8@n\56'5WvfoVZ^QuU#oKhN,g w4< b"XA(ε3ymeN&2tOS$L:g AFx:E DK VH mȹݶmpH6??qi< !=W]6Ǿbt2`O*0HӛOd^#"Ϗ,+eaG8/qZh,NԤodFW8XA'z&>tЍ;Ru3#$ASi%2pT޻(ITْD߭~|M^G]REh(37dF}wKr|+`ܼ Ө>M,(J)9Yr/' ZLOɂ\HGpK?pɥ{WZfv2@zzpeX.p jo*+UV]Lrn\ -l8/pV>(8繩 rU[}8Hؚgq6yUgGВWNFJ*??jZ8A$|' LGXړש\Blz>J۱ӏ 8s0a0o ,HqO QU9CFwAax~?5'Ah7nQ}2g WZ3dL~~mZ˖lN$sRV:pYHk `t;p9-ZH&i΄_")>9<2(Ňl3|RiEfwc3'Uw jbĘM5zvfS/TDw9s\,X]"o4J͟RCPDc53A[C;o 0j)ZL!#d2eECt`ld%&XNY*}d^8pAɶB H`\P8iGH]D Ǚ-dFB^\}!P<څ@\R虍"r {G\Q."l#p쨁N/RKH!Orn(˜BWYfqT9*SI) PrzQg[Od `w y%W S`gӅA@NFe^厪@ a4:=) zb=hꥶJ뤕ڡv%1V*F4.x뜡'QMnpR9&;V/;#gÃ0|2\&n8'v~y{JU+~^.Y~̮MV?#t1 $-z+d}3g<'s~އSbML'-%.\ر :|<x$p&ƥL)\P>Yx`02|T$DDFtTP$Hʥp=P 5PDpJ;vvF"OUeRBl;{[_U%gw{Qlo| %`Iy*BH KϜ)2JR9)C^{-THƱ`c)U1H*(6!Hus;2R ;b eWf?"CrC/zyY٨\=&Fr#6!h2 {fG$Ի d9ZADj6AvKylwqaK%8 A1TBshWXʮȹd]Aθ㾨:FmգvmqvIHeӂZȨR@+kB$^ȴ8"FBx4c<쌜2ۄKw;"u="vVFӬ.Q&2༆(D/j گ8lkW}/i=PtGGu{1.${@2Al? JlRL(d:!BH P6%Lin\\rOKmcvqǡR1iY$X+EdZmH#)$r2Dm$IV@ISwsP)c$pP8q6HA8)c(]P`HUa؜ϿG/pkn_L[Ś#f*hrrbYw&.b)&f`D R#yx6 [)G)ǂ-2* v2G9]62L\ϔL)߃LWIi 6 ъR- B\d @{d$;A]b(dܪZZ5+ϦѨ+º +qqh"Y" xϸU)<}(T"1$_^}폵5m"H(ZA!'B>hDC,*iL{N`.<0jK Rtq bAWgUnȉ?s! ({粧/DNl0Te)N#bxGYs/sӥ}SyJ`:yJqy5s2{ʜFs r \rh2a9C<;#S4kIHBJ8j=!٘څDx#~T 8!C\h'Ǚ*I|?'60qT>%ٗOM{xq4;;?Z~T 3eGQy7;l[i`KM|`|?=^RuP~peD׃f+wՋ/f̓W!Ή943bfEh_.2O^|jGq[\+{qsOga*P CO*̳wxtNtsx6+{ed} WrzlkOb+!c/JD4#zHFbs?asde/Qij+~$h+ VxdzVg[Y$y} 39 2W[DzQ˳bk$e٦6'fn5Ūnՙ\[TMkn|7`$}۳_޽͟ߜۇwo޾?'T|v 2& y>O 5?6nZaRiho4|e^O|v .v]nWR|u]4д.6=&.D7HXq˘-)4HqCC3X1Ab Hs F8B-hQݣQގl ˕]h;8L57Si R]^ -9`&asXDƆchMSU^Ԋ٢";ufբemv֠v-Z]c7<25?ٶHk?HKbfxyum%X Ej->x=裏o/Š 1%9crf nڃQz0*zsAE6 ZJO 8)C7_ >BPC)1E'E#()9ӞS, #څh 0N%b$|L耥ڄMHblbG7/Tf4~΢Ieԇ]r|duHءsX*5+]PY%t#w sTbI3E-"1kd4Hc,qSSzI^Ҟ-Qw ҵY/(5A];Zj%+aSk5YY.,Ue{ îo\]UB #0I)52A+x*PLrKLrkk/JX_4=M/`\5[fE''X>}1txb1.N`,$C+BaaFjTgzړ:mJʺжfO.96Gs&E$MU/CW8eL!ODsiz.QX=vckRDcRqk )섲p%N uJLTcX#5[d,d5nd;B\>W$"ªH0iƚTxhcB4|.q)D[2'2'1;ۈzTL1 `9½ MG)EzPopqҶQmf~4} GwuvC|q܆x.ƁyMէuw]I?nz>-%ŒnZ9HZcY`no d@; ym&O0& dٴD.4HWѤcShrMތ[\aa?\b֭6 5PktyqbjPMgNJQSjiu 7f>. 
8j!y%ӭI;WZuZ=Ft d^#d:A+xz:U)@,wQ 䮇{NR~D{뚆֝PinB@'-z*zE'ZՈva{[h+ퟌ& )*TR)'NG ̓r4RuIk#a"l>noVUىn,<y/xPxH~AtXR߅^ WLcʎtX97 tC?7d=7 Mk3>"w3Ž]eqw:<]ei:twTzpA1+rWY\r4*K,%zJ )Ϊ.ǰy) #nE0t4i$))?Q-RdF~,'KIS -ՖpN0є$kXRb5) Ih`*&&kcgf?[ɱ3^l~_/0Q&aŠoyGq1| Р t AӉ'.Ξ=7M_1 !Fd^V8 WX(P)CK+i&M 'L≠6PH3/keW(͠ >8x[kTkp r2Dm8幓uG9uPb<"6F80`8sڀY;dH-]P0`k/.T4Kz1vⷦ9'XD+ ϗ~4KFkY\^wM%ӄS(SLa ysQf0,qP+0wȇ%@h|NKu2Uy.xdJA+MU]Mny҆ o"DED s[D {= {q>K5M_IӈNJcK|X{ʬJ(d`N)~,lG_}Vw55mbH(bZA!Gt4"Rb jψ1̄˸*W!5{ TPh7?=B n/[lyM":)0/|&`|S6Ïyw_]Wէj~[)V0D' X)L & ӓ|r3#̟ت׮>=&צk$n%olF:rYR\P?f↯0 ]>){(GY|B?}`C ZUڹ+OӟKf:0UMrM:fD:6oa)$T7\Y?u/]N~q6 0Kz݋2s7s Eh?n#_]lh{<%*uKMfDsU=aA`(|}tuf;Z9}N6WpztjBrGVR'7V>T ` ~o2O/onzM ٻ6~U. qzЯ/s9΁lwKY?ZlIqILmNز*$BgutJIMJ >L "%lIaŅRVȨ QwjRv3Qu OZ†mVdg(K] E1Qd1PJ]T臇PMB]Tr3E]}Ƥ1ooe⭴0Gnn@?g/8jj k!l;3vQIaRG!oBJFĒ!Hei0TEJq6?(0'rM| 0ɖXlf$[Qi-ch%[Mb3ria> 0F*+@+{rBu lkvgC Rs xLxY:ôʤ #^Q`46- "C bCT mzYYdYFEf!)C*D dZZ4bAY]UP.QHjܺ2Z蒀; Ni/qs{~Q<]Z} X/;P`@Jrv@qw:˟:nV6{lyfP;/u\ݣ;=Go=/xOcv7?ٺ.#,un[< [a+?ü=N΍e$ɋqeoJOK IXXuVoY:̠rh0m|*Pk#Fd*L/ShD6;[D,^EZywH55mNkVKg5y&lLBlRh-:ʙbhjL̤bSR͇p|T9D\VZ$ g VR5c3rvk(,g mc]h]xT]Hj'{:d/Ehҿ]Nsb-LR;`IJfME+ʖWtlm!+cTZUKώI/ `aV~WK͸Pkm7hnVa^%}΂gUFaY2%bNJem"bd )2#C lE']\D(Lc5/ELZ >lFn}83=b،?ՈX#ؤ b@KO tvcB(X6 @PtpKk] vvvhG{,bl[<Hv@2ޙ`P?Ob;4A` A&XEWp,{K--ҫuz0v>tJygĚw@j)$ms_!:uMnW+W9;RMo7Di&c {xϋͿu9JVll-hdց!(%\cb5(KRyRK.K&!l6b/$.m=Z?~Y\9;_8?zL30e{oJOL-qَ 9 <" v]`kt vt`"GR U@2dN &kd'W#8gR2"1ۙL!;j&ڵPd AcMS3zዐAJe9A"T1ӊOiއy22γ75 !o;4o &*]\1:2Bx{HrySݏUjl'ic·L]}('0v{_|k8e (kľ@JFWVi iWػj{WrL{W*{Ce^D/=$lȣkkutP]!vPkapb/4$Z)t%; dDM+%%ZcC?iǻo "~l GK+WrNocmo?> c,ʻTJ;!W5   y$Z;JJ%JpP-^J}hXIXd$L Xd)ջ[_Y!|~J=AV׮V(m/v;#Yeg{шؑJ;1g;2.%CHKX?Cu7pL~{~hCqz?c&F"O8/Xջ[~]u%d y&RgUg~8.% ̊.f_xM타l fBͰm8,dF`v=ŶhYt衈v/ \ d9sò$G>Y8P<;jqԣtQ:R[滸@XD6:m[*ӋA߂/tL-wIZn>hvN57a͋+.oY sIXP8NCƓ\s.`kX^%h1$-y/;,bnтf<&';֨ɜMpfs-nji VJD t2CTv@Q(j2}sV?ڙ3yR1+.w%0^8Ue Z#VE3f>`Yz2a\sk μ4ܱMRs]#mKP)h Rv=]fΊ2YCI | qV>|6U_aܷAcP]V ݍs$UdKCA'jYCm!hdF4^ x4VlAߏ9:b h}tPYƦq!ƬR֪&,6Cow[ 2xydEj 
֕JaU(0d%sHޑԔeD뢲Ң^k9R~1)G##r$K##)SxS$"y2ɬZ$dkXyI)+cIQ@B$ zJ1a]vvpdf>A3e3Q !cnYmrRBM*3wxX}gc*,UFL'Y5I+*{ 靷̅HرՔ3^x8Pi aRY@!IL-bFJGZ`,>٢CĹ.kzY*I'J$* 9c$9Q(IAfmAڣ kYɑ1(!:v:˾QcK}آ|Kz6-r޴$VؘXHVJd %0 ƨ)6ed/2$MŤSO@ڮ@3YydŘ&=&EQH>frNUj68ІJLa6x!5TiO!rCȘFkĒ0`-ĩ搦P3 #LQb(M45%nd,M%OU8FlU&IqXbYG qih%}5=cɩbiu X PI4`i4TÜki 2PH6WTJRFPHiX\I;;턼[NUgd x*WXhJL &&l`Ù12;@:9.g !./5GȳBmTzr%Ōq2M!̃V;@C>@B e ֱWMU(`GyR5FaAE"Q ̈́aK$-0&lJ80)8ЙVNecX"6VRr VQ'J[e@SќQ3Dt0GQF4`ը-ERP6ZrABj=2ZS^u$$E>/)^?X#*DgVUQc%Gii"j8(c Pe0>j2a+֢ƛ ~\uXgLӄa 7Hmnm#eۈYUI5ŲԠ‰s!: 3L ߹aā.M/r]kk%ܠ*MNzDղwovئ:HBK/(yi87Kxq2QWҥ`ruA"X L͇ɠ<Π;l _jTcA#zPr (1de+ JPaZ*>72A)ЭLQx nEe06( VgGESU_~(Ӭ(al2JRpac0 vnWn}ﭸsR (" 'PG]rP1@>P9B(5a2HcBQhv)0ְc[8 9@g@!t9hs^u@XK4iQipp@Zi298kcj#:& gh ` eDAo 2Zdpc,p0",0wϢbt",U@k6֜OXFHGmY4gi&0 ֨H `f)4ͫA*U9uzˬ Eae4+ԿQn$Lzt`_5nIު0!K֭hL6֢1yحatZLeilri$J,ԵEW@7]pd3 L[ ~)v`)jY:Z4kH͵繨)UFCkn0r7G _6D,T@s1%%}~A5(QR(.*PP<*tGTi'O H@X 7nkdXllFVWU.ܲFrqǠ\U8/ڐj+&t Be_j;J*΃ Yk0m@ Q,gfFZV2Zc&8 zȕAy!ϟF`J$nÈ9>X2Q{X6jgC1IXU?\|3_v)&T,Ҭ"H"r é K]p9D LE]&y]p]슄ީEF8Ø`j?®z?j-nϗ[ȅb.b;ꀻK'c8O["n0?}UvA)c"j;8?0率Xp;/&yuSׁw>a] sT]#sU~آ{ˋ!M3\{O]?&t}_+QG+ Jq''ӓY䯃ː꾷$/{; tSY}ckҞ^볹u1jd&u#_QeV*x^dY傘k6~ϸiYYyzc5eX3~K q2á|u0;KfG?vv1ןNV#r f>![|Dȭj}wr1U<0mwC("}וKb@eGr7N^\oa\?c}.\i>Xv?ts7y/<3~y~q ׋ɧrFhv^_Vnsywo2uT7K[Riך7+Æe`AGepg=oڱǐN݊$mVq1meg­tJ+tчؕTRx;0޻y, B7jdp7},h},[ #Ԗhp^ɐKYc[r"[{#K4ڰQ;#s~BR[so _{ۜ{_طKe} Yr9;|Zqp(m]c]XY׼nfQyY.FôuГTBJ=wl,Apg2Ux( L#c deq/6H08EO4 b,3rIe߀Ooy&e!{9$q] tRcM:IL@L` ϗJ0p#Nn@3{<>6Xuz~#q׶8)t ]ljo<q}vg>!7<0;S1Wޘla@pV*:WmعR;{Ӗ4"T؍Un4s?ؾYOP=W<ƒb13O|Ϟy\I58z,HG_Z3x >Қ~iދ֬_ /)'#0@,/ً4x4:)eT!׮͜P{VZ]wͷ~sH!p5vop8v%b8İ[L ieC)J%w6]]vAU)0EꅮJw~ iM߷xcsf/gz}Η1ǓEr顁S?F&}z4U'R&<4^t/^{e}G'~&==!S \FqCq$`YӈNr3@eK[z8,@`rm/ XO MdCq9=75E@\yj`b/_/L.}5Fi. 
'nqbk0rj;Fk1>%AeIY)eCvzI53ҫ;nthCO,6a]>X.'uk^?pOz{ Zx` ^dZu{oq1+U)EV=7,j*?{ƍ)}6#PʑgT~𔸦H[߯1áH/EJd3ƯFbȁNZ-(j)'6N2~zi:lۊw<I|THh^ E[g; u&ࣅ1s` g&doϟr98ؖ U`hi)1oAs߂϶FI)Ō8g I`6#7ؠ A1۰,a`Xej'.syWnC+121RA%Fv덱['=g57<bQfQ1zJh*K"__7m A3Bֿy$1mD/i+M)PcƒIaJ5Mr+",=T6P*r;K>+[!,IT#x Ji-X'>(e@^KǬA,ƈ0 (W%6!F2 c* $dm)E@_fTd%_*[tzz2>Q'RsScKai* §X+.a ;(U .6#ڮ~zft9l'!+G˚TqvjfT:4 kZ\K{b|qOW˺!˻4-*A }+L\L+y׽m=sWK{%hu t`Ȥa9E?Ǹׇ w `~MM,7K+jO V@!./>v[Cjr;tH+>qC.I\瑠TDߪ^/̡\k bĪRcjJBXalaNCPq{2vBd Ezo9 x:o;cƌL"^ˈiDk45[!-`6r6 SmWf4Bu4 宸]@f֖yAkt֕Gk㍪6"/k›7ݿUm.dʤsD$r+MB }Tc wxBMvYzfUr^Եnv{|r ΋Ajc@YDƍi؛pU_Np%F"p"W7]\]9cK}i?{8[O{noH9Az盏yby Ⓝ. f ~WӜpT|~Tg%J*ETEl-(HĐѩt9NspN1@#Ù 3-h [ 7 0gzHLvz "-5d0I H;V6r6(p8{^:IɖvvQvm7O"e'qŽWEyȏ,$N!vB}(Ό );M7j7||ꠈUn[^Oa0<&x[h{RVL8b4Sy;E wO _#}W&OPTcgTjntɛyMX~7z:W'<ʀY,C2LAz S2㺼ŔE܄qﴆajB?蛵ZWzj++woWscݿ_F-Y\  3UƢoShGc\X*Q=}񀗵uSufXY-y6| VYj} dWm[vZo.>;?w2Ubbj/`߄!/zNQ#D9\ ޤLh`_Uף}z/jԛ ]`5wN*Z mD6^.] KH4RLp75n߶MYTiX쬕/TW@Kmn#9lp~mK>gʚF_a(E(03X?َ:սb=[׆ xuc4jID*Tf}_fU&HvoswN 3~ˤ" 0'j"I\%b=v|eR>b#oX^Kˢ%=kT(i ,( B""Rb&\-6)>&+]Pѻ9KVj/9vH(򅖜N)bxЄ^xFw)ι`yctr?3[x_xwM:~SŞmn\ͺX^u~#oQ|ٕ^ԙ@99pO0)&?^Vw[v"+,G A,g5S+隩 őBU A?!UXi>x3F,v⊼= 0Cj=I\̮ۓҽ=IJFzo h|U7^4#oݙLz"Y10 xNoigDkyu==FL,>rX@׃=I>)f^?lK_~]M)pGWWuʭ1%H6r[_'GsyeC oyAoYoJ\qh,5Jkt5ĕ\ N|;I)67HS%Nktns\&-1WII/\\+2WI`\%q՘$n7W/\ Wdg\\JJr*IyizsusiI|sͮ\k1W -CUK2WBGu`ln]iR`ҼWAئA++WqH[J8 Bhl$TTs*%BzA=:`$"OZu4?~A3KJ l̻j66S<%LreTtY2@c8ߖOJ| Y , 53gAc>Z81ۀ[1`1[8@HP"xz~[(6j. }*xWUqMYjjǀ""Jꭹ 1*~gFUy ϓ>nRWfnfWu 05LcɁ[?O>*Cx0܁E ,o砤?vj m.ИU5` r 2mV6>`y)MEz6&_Bg;R : No \5FRr0Vﰜ#ۿ3 nJA(\1TjiY4a#A-2Z{x.Q b;EYUha8|݄u gm Ǯ*|&ZnƯ}  \9F@W_}V7J_Wy[Q(̼3?ea^Ϟ|Bʬ/jEYeJ*)x*);@*Og:'[ GY=ڈOtuV'.еX4GZgAZc3yݚp ]oҮGzh~{g t\E,dxz%+x1tK&kA\ #z!U, `"RSF1h#2&"^}6q63XR o֩]G_Miwgi{ K#nW-2ѳ]#p; PYN]Z孬Nuh$ 61! 
R4*ۈ2:QҨQѺ4"h@QAsg1rD`ÝIM܉`׮V3\nk{-4Dݜ6WL^D~fM7l"gaG<_*#J%{ ˱hOGcYy蚀 K\ZZd'4+%9Z)T5zIA57V#5@OrP .%k,FR L0+yiEL{ /sEyO&6 ,f")V ~G}cH) !yg%T"˥C BDKE$᱕m"knܖ֯Ū;B\?a(5RW0ndԎZm\`HGE& 'FLV%ƒE*L[ލYڭ?[Z:1&Y0-[M77G gWe﬈%Lz_JafLn}J4բyz\5C3{|Ĭ{ ]WoϬXRX)CN<ˋ ӊ)cIK&vZRs[Y@h!39l\[a[KrK.ump<8}zyFOf1HDbFc1eesqĢ$Q)0 ,' t|>}*qPSuUٸ汻lȊ*ٷgMop~Udʏ>kO$$1(`ˇI8L\O8a2}\ig{)LA/*JS:)g~ ?ɯ՛_nv2f A?6µ. m. _ݶ^ԃ<׳Mp,Amhc ~?+B^꠮sߙ>LFem|?AM&&h ҙGw7\ ɠY퀢swnza yieBl *X0B`D$-hBFI)Ō8g I`6$FT#oQA1{mM$s8.с4sY^{l|wjW?N*)k- ($78ÄXoU:6ȥ QFʛEyLR4NpΛ (v!A1grA>YupJo8*/(ςh )+$*xxi0ii@"r)~e5%oN {Q5da0Šh$FB:$ңTŜ8̲ȕ68}ԥ}M7ןdj':p;lRG-vKZPG@2[  ~`Ԫ RHc E/*"mN_Gۧ~y ^0WttQ N|[7! R!weiFE)W@ wb#CG}T̅S!!-9D1ƒKTt`a0:`L3bB)7j<@ST1`Z)"$%QٌB YsN,>Se}5li]b b24=V˛(GA`h( J VQ^𲇗 /F`r اƑN(0V,K `Z2K Ap@HY.zg՘1ye4zl5ͭ28[F"at~| 5g$m9]B}J4oq 'lQ{&,nݬz|?dҍ\+Zy&IWܛ6:7 vM+՝aS .3Jl`h>eql۷]7O8GV.0ovusfo1ǣy/gs 7n/:^ZV45gݟ5]mçۤo:$fn=7?7Ditdnn|MͰZ 3We1(U< am!sSQi=egUvŪVh˫t19_Qy+F"凚L?]񼮸ias!2& T)J]v68U4|TFņɲ f'מ'1mwUUSj7N%1kE(K k,)eF(H .KTϹS G,7y r %ZE@H1rK%bdF/rF*e&f˸fB=};}H'[kJw~ݠ0>sP&f\8B:"FZND,xwע0$g9rZ%fm#"b͎}leo{nsq!X)=B}%PW1s# ##5y}Bg 2 C̀E;$$GQ>r lV~+$KǾQeE\0R<1$ j4, ZK9S p&Lb @CV'dZIp:"^2:łeP&t,E& nj vmt$_gY](]Խ]1jOVQyLU#+.#e8dspI./xHCV_pH4IR"1` {*\FA\)BEpqN/B/>,y>܁ + ~a#{ܔץC>S:R@etgAmj M 'auԽO2xJ`haR ߀0f_wݯ+ L.g_:'~Ӡ~1]YnVmMVv8_A'ɵ#X=պaa g(!G̫wpp;[nUirBͣ&nԵsE9g>}~M+GhXPu i⦅n>7 >gmbLO<4 e&^}k4_MoJf40U__ߟ}?w~{w(gop/I]#'PO O[-547kfhYro1.%%f,+{*4*D#x X>G&"27HXN'^lLUz@j,8o6O }gcNgmEx #12/kc[_8] )mNDx":F%HcIH"iJ LLӍE@3iª?Cr"% ZF&S2Gl6ɡ0+5!i #P18CE}3I{-b.y4|FmVSz|;mDmQ A3\'THӌF\W#5H%tsq^["8HI~&2tOj%L:7AF  Ou42QǸָ--Xi#1HJI(E`/><,!{7{B.U>a^-}/ tUpZ`54>^5J(Wg/e:_Oԕຠ(qJ@ ȊiW9XEtt]]OAHupL8 =h YFjH彋_EEE3Ieٷ"rY P{dS8FμޢȆ5ѝ?+ A<7<4, i{[0ԞpɃ0yАHr*RHxb*'t=s* = m85*}1@S:14<0:z.|BI <XC#w&@()Q PhS뢎 AժA^w)4Ah("{/opOzEnt~wO?ѧc}UZ_Һi yjP~OZ: qӟ'|2jzs]#kz@lz,/txO1@kqo ~ĶʮW7E.-o^KǏ~WripTȬ[."Uv9@~9 xcsgK퓷׼*=S^ƃc $;AGoχs4=Bd|=(LPNBek>Vt e Tf> ׳ޏkghymzz1;tI5 q.[i R F@>B},V="ʌEqE3hІ:DsvU{M1mޮ,g,UX޲lڸdP"{a!V-I)ueV(U.z4,Y;\SQzK;uyάߢv?94[^ڜغؔXj}6P,kh`s$ShSE 
&%+ʈaBԢ r'A,՚9hT  U3a˄1\/78Ul_IhVn]I"d\eF9mcj ʰ\eH*RcϝBrEHbf`X-*TA҇*&hșɠ>g3),CqT9* Y@/]e9 uMJ[Tbd(0e꟟{7=~\Pc..R]Yu;]vC}ٮ; ITS 9ql吩*˭:;Sŋ-bmkƖ*F$KC)yh*Y"(N9DI >uv.fS:$b`0F&mFiD ġ&zcIOcY eT;x{{&FQ4V%>]nX\>%oʾߧB@/Sg^2900r~ ? ߓUVB N Iy{8 ?#tE 7nkwA?a'qg+>?}WIݤ&#dQ, qVLsLM? {4N~)8Pd˂i}`8lVΛTזpe'OI08w_if*}yEV7?|莯:I59;7,^Sx`ezU44mTdH.'ۿzBWV{1W~^*s7^L4.%BQ[l1[(ʶo e 9-ew5%Njte$`bb0Šh$FB:$ңTI[ `re+a*I!mKz0tleXٕi`$梃 ,9z:⸎Je[+ U1Q/t v62*$1*$tL:6>v{:oUu>'QL܎q.K Z^rÜ9Qln%9iC-Q|^i y0]60A"#` % Pey@Pp-l$piMnG.jiW >sB e5[(#?'<ag9׽^4r]Àh!0(U15V!Z4].9% #DpB9a1v8V e7TiA \OûKd&#QHYu2LwZ D佖豉h ih6&Ά s n:Se52t{ۗV4w~əT*UI31j]sjQq*.]'jAȕYtIDh}6jx b?OgXdJW ,Umo~jmvU~sȧuRMή=V:J?7<7[ _BN:՜<7tYodj~m= >kHƗ&D+7G7|&UE+KidsThf8qmk&;ϪUU32O'۹0ڻ_& bYmBJL߮jӑY#BDJ1g.'q'i((H7 IM Q3駴 /cPNbXXH]q~A ,6 T֘{JƽyQ8$.((ʈ7\x3ι1gR\a &QEmsI."/>(e@^KǬp+i 1bкF >Q0!S&J)G>G@2µL@8z|1q6Ĝx$֢xMNo+ky+v@SQT ;&F5U`cEG` yR )%T#idV"f% RP͚1u8x"uycjF*#R"%TD2'TUK@4^ {x, *B:bnsu4bXRH X )G14TIIdʶ$?Vkj9Kh\Rc ^(6U<BHQbΉ&o6X:8KoRi+jSp΅J8؉JT}V;)o?3>Nn?8Nח"'璧/D() 9$!V^0ktQ[ 3[929Sx"@_vQ<_&ES`N&0Uy`X9S`)" f.;5aݻ`4tu^# t+:mqkB2¥CRa9Ate]]&= Qci;?>3Y/|)|ug]vk (z zb[uF7grPu"5WlM.'w_` 1)~fpyO aS6DYyyr3¯̛ċ:r}+ C07N/v;WeR}߳3"P޻qkӛvm7.u-Q[k7hVU>0/0 z0bݤmvn:UUmnuu>+:L a2BRKbkfI~ s$OWV*)er>KU@= [9YpR|^钋p~Gznb!n ݤ)li'IUdQ,Y`,qVsLş3;9/OE<Hh)@-եı}Yv[~,^)`+B:ޗo_yuCkEMf>-h< K=feq/piژǃPѕoW  \%q ;JRupdgWdWON' ,&O WV?zÂ+"-\m1ᒊ#$GWI\M@ZUC"DəWI`͎@ܽ@sp%WIJ[R(:"J z4pĕX*ILRX-\}?pŰVc܍ӻxsiBÇ_J9Z'07 RfíSWfd!`<`u^cQ̄h13JaJx0L`igTʅۥ/).'Kߠ9~7UW5Qҡcs-c1+Y\53c`Q0Z-:Dʠژ%1 ޭHеI/=8XbmllmXޣYm IUp;oB!gh@8=tVz4"VFڪu`*>b._n=imCX}lץ/0EPlL8ͲoC']{wsW!ɾlod=޶Ⱥz 'FTMNC礹'xN=h4Q1||)|*=WJM^m<÷uBbǴL*~,V>H=t+?I9Zߑ{L'\bv9W؀EJ&\!3xn2$ŤHwHgmwxmjD[SL*d@V ɝQafar(RZ4N;VCU*\J#68RdSY@ E\TknvtZV@z{]V9\X"BOK>a^i~̭`9oQ7U~rl'2AO RqJġus, nM$%y$"kY"@4a$y rgZ{H_!K #a`qx7׋n\K )R&)>$$%KMGؕ,hG?d?.E7+ܙc.Hqppk ںv NGT'i|^8NQgH)QM!fC! 
zViWO>b:աBlv`@ŶCZ֬SҒfywl4l>O?U[YAF6 z&eP`nu* Z2hp^ruē"˖8 ΐbȜ 6궧Pe4FI ˒ɬH0&2Z/0$7H^&w@fg5qX3|0n}[qnV7*Clbz:g{Z UxH as2.cfJJqlev\PU:oxTP g$].'$\M5x-zz<yy4kOV[n-+:f8R3vßg."hX#B ѳY$&ik|t*kkd$w$wTUz7 nO}\odg*g/q>ۇO^M>%5qQ*Ct8UbBE/%x^qtk@Kr艓x%ygT Jӂٕu0Da8HkQчas-~m5Ѡ91Y|L+4D&m2F6ihi끋]Tw=ma;&s 9g 88h(M6Rwv: YXB\>N'2p<{^|hFi3A:7g'Q' PUUY)cLVNxNJI0r\ *+q5qnW%\L96sTH;a?9ިne{aCZ¦.&OZuͷ-Lun㮏]\%ӒQ)66XN± Ҥj@*`:TnjqmcmJ6aIϘ73yC\4'H`ؤBҌK43DIU//oC~x~>n}X&Oք|gyFp1y1mOt`-#8.:P-0ta=  @\"]A IZlDN:kL"UCYfð,Ä|8]U ԋ;?f ]m <5 )B)22.`(EbE`h$$]O}gxxz;ӳtYk. X+j|]h&.M;xC3+c)M '.pΥhxdOB$g2X )$cΌ׍Vfʿ*{-YZ[Ny\N,i~Os1csY+%Ɨغ od'0q1[i)6_)ϤdIңLtg풽"2kx{1KaM Px bҵu:u5iœm6a"Չ]C Tk_/z>jhf1?\4xQ2b<̌ jןz3OJz^qGw9/E=>]p࿾zBwL\FlB?4Uc# 6 24jBv&1as6r59wȹo$8Besp-Ԇq.+WfRCRdY,2}VRjS XȽs.d$a&sj܎B7+Ԥ-gK)~}otE'!Izv6—ࣃ1צk_Okϔ5Vc5*lP h 'RKJ^|AB{fipKfVJJ *NZ2ALL4,B*=$hĘM 6Z竀AW7MĻ{s g`E`.Ft^NF)o坣gBim~bȿicf#YbW|[o;~HMu~zeyJJL627h4lVɴH] Tb(~svIx> ?&F yT@#:kĢA!,0@)U s #h9DE, Kɀ@pArJ&uPJ;EHͭWbsw6n3?ksF?E#97 [/Ë Μ-Š7犟DNҴ7f٥ƉlIyCME&ҟ^/̤w_:]WZS& x?ٖM$Soz7dMJ :@qO1Lr&tˣɔQf~mF$V9BbޠY֊Kӏۥ޶2pZܼY#)7L[_t;֔GFo絶ŋzx9M(&jpu8nVoT;o}[f_[w?fW^^^LOξF16;^ >y?9W#{QDmE1ÓW0\ 'njI-)M͈>U~'5i8jm`Ų@O:VꢓZW2zbBCJÚ_GP}1!{#~̏xo4XP{5-i+/^I(+'^\WB,Eec/.Gc42}^?bm-R9f̺.B?/~r޾˫~IkLSh?X?hOVs%I\v瑠ԯ.xv1dza#Qs<%4Ƹ.{CyC8c'4.(8y`Dr`00 2/cQpB~*W)AJFgC}J zT"\!#x ^(LeXM;)l(}3_ImføԮ8͎7,چRav9}YDw``n{vF^%(&nً,^DJ3)ᶚU,8y"32bRSyR'.r5ި~fVivˣ3Wҭu_mFj`^^B egզw+mr.>v}盏ðɧ~1(MƇad8^zCL|hΚ?u͕3v)^Y3wv '쐽Eϭ5Z5:|1!k)!3Z`^28kC~2։m ۭ?u(HR:W)eU|`ޠݮ0~FA/mQYa"VbHO"P`crֹHI r1)ibn|Xhq e_£8FYWt9Q'<ĸI{"hTOyd`94FS`@pe"eXA%Z@2>9,h)Gx#P 5PDpJ j18 52E*5N:J?uIV/iTLQV~ΛvN&M+@8+iR P!$p63',eF"g!qʐzG THƱ`c)U1H*h6!Ifbp3c9R CPBrIpiy;~qwOnl7fl8s^ X&E.&rg=,G*H4\Bedf1;{60ceRy NBPз#$&|BxK6]\KabPVY[*؍=ƝQM$I+„FW4'ޞqHI}RYﴠ2*C E{rMKc"u!@<:9Ra>,g;NS&KabPFԅQ2āfY$/1*P)EidH(xL% 4͚785   X<}˨ 2P* Gy١f]ĮvȋY%u% yѴ$֣lIkġuYhQ8+Eh;)xR>0><<҈mw) >O4(XK"Rє`팖K(0X[TL62`CWTMV_ْҰ> ]0MjKWd WѦUF{ꃑUFiLKW4y6q;)-Yw?ͽ}o{8&UEM<Ǵaq ]q +,o ]e1*=~~(MKW@`Mr0i ]eBW-Kr!CWB2t95 W5htB@hKW "`NUc*}Sz]e*\kՇ\[Ul.%)t2vtQV]}Dh&DW 
W5F]ewB;i>"]8kR C, f1*5 Q!-]_g3/V3 ]]t-]TI AtE"5.#)t*ut(-]}@bp%DWj ]e1tў+a/ʖ> ]q͘6 +*)tj~tQ+`,jos3W;akaμ=Z(<ګ N^g~Q bH1T*IS9Te5{φn7o59psmāu6+-ϓ_>yCfD9O.-tPxK(\xCGd!FZ& HCR1C# ~Zb{z+0t /aBF^r,뾞Q~BnO?O8d^>`6?7y#Uj)5QrM(CfZ?ϔ89ƵL™:]sj~d7}4aB0qKIQY'CCқ X}!Oe_S?|sˤ:ɴkG +chߗJ,߳'.>!/q^iѓ z=n?|~q40o]=x:t?k?N=r6Mۤݻg՘ɕ/RlsJJMd?r D锨ăC̅ɴ)|Ѻj@Kz(WJݕ]\ ttQ.0x s}um|5w_Y[[S(c֎+w͢MZ5`]/ܻrQV~?礥vZV=]O?_3_I4:4 5&CeMM{]hFY#h,p{-^@?y4O4[6; 1ƋB`]Vx? .R7Vdt+ouwB$@J apSu50ʋmA| I?7(6!(aɣB7:™f0|x캱޶]{CK{𡲏=9 WoOvs ./!0y" ]"JQa|Q 䴳FFC4R9XV[ZҞmmaKDBbp[JxhX̞"گ-䢳}I~EYɳTFx9ړܥHIȓX0wphn iBX{a /;ȸPV$Xֽ 蒜8L 6`/$(H9wX#+cCVM3wjy ͿVժVejͰkE v ]mΤ itOJR;m A[DŨ$ri1S"%z`)TT|r"L(1ېS֓%m6vg!z#ͩ 䭧FvoMΤ4(ɨH hQ'1ecrjZF8 #2@K -11$!;:p(vϬ+#I$D"%96 ZX5grXү#TnC53w3=^iޘx> .К4b"!VnGJZԴSx]hg-ޘGIC(# վ*`#DGU =*Y#JU+ye_KALԁZK*hipk8HW,ct(oA'E rV~IF/.kc]klvWQB8d1 ^gq^*Aـ7~^F}]Y(6Qy7Ëƹ;1ߵhMj9gmіg.szCN:gZ{ʑ_evW`ѓ4wЙ|DI{t%Z${MaUUpMMڬ޳ wvwRbQ&zTyeޮhc'E;>(;.;ʲTd7ZO۫RJf4YET^œ D)p^:] {(NnSϣ3^ etoᥣ'zIUqϼ0Hy-s`"/T)(w: i uvGENq٘؟ %'Ѳق:wMVY-z"ҥLq ͎J}IA<c]\Sn 6"SM zIA\ iO ; AD eNhXbTj"pRP2΅HOʔxZ2I(G l)"BWKLd^ԨY F#;?u[51ԊRh֗V~.a|$RWi"uqf+IwшmbMǓKs=QDNy3#h.nH1aQi\@LaU}o_:6b0zr^u4UT*I42悵N U`"+dV[N"wb=R4iMq=Ww)EKVRv{ۖ<at~oirs;1-3Q `߳/Z /}Cʯ}+~ގ>$K oģ ߲0'$߿~*N#'O㼨%dg$ΌߚϼzУWM~PGr> 碭w}3t2ZnsY|7Ʌp%avvL<~[wL}{7@Mv;aԀן:͒v3Ew\*{>6vn7ÙH .;;?_Co'~ӕ2qrrc{eLa[?nC-ML$xBqkA}:UE#iϩq…/+OD*;vq%(6KL͙^3 W {.La!o4ht{la}D>EwԻU1a瀆ii_OcMޅ0w~J87j >8;t3 ׁ3t(krv3DQh\l4F:=SMMג IBJ8j=!HtBytW\T?f[쐮?sYb\w~Cvhx۹5Lz {Iy5yz|uz U)#4I6/akR7/ \+.q )U M4Sޮy~rE[aBg\/~u/.f޳ Eh_nuw?^Wʚ@_ժjj-!5Fa^+߻vh^g;^8WZ\Uʶ"Y|$3$W,P}:\+ukzF;Bռk3gxS/7_#OoV^>9# rn^C'LDmv̻qf4_T~ח8I???O)q-8z\V`"juՊRUc}Ӫm%[=-ꥤdW׻JFB !-̀bqRiٖxXx;^bJwmlLU:@2Յf0^o7>WG~VD<1"fUyfЎ. 
^V}1ط.#[k.|B:w#[P GE Q](9JR@(a FE(4_?^p2| }cIu -͑mQE㩆BPB 4%J{C4O:\F:}yIp"Ic⪬wт-e(s[nyM oFFG(:O==Ңʣ ䷬ $'AIʝIHC5I+1s 1 W ).$g9>[2Vx]]V˲~dY1ࣃ bZaMe 7~)߽m|$Qi4DRD JEQOЁ y|r`vȀxrIsNgeT0qVkA+ $@l8rv4t8mL?`բR8z6{6~jV7EJ \ł.և6W=cNmuIÝVs,Ury(0PldO{!NAu erNF2lܟ@Uk׿9&(Obߖ{FT'$:ܯV9=&n$ُ%/j"QLxT/Wrj7(Fu&#oW W_jYtvv:^*ՙ\6\Mxwb+Crbvg!q|q jXѼu1kkgmevƳj~*8O %1Go˫r{+dO,ϯ`\~|=6THBniɇb1{׽ɶI8*#G]LiƹcQlێ>>wS-Ù{cvkKySwyg^k~m &[IB/EGYk"M5Se*M7.|ot/|u_xzvًW=x3< $z% `;^êuPcfEZuݩf9I7@X۷JpD:]iaAe(. =Xԃ$.(e8k7@$\2RR99baOUh0Pz#Qrzc멉V% <1 (+A2Xl}* r3^-qe>w˲ṐM۲?]DѮۀG«Hm-v7RuAE s-AxJf BW tXIt':iO@+yd43VHI0`E%^Ee vٗ:&jS:yz>w=]Z͇`}>rnvio~R_ -. +X\!)-PZdus: qDr쑺IU{orUL; gZb϶yۦE'Niޞ4TӘ|WPBN/yJgXC2@@Ck< 1h]itb6.xpR!"e`u< Ɂ`S12m=1f޺5X{oM|o}Kn07!gz?ݜ5~L(rQQO\_Aq:Ƨ/N\s|?'~ i&u|u:CKGNut~'bSw *6ϣ6kDnXb,pPܲlB*D!{CjgM= G#-7}_W0jڏ^ɳK뚌ނ/GPoY;ŁNr&\߻dMY ~O'kNb6Ei2$\f5panv+bWB6iYG;*Ϊ%owz86~Û{vtgU;Dn%/cjrY]ANV pl.=[-VޯrjޫysmWݶzmFMx8Js߃Q1|+%gxvnxWfŋyɚ/->7]+{#oD!WmE5#Y8GBa9Ĩ? rroZ0:#RΕZ.4R `vF/H(鱹Iv8ȐzQf&eF1"̇TTbZ =l֝aa<L>vEd"n ,X^d&É@GJ:gWrF^VAƒQ#h[]I}@rWd@jgB BYo-!xFzغUl wUy!u6b#HvqcwQkqbwV(K5@ހ:{D>0=GA$@]܇]<{L:*=܃ k(T 8~l(~/S~΅bᢏxGqǓ}γxj7rpۤ1-x;=^*o3dPj<>|A~[_풻%}|_*Qg~8z9_ŵ3: gK+/٧Ms$Y%Y%Y%?%: %Y%Y%Y%Y%Y%Y%Y%Y%Y%Y%Y%Y6K-Kk!,,խ#r %Y%YbD?{xCD?KD?KD?KD?U@49 еf`9(Y,q{f bJ%TbJ}ZLZ^INԮDJԮDJԮDJԮDIQQ+Q+Q+Q+Q++Q+Q+Q+Q+Ymv%jWv%jWDJԮDJԮdڕ]ڕ]ڕ]D?*[40; %rqzDCuC 2\W&[KՕAs1W dtz2sL^~[+Xx[<}_nA-r JWFkt:x%HLȒۦޮfJsZD^}^&^~ʇ<,WG}2b+ᜐr^ W :{GGw5Lŵw r{H=\ \X'-wtmrF[/9kYw+@x\T̆`3vgUádLZPr܆=bb$]pZx61Y-@ܭQC};x V 65Y3A[,{"L޶i!$hM>3 e*N"I+0*ŖNEoS 9ޒp ^WEM\raV}ARGׄs0~;.~!v3qZktMwbeI6l][&K)^HH1 "}\4,(HσvF.mN 1TG^Ț+յ<ļWqe73EWZ*N3W'螀 |S%1%0zCpчB"I#(uf%^{ =eڱ)lF7l ޮ)#7zͳ'&ZuOug=]wkH!U*n{ dE-%`Pp&,Ҧj*yl?yX)͡xl;eAwv5B NG%z.F5eFd.%88 "Q:!4ş54_%h.dcN#28OA@ld}[6%І Dɼ縴i=Ey߼tKqhq]aĆrvv?_^:).nUF%tuְQŠ ˈRPyt&2 34vW7r$*: j^q)mL{QpL)gKkyV4B GAB-3j qLjċlQ!KlgͺzP^PǛO߇.mH 15 r*ֲD(j.Q;PyaA~N(2u1 T$?3[WAF+tDjdy 2ՕJ|#2RäX DA8D~By`Afsȃ5j=,Ҫ8΂YFɑhb8&yzqv(NVW^aq;A6t:ɒ|L\3x+UY<_@MA4^zޡySΤmVDo繫t3UDconS7șPt*`M,&#y!;*ՋЗACns#؆cLBvX;׸Zikp}#lM<_g]6(5e@&l"ooe}ye>zS6]R 
"E2@P`w$X ʒK9Tg=`s*0T|kMFŨZǢrё+ w:踖-Mըdsqϋ ـO184IX $D`}f =\{J9=qµy6Œ|>ӻ7 Xg!vk'GBGe{CGpQ & [$B+"@9CPl JTj^RV:l r7ܮ9{U ۂHW]pi]SRTP>B<}rE +Rӝt5՗!jG G_@s"m5bG Zl i=Ҳ[{o|<Rї*}yRoKM58: :c8yok5b%{n'!ZǤ6|-z iY8hjm}Oq ޯ af+F{<1?>z}:J/~ԈLAZt9Rlc>:į|tr?+;2 ~xs6ffiK:"uO)N_^Ͽ:ok~?6͋KRu7ta奮b'/}7ǫ3?w?/n ֧_`lм3W CS?R|R:Q 0@ sօx~{2 >#吰6)1ŨZ>Oŋgit,r)3Ч4m?~EGg?.(t<烓>Ĩ3\~qIO} #>5t!G1Ƀi6U'ۍY G)';DĪmùѹ!plMaJoAWT8#u~ÿ}iF"n̆SqӫSYu7.j2RxgtIeb+& ۵R~rj<>{S)Ch]mM>4&a l]UO7 ɞ0s`}7hve 7&7st.jf ܥ8:Uu Ь hH7 ۤwv:UN.N[T}?l[Zv.,-NX]f#79g>ڄ!TT>q5E 1_8晫 xcQ):NɁ٧whAW`x..Hja8RTyBA "L>kYqk`%g.51zUc96lͳhwwnfh V= +RMr=(Z2Uقfm\9_4!u-s ]i Vg_~UKNޱ{LN^G|"W;e}|^djMbn[8V gDˌ,&ZxnJF(kS^M=$rt?ʡe;nx("*.ULf@7h2$)`<(L<~^+#=Tj&끜.0BwhC?{Ƒ@#!@@,,8 S"eD>wII>G2ia[g]]]z|rmA! ֠QQjrUz؈V].EHS!!-9D1ƒKTt`a0:`LI-&r6Sؤ:2,"1rbrX$%Qm_)Fz: $IYUΞ][W:b~!q%E+fx;c R˟f?Ew p+kjsQ~ן\ye1kQ\50/4`R}q=(e`8!*AR d%%+>৩UL%Cyʟ/ǂ { /\}T"ΝpN ;o G[M׻7'<y,![nu '1beP2FBu*oZ^M\newek׋IIvwE/'uxr%-Hc!T9eAHa1&~濍>j`"g4`+8_r -%& }g0ݧUt'w𙭂4()x,gJP6#gPGcWb l1vv yTsۗ7grQ`Lp Ar#CA;LXPDRiK6o+wjqJ є9|_y@oZ\->(v66mlC%"BM[ ?`v0nM*R "Gc0a${X۔-{IV hB%Ň>J-OMg H2(?ELr$_gNMpE)a; y! 
ձ`ؔ/:&ފe b̎J:=B`;aSxztlz-R` mj+yet`6$Yj5&rl7.JO6l7Yn10b٩ %x5iv4Zd.E>czSmJ4Zl Dspr}oaf\?=+=ZoH>35KŠ )&6̨HL =I7IK8kmzN&pΞ9#>r&&?;BY)Q]=ߓwoNO6_N~ b)36ꀤ5SAC6DpN s 5HX QKOo)dSy0w)Qw'Yt4hd ݝpȂZN @T_r/ˢTǷ| 4V(I)\ra4 I4]7qC8d 2 pp VX5FQL3TJ- mg=Łm'LG n3&D豉h8rxBklv>{ǖ򟮿j#?,ƑP!r}{sbB^ .hyD2 @vSUTX,`D|0a^S T)'A%q-oYQuRéF CI,{P&Q2-q|fOK0&M,[*5k7G=}<!\*$1kE(K t#)D$ARqk{ 3[CS9UȄ?8^WF"e6$W\q6 ]1Ȓqj}L/P;/e<56OJK$c4AG̪4a"Y~Nu eSDN5\Swj"i11x{:>*@IފN[lB~rTa)~X0!=wU.W`\84>i*ppݙRLg fi^)ƷwgDjٸ_M^֗ Lvb6Q4&yPΖكMΪדy ~,TTOޖ^M7^/0%1 {7R= 0I7f0~\@M={m4wÛ*a02 f0b:ɇ}=żmojp\5J^A68Vzdkl!cpTf֘ b,gg8Q9LޔRz4.27/-8t^:|_`*}Fܴe5Q芡 UC Ǚ&ӯ?ߧ߿_^&__ ?괉&{i%mu ]S6Z9o/Fm+Q vMwemI:C,ĞQOp}zg9"` 9-A>.12E$*A"[j9CbI/mpK^χht8'S ʆQ ̺ha#q>*%-c3U^{"'wOz'Y{;ev}xfʍ><|yu hU;+ttV3O"$rG`::m"0JwʾTeZ2'U Ǒ >I$^y.)UVkAb@" ) hЁaʻrLP4D{R @a41ApX-"ropoR0#4'(xAc۾m,NWPI}v[x!ǫ-~>s!ē׃O#v<` ë_ MtvE Rgq+En0/ߴp.aN( [cI"oCuutA7HvP{0,C0@ FCB:g`Dd<(vJqTQ( Ȓ*Ea)$Z#g=>{, qH+w~X-/`o.(y>Gk;,?;󅠏Bdlf0hP 0G 1bU15V!Zv0F %9HK Z8a!gT/[ +,Bq*bM]1 7#81cCƑN(00HS `RI;77;s/>Be )e~RB]^&JF7S2t8=Iq6 h&l"2W\AiD:[*n-ҹ@;,e6`6Gkb9OrM_ٲ>o!YO8ƌk>Dde 5J@_/A:+qk_^MEIߦӑkai{ Pm{uN(7?v|ܮCvSiT9zJLlF\\dmt{ynD$HI~gZGmЃ.&Yᆷsxz;^Mx~d5 OVeAٚh+v?fjPd7dPSv8>42e7td]/IefWke`k.PH$3jg>$[ĜE/YI}k[C,C3CwI_!Lr~6l6@d lv֎,)WMQ$5j,{84l6U]]]LivCG?VCQ 70xopLPɵ{WM~mon0lF[7L<|Z ڋptu;Ylhf^.]o'T%EzNEe jmSG= ӄbϝ{: Ho 9LhYqL&Rr}C* Zb`Bt/ - gZ|zIXbPGK[1ɨh:s~58] 6 @{ ѡvyl;DHREk.\4 +eܶ4B:!K)Jo/!9t0νۀzP!j#OI$C +DQ8x%iս yTJpC˒Ьpj|xh7†XŜ`R $;۲{[k3 ػPۻ5a}:F<|e&%u=6b&2 sܾkms}kBeJJK%͓$Z`LHƜU`)U: gM^2j;2(l)֜g=>1iSg_;6wb꟱Wt%RCm}$^;wb L ]4#Tʔ6]rJ=IIlNR6&UX?}t c-&3oldi_lm٣lnh!PHSa"Z22bV::`1V[d 9^oI8Ja v( DK5S6j~E(HWDEuP—HWz.:ّny[;1E):jO)9E zExWe^+MsIـBV$[.'q1)G6P39CbQ[ 5vyCזk " )C$ ؝$9fdAm5$: WemB *erRRdRXY%ńY "ro(  [ˢ]uQp{@oyP(P -Aﯪ(xą|S,W+u)WUnɫWūsp ^[ %vz:-{Fx♿ɝ> }_!>%j|{1LcYh!+Yv0oKZq޻8u!fN8a/2 ܜԵѳ:7{MZ76I!`8(_Ժ_Uojsav)>0Lλa98;BڷE ܛYf֨ z8Ev̚[$r{ 5h2$qB=sP1y .} <-kLZLhp:ˤ$;ޮmDyY/Vo6[O{pnj ~KLlY^WSd bּ$|zYt$' n.cz9/?7E;GZ.&dq/TQIm$NΘR"Kƕ^ ૲.ڧ$\%18":/*EJ>tdڟj$@Gp7W@$p` JQDI( IS e A阦g+dVn%Eg U| 
us2"b!3}@<PYX,3 _aI):!qRJK&y.#ַGweJ_+2ؠ6CCQmfsD#[KmM|axnƹ+z@E*Lޱ(#&BQQ : dhwwfc"XIS"(9ͻqLMa ]"ʃFH$)%p]-+gDr˓]oaFfL5bbݝÛz.swN/ҩY&YO 0*Shrw_LY{drv5]o_fV \&0%F_?^l1MӔPv5տ+~URr¨>'T^p<9Hдg^_ z9ob'nv/nTW=R;Mƈ^`/a<--wuyW}'މ:B oܺϮޟk\b,MDH [-;B"j4ق曆g*;b\P\W:奎h{Oni A)G<̏P;6 i^eMjj2r@~ )_?<~\q4Ų3`{뷬i؈UmyXn+AVsKއOl}߮Dښ+EK1LJ[o1/]8hQy-VKf6Mel~oH 4-S^Ūܦ[:B vq&4pщvKX.o7^;T;\.s7 hܻnTm=#n Iaag1&6tPSjxc'qGeX&MybbL~n˃fdy`]GN! T<7nns+M͆v3--pvV EZQ*(K,$9+r _] Ȟ!Sb,2G=KEJZma"g/VUUS_˘-slwң%H1_NO[οóA&15Yfo9r8@a֞'2l V9,-0\ aSBV{u÷V(".y[V kR7mm\g2sB<)\`;t[Q!dtyԱ \1noUߣr~v{Eť^ W6`L4*[Cp\ãpe)B%)h/\5ݥ*/I`#/;$b90WId,xRq2Mzc XI,(/c RQ{aL)o2@ҙIMAG Te6U]ҕka{JkeE&N(eP"H-* cn)c(ˎ@E( HiʺC !,.HIi3D`j +5:r)u9*tW”^aʫ~ÔNNX C8nKx)0'8r;qtv>z9Ri&G c'\f:PjuEE羅 u@tcy9V  !; L`{:2%^s99-z̴eGBɤi%DVߝxU/.ݴ@8i3#d[y8O&1:* VdUp27ĮD__6 i7t=ӅYriCӟB+Vd<KCiRG.v>:@83z/;߹zrwr;5ͳ0 I )Ydd KmɄ׌`*Rn !]֥Ȳ,´Ae QxƘ'O{N\;׷pnGd+;9K>[}iyXISoI } >|&)ɪG;ʗoϟjϔ%9;Ô09y]Z-YlU?a"K2 @gbj&Ą򉄘 ˢK NYXME6q6NUE 2kjfz擧n=Y>;9沷N dIKy`G缑>ȭ#g x+L4Uw޾h1!o-Oǖ~篠+Ǥހoh"''r9'!* ! h%mn)LȀ@W7r)_qBNR_0XC qF_*t6Nk0fl_"]Xn*ПU>eayed8 óM ?ȏ 41isrJXq` ;e{s a}:{tlŵF1rHr\`,yWԁin.p-8 A49+PV󗩋)h~bT dJ?._-lѫWg'HԹqrulY f9GJ rʌw淗T_y7\x}~vxep`1'x|x4_Ya+{)@B6#7Y-qH-nꉫ=M,*I cLE+XVruٶIWζ^7Ε_ڗvZjHXy: WH>h:~ب^/aeof0'O~^a?ĂKzmIW$##Zd!J7,ɶQe$SOGa|zDJ7z|?7״h4 _6|]k] wZ՜b/S8[$& Dy[BݝbGw**? Op?ID2y±`u,fe3ʨ27&&w'Kd\k~Sƥ.vg),9 9['p%!{R!fcGyƶ:UL|{?>'^U-ugsK!, f96ӧI\!kA.pZ'v4No80gFBhFQ ,AA2ḅR a>ǤATbg$&E F "$2x.dtG@d)+9KQ\,V`qaӂ.2ty{ NDz/^̥}GE,ګfM]W TUmYѕsǴ.DH8atuwyޱL? cV?5V?V= GTNc(D(&O %^2(nv S:'<o-Xiꀧ'|9])Uw&ؠ3A~uc?h2s44H0 ! 
\gb xkd ;Aw"BZPL9c!.x jQ%TtJ.%%37x6zo1,F3epVz=}>STHF`C$("@`.dcN9OxfEIρ0k9eC:7++lN!oblkxFI[h{i)yh<~}pv>}~lNol\+\L<rCڄ@wtg]/Cl3ГoWQ< cS3h,ѥdF~ '0 )i%~+:@~KYyc6);h ?ҹ񐈻1^bĿ/B)B-}`Iw \{lu8.g;?MtϧU~<~z[l8GJ]Xr ӑN{) NW).Ϡ DU7mڈL>m*Ξ l B0jn1c3G>р3 N5-N%sWv5鸯֕nV!ح4qo`ee}P4!x.P50gIjY*7ӱ,555ߺmB7 ooZ.RWLTI3B;{"gX.gCl܋Gs:QXmM LG+D55HԦ$s3:LH.^nUjtxxuhz?yy:,BZ%k3pzƁ:sR`Bmb:Fɛcn!7xX0\6v(srrrHr dȄLNh );\00 >LBNEoSrޒg.-ώ+D0Ry ˜.Bp^iPVN3N$H*172)I% 9i> EB+/HG7+)^rۮpW%_E;V0emf ņ,|z=1y֪%Tos[:DUֹ+!TB*u[ܓO(J*94vd0MqKůclcc~16~f_DzA[ό=L6xXJ). NG%C%dF K 8+o4J 4"K딖UL0Q[c&+u"68ƜFt`r<vpvH)E6to&I'0Ǖw2|Whh0zKij)/Ėln%]y^,ˎva\hT"ýu%*d 0D"JA9 lMRYdTI34b4ӳCL蠢AY& R 8p8I-)l`\ $)ЪNeFЎqdV4ъ,hciJfEBZg gG=kf *K #S)bIKFǪ9v̂K8B.A S6Ub_} vR*@BG=iC 7'e@aR,yA֠V;wn hĈG!W8(Fm,B'0pb8&8=KdA n > ߏwYR'5#FKWJ2# ԔƃIei[{oDhlc~0=ݚzSGPdQUyҚ{ã0?AsIـB"Cm|0; TKv)ELfL`(j nE]cg0"<)q`\ (H4h-Sg0Z8Pi3S9ryGShѢ_P:CnZt8^HULGd^ "XaJ̬tL k% e Nݫ@5U#xc{c9 \N00+ù :=jEtc{ccy:]]"܄ʮmn#6R=Z}<{V6hqVx[GJk筭dh}TС.tkRd%ڃɕ\ezg~?#Ĕ1&2#L;.cɎG~ zR21ȖRF(|a< 7y ]@Rdct:9і< K4Y$Й7B,ebIFgW2Қ$e`,1VqEZrW;֠#e"%xI;Nzy0i*|ᗙF,]9 ݾtSiL9cL'֡EN~|3 tLgO}0Ö?2#')-//ӛE7xF?6%):GiC_߮zw/M͠K~DzK^bJ́:?bL}_^ǾjMyPKⲇ[5]Ќ4~vO1w.$طy<<Zz˩zNF"{& RVs 0Koޮu1^=~9>+2w9[߿6k0b]gO˦L+W.RG+>/~Jᬔ87uw\./Mɼw5j f@7}s[o?`՛6op4*E! 
|tu pެ8< bUzpY` wK`Bc S-cp(,׏74O1pn^^*/q%U[LY~H6ٻ4lUZ֕Iz?;FVȪIQ=l nbh4tyNy\ƪvLSz5 S;1 3p1v E3g\.~v6sya?굩bffoV ̈́w6c2CՔMTUO߾\sٞݰyݍW<C[voer)MMW e3䣡x >amHy¢,jdPݳJ qˬO1yjbzi`@>1mm$>*A$f4FSqT,(˝#%76*F0CycWw'e&&}ʪL_o,mF 7nd[oNܯApuw|QJI7s[s.2 B=Ru:yFu/) p2-]~7_&iȇ77'inwa}y$A2x ?^Ywbc m5~P,n:YJls~` \ܫp4_P'C?MpQskfbW}Bbں\ ̩=5kދ9տ]cT'a >)RsѤ`:F׍,W Y[}㿌Mug~ -|,1 U 馛76洵ܢW Mٟ%Y3FΪ{V~hfBKq).q]3qBp8X2`Ÿe$XLjA!M|HC}o`Syå O d:FY^1Tk%iw;or#(:e}Mi7h4o' 3}_.}/%g˶XN| 7NhD(.q^{-7^{eHuH28*˽2ʽro/# -@7^i{x7^{x7^{x7^,,%Fm-˳?JZ$PJ!\*=y;dˢT\?^˼}X;۪ٙfۦZ -v]7} ICf|쵲2T5 +#5yΨj|||&y9hT.M fnمy p޳tr7J\YضXGT3Q2l79w|cxloRJFoB4heYlFGrZRЁab R cxnN7Ҳ5F#SE,hdJU࣊҃/<£b[9+pt/l+e?&E 3C[y'8?.]R$` 6L^"RHj^2X3`%1CRZ oiDrH PX%b`A*/p{g(RDܦ"H&Ea(m͡;rHRIpڀKM@чrZ+n'Amau y(` -sĵB<FUVKhp1^huBOHHК(ç 0QHF(])X]Q?Ӫ*ʿM`8j.F8*0\ί?{x?U?QAxS0㓧Ol\ݓDɓ'œ͵) M =wNNE9>NǞ|\5q/>_ҜÂfx;4W^}$Z GwpDꕺ!34q .O*އPd78E'U0: ;,M|DT_:.0S}'[/7Ia(4CW k/z}?EvDk8(;&!;#9SU/;Nr]5\&p@KE9;pɐ s%YD) iSX~^ SgdM%?Qlڕv7& b)?PvcQ80TŝigYx x^=9?UܪEuz1&{qK}?jA$>XxN4r@zw`y9]J}2lIT^5bU>F-rTeR YpɡoII[Ļ_;oMjj 䖣G5aT8k,lZ2u˄h Ls=,ZQ Q0!S&*FҎYΔ |/1@[/5^# n o*}q?0[YJ֫ThbRM9ׂ㈸h"`  yr * p 4a XK"2̪ )d4&dyr٬w ~Q\ՌQVƳ핌E#NHKE31\V$ATT5~!~2'ԮdFm* RJ/mPq0tARLr4\Ց7/E w\S <|='Sƨ9fV)pX* @@#"5Jκ.>09w5xEJ eIa<ƕwd%Pylѝii3:z t[գѿJkFɂҧVj/KCYg`=U?kun3 Az#P"eQ+#1̭'H\S+-H Z[}b;lGg欀Gfį[poUGc IAAqĥ@;-&D$khV)+I1(v(O P1@ RPL톯۬n5[z*@tjÓ4պFfV|Em9V+r&or1f#"sFzE I0pUgӎ'>'kǦyd}GϙNe3X \ęEVPO!&ʨAfZ`~ŐnS9f,blJ Y0HBH%1,G#=@A }-bn? 
EW=@+l /ē^|0yjꀤ5SAC6 k^T g ^`ddaX׫m-pBcd#1RDRFcBS')׊ 1BpGu90[[]skA.;Yo9͇7Sh|^ABl3 OҥfI՝ ays&Sa*~;M._ۇiVJKL+2Lg_LPoz7fp7OZ՜|)/{@Mod+ğ_1;Uc8t9B/4מ=πf|jN|k.݌)`Eju?} `k/Ë¥Yj˫Zx⽎”:u_bƊf a]oŻ*DŐ>bky AC) xp1}Q3[73(B 2|8Ou#.u^R'?/Ng}FptNTL?Q7$З⣩Bf}jRp[ \TMj2R~ )Xdln]7A KpfOYmi8c4&a*\ >L/E>`eZsA|,\$-Z*0_m=ڍryÏ+:R& ^ bqU!u{ܝ↹I[&/.xW{.C8/z'v8><<6B12Ed:/P9'C7#I+w8e ج!T$^(R*߷z )-G kU_T5*#N_f^nW>yFӱwU/9G/כ2e1*??.[猉@4{_mM+PsH˴mdBxߒSy^LdǤC̗6Xa1}"%f±lJww'sTmOxn_WϗJ7*AV&JG^#C B4/V1%zDwzso|ɛboރoҨ.AxہJ\ D`M"H&6$gN{RJ>(Y>N\9 yp9k)QBgg^%eyJXZ*@}=ZqϼMD2R2Nꏎ0Gj1X<2VҰM\Q[ DdA{1H1Gc:a B#Y 4l y_9Qx#6F;IE8 qQ* c--770ofO/VQT֬ltAd! <&5Ń0u@v|9V H ,6RMdK`rB 2Ӓ} e@`$ ŧ~rU7mqGS\ˌ-2ҷR|L&1&jhцtAUWӉC }=}ݙ]/tSm툣&1oC[ApQ T65b#t[TS(Yꗻg B+: 2rJRP\l&t J*Ra 0) q.KYf- ଵ!sO0 vl5q#E Ζg!KdZ{մgx*[ IwG1_b*mןGx\^ eijn6Pef ,ؼ=|>x)N/{C@]\M N8YyazӔQ H H2]1\I ?( ݢ8*)ңB?^\8,i Ar nn<  ĵ ?b{lzu9nޖ^^g9]pϴ+e;g@>sJkKo)KyKȝ-K,a~*­Mb6Kٺ-g7+>tlޡ ;5w|}r 2xԇdzcz'zkx / zw`UsuSZ뽻!Lmg۞!۶%ᶙ3m/i3'δa\Ou h~6mVeoA CcMHGi<p+~8wa$ptJ!yCtA+1e-rJ Ƨ(C9FI >}>uǃ?d7 LkkAɐK5S oiʊJ $U#IfgN{dم[LDJiPZ ګa6T6j7tVz6Ӯ!Vӓ`zڰ^O~8p')mP`P 3VyJ+Π!pgV!-6f\O 6XebvGL,ѤA 2Nj{jJ5_XM3 ue_{_Mټ -L@Wm퓿8hh>qMC&Qe  ? 
4G(x$*j2\BS؊L1m*F΄\66AKe4E)ce.]"Dg]T\KOq#<]M;Memz#؝l4d `e(|PdI ;p!5yLL|>+m0x@R s&EA$C&^3 #(z}*@Au2a5q._qlx0lEVGYod#OL^JdP9AQp PIͭi8vOxڒiiq# @0~tɎh m𣫣vFn(\8IID*yYp!p,UԙJ]p'+ɸN8KYt8MbȳY`>dOfQ[bx)!JەN:ՇN|}th|}^T7aU_2;αEjN6EL:Z4 ؼ};W!|c I7[ -(SVTڊ⠠O1xFe9,2%@HxAJtlJC'(#wla E0#MNg\G@TYSE]^K`q?T!m-dqxw 邾w[x8l=30׍V ǫ/WHvW*!u<[iѼP"Gcu^([Ko*A-n.O\WiY$cYl2,tuQˬo4$e^T]}x2zOC鷗?<}<@r(3ג)E 0ÏYd4qyMW.VZ(Z0mG86;˜dTâǓukΌãMlϣƽ/tzE'Ř1n/s-̺w-b˾hCaI[nK9ۧ#dhqt48:GGh〻 U Vhqt48:GG:]>uGf8lP`j 3ĈUZ *jCw)9}Ȥ;Va!Np@yqc/7h"D*)f<UX-B՘IDk1hFs+%eZwZʗI V(:=G ; 2X"M7[J% Qov1xCmw#%"^JAZxpܢb*]qOn@p|6 X&`l"2W\AiD:Axh]Q>"\fi%ڿIQ* s>a, JζޖUn̴b t+[!SߡLt+ѯ{Qv?ӑ, m\=Pm{ͦkNoB8 Ĵ¾ IbGDd{xϛqKnztENZ.:ʠ!$fwթw m؇.&9.ǟJUt{*]',~ͲB(C:;5/ _y0ݣ敒d4Q_tryPU[T<'U|GlzG֬_7jL^snm"J?Li261W=mѵG!ͥ42g\y*r0i0O;(rEHpp VX5FQL36x\S(s=doP't)ˏ3yhO6~c:2q$TB)A̙ ODJhlxD28e*!IJށH@.(q)4v˘$S{+%ReAm; jKdt1m'&^E%m$>۾۰,s@$jٓϊhcQm8`$I%" [9wJ;况( #TOpi/}Ԗx֌u~ViM3Յe](:]p6g:K|{{:Oɯ f0}?Yc)-3.%qt`~PgCHi#P%K@6^U4Mgl;ҨOY 'ް*qFT-kDiN#Fʐ'D1Tk142 8؂4U p$ l:"^2:łlˠ@ !LR#1s)To՝5⇛KmA/5s|"G-EN/)$# )\Th%`+S@F2JE0|,J1 YJ=aӋOE;NDrt>|"/q /EsE؏_Huz<Us\И3)Li܊6P(Q%t%mEkwa{{`m(LO,>"/>(e@^KǬp+i 1b.yw+ɻCP끭SO.#fpi[?Mx'=H~S*L1EUb H޳!j8 YiBD5G(W:|Hc]IwD@Hd 3âJVc  B' PM'ѹcR00 UP֝m)(#+B  $4u#52FW&DbDdDH+m/S niA2ELY$`8\HAF $Τ~[Dk8ouuԮ X$4.1c/ 2rsN,E4!S*. {jScאك!_9$l_pC%0 Ko2>BIqۋ}(??O:'ץL6Vvf4|[(bU< }yS$OҳV`$ "aHozprQL]EOb j:?W{+:mq !r] &J*0VEbv~=*JrKw }SVKeCCpy)}$Uf3_%9Mn,P"s*+Ew UNbX=\|BrWϟyW|篯0QW,﨏'1-;(Zơ;'Y\*7rx6<[`UG2K-;!ٟ6 dvL¤xW?pnҦ;xqUٰ<"{۸.]eW.fsX<6 IURDJƁZ)U@ĜEj4FkTn 溾ϳlK-oC6Sgn°#p&PB^&}ͣi髵mn80l4 ;n \b;@$W,a=uɃ`W:3i;(xm}-7t3=מ _`Ŗ^AV'|ڪ\{BP>Y|^y-񶝷T3X3^` %w8 zP_ ޶CAHc$`M B{ 42%"VEL$á(m7CQ{0rHRj/52(fT q`:-RLpS)OpֺyM <9Nt\k :e碑GaI#:߃^DeC6YȚ&qHBM6߾),15Ѹobی&JҀ+&, QV?ˮ&a:OPY EE'v`nˊwCeaXb)`Tп.$%U~M"GzcAV~f?gJQEEz[ [2jU\_f% (Pݡm(4[^iucncaBid,5\&p@ME9;qɐ ~uvT B{PJ\\r_$C~|MV[V̦I=[!m!`4%aXru\" z4jú\Vh>S=>_h~]2[_ +57_К)}dj4K'X%_9"A  ^ ˄h Ls=,ZQл[ގpNrIaZqa1&2l<:휎&F*єs-8[:-yϤ8. 
% h\".YbwmYq% XL   , )&6)bؒL,*IޮsR}K""؂%6Ô-3fiqTzd:F- kj3Nʠ}95{=[UG$cfyb;VH}!QJ].$?V<}#0~`c^j*ȗ|g׸ Cq.>`kO8Z *~Qe+/?p?k7v>ۚgXeVhXƿrjF l,̹\3qj w%90n \ZScn{"t}ekWCx _w,xj~^4~q2ݸujkOĿ_[!~?xZ-x-YM[2zuxgTKS"J-xNzl0֕c~=o3l{.JԆRqH pyuZ>[[7kƝh.}~vwStUu&gӼ1 Nw,N 6nSܪL1jχ*J<6T!l:U7*[4NLy[Iҕbss;NHOٽOVCv3/O.iW!\m ]>CkJ /Ǔ6޺;6NN:rN[ <׬w01` d /Pؤd`nd4b8%a7i,X \Ѭ#K!yx)(`~C_"AO):ȮlB)vNś}ϊJ2"X`p-iD^>T%@H/%i $D4pP(46je| $zbʰ WPmk?Z6i4֙p\lچukk/G1/Eg' nj1"Ee Չ$%rB@}o2( ߹a%tyd:Prt9|Hh0 }qL΢&icYN k P$RDM+ʫ2 iIQptaXrxF />:>*: _R˨z›q*pt/d,XNU~T}%*y*ΈJ!m+dəj&8x*L{^s@ QwK }|Zb@ @R{pU$\`tB0v,p da5(ɮ$)cAfVh" aIVt- ^qzPN"<)Y cnXTי5I I3a,0Vi" wU&PB@F:Cє КKd/tX3,25VU/"o R)BM^z+"t42߬t Hj/*kU 0dJ-K6Ѧic3ϗvӛSGC5˴,AR/sefF nfV"i`(tmLa F4໣ɠMŰƬx5 UeT$ҬdHrө K]0p%GD LBFq+׵ī+zG[V}C(|NV$ qa:h0)F3yUU Ѥ2P(i@uhq>$h)ܛ9{vItg='Q(//{Wg­ J 狳9Cq`) MU5eАWrzq@n9`qy|<~rXlk7ޭh*~.3^RGzz63'/=+Cy3>Xo_5i; B: B,mJVDӾoDtgIa'w=[q:GNjg7m'$ct.celM26]Ʀt.celM26]Ʀt.celM26]Ʀt.celM26]Ʀt.celM26]Ʀt.celMy26l<)\\kM5u\Jo{ g>-yXꛭKDz7_fhv^:K5eBLCmZ wt8Mki'ђ=}jv?,2{X!lٯO G)bͼ>:IRE4Q'SSK38ePJkƊlldsZ=fH1l;ڮ7/j^\^syxs86⏷=6!v]\uͻo?Fswkǔ U)! ^7! 
A w-+[ʳF۫g̹=wo m{񵪞oOk ;R3F) yq&$=n 9wBhbWtјNDט]ckLt15&Dט]ckLt15&Dט]ckLt15&Dט]ckLt15&Dט]ckLt15&Dט]ckLt1EF<# 6ӳј`s= ^ѕkLJv1!n~xE-9狣r-UfД}1x@Ozu6I)1pSËPwhX|<:{ݘpxC`﷞ۡ6Ƭc k+[^Ggqq^jk_PX;ӑ<_%Sy%'<.ںƛ`եR ^&&'H;:Ig[Rї9xw&y=E7epb$f .f A3b]̠t1.f A3b]̠t1.f A3b]̠t1.f A3b]̠t1.f A3b]̠t1.f A3b](eyNb0ظ#fsmx6b'/f+Cb/Q@i38nظpvayiydvCdʔ8Tn.o@zV70{XPز_{`7/SF+Ml=qY^ܰM]OU*͙C`qֆkW<| 6m0Xl&.LvdIQHò-Ymecn6YU;_Wy*0MQ8j̥&*nd`)HPx0u#a_ndF0b*q J;ەx'sT 9i @}QWݮvaaMZ'1>3 VՑ#pV\YZn~Kax$ ¯ Ea(,ŢXbQX, Ea(,ŢXbQX, Ea(,ŢXbQX, Ea(,ŢXbQX, Ea(,ŢXbQX, Ea(,ŢX/fh& DJzfBMWrW7/{|3ڄ= ebSȘg fcafvĀ.j >^MYԆGӛ/w {TJ}L$kn2`tΏ+vb#()Q 1 $HҰڬ92ffLdC&$s}X_z}hհԾ?ْ{z+_]L\} ɆۧKo^}.qӢQr̝|!veX 0؁_@3' 5,KaXZE(ږoGv1 ubnEȮG2+߂z[bLtQ@xp Y8(=2`hSa b,+(6z,XFχ$dW~MZXpI6MH`)ǥ)[ H) v*Fz {02| Yk h̜uR -<@Tӳb؂6ר{an⻸W/w @2@QEbFD8Fb_0 ̅hMDDb)q Ehn@x>yOnw+2.OjK(1Ť 0 $L\9A\*Pʔ ܡh4!Zo&"܋EH|haB0|IG!Pnxt(V^!y |@/z[,z p WuCHޝu8Fۭ[k;#ZJ<()mƥ^0.F"l_{HOԸ(\T bxdzhzynGL$}t:Ղ2c-B*pg6Ɓy)0 锬_,O:{ $I3Ǐl93[qJJ Y0H۬F]≎1,G#Z>_nQVYB : mԝ@QnϮowC'3F<)fއ37T$ Y :0Q^{e &[ci1w7De+QccQ} pT}읛I7?gWhhrz?gQ^}LL}kOY%Y|dM_^dK[mj‡0΢3ӟÓ&?a 8UBzMN`:6 Su7mujMf^'ȍ?iԍͥ.~F >W:CKݫ|5|wiC*/f/4E^ Dݰ?X; {1SEW/_V%|]1gԎDIƑf8zwdu6<<lX S:M)ljG2iIw$G)48)UoiܛB'ߚA.p7KA$ǘw+7WA/?oApS8kC4m/E*,́Ey-C`x#$Q "qjbAYޒ!% 76*F.y^@O&;%Z_M?zy3ƒU]̣ۘ]}QJy7J#e\ύ; 6wt綀Ye@G`TD6ڥ%ؔ wVnN}"*$f+ 4J`[q`E\/x/ɦ%A6o~+Q ,9z:G*@ޒ o? i*+7I 2hQ, xPb6x$wW'aq&]GrFdwi \sZ! QQjjUz؈ JnE܊c_=68U8$yB9G(FxpR*:0HSba0gEXL(QmGT6BQ0A[R. 0 1BlG`+U؉O-qQMay]xٜ6>~gb>HI9^iao9gVlkY0Z`pjᷠ/s a"{FX4()xl*H$`6#7!P*y6&NFXeZ/e(wpa`Lp Z4Q2Hn$q(JzcIOgY eTQ,;Go)M0!Nie~F/f-5moǿш0߿G@xӅL!K^Cu<jˤ&te Yz !Kl:ԗ/"dpm::FW1#Dm0ԤnxjHx8$^Qxe"Oc*XRR>@Kd;C+hݯN wVjS8 WT ^ѵ ,=M@-z^4[g\΍9qciˍi(zWx=3vi9 +ϋ?^UF7S j4 |ߜaY!54LO^Ne.5AXz<8M΢c\ק8Ř;⼜E-ohX~7P=ͱOUcx^|/r93С0if DcrDgftjp `ZIk&W @KEV~u}d/9v֑R3|%gS4a5ɘ29&y|eS:`쐮-&2:b v`;G6$'YGf=ŅẓUqL42 ܜ!t#4Ɨ[s64DcSjX|4Oo #rҙ1 LN& agL?w1M gK87#|pim-hu:>MX3((dQz3v"*02\5vt>'(n^wSA +y  t)Z*ҵ!f040v rZCEzWƽ{4x3Gfq^̏wȍrLdr'i`匦"_WUu/#$ՂZ}:m9k>i=%wšӊvkb{R;NBnzՍ/G+L#`T70I逻ҭmA}}ӳP ʹE~*H7ݴü1`ּ͸]p m F! 
7z(ofav Du 3w+̣SKQpkXdkFG %9[-(a~$;%aX]wIݻ${NԥllAwO^%`PCR!~`ֽ { U( CbfK$.{ JrD bVË`vuLvKO!grFMaV/-+i0 g< ) lkb5`r>ӒpNv!aQ(N#? l}|vWyk2c4~?ClJ{EťMݹ k>+yqꌸO31{ʣ?"^t6Ϝn7vNMM{?c8ʐdZGǫ$qF pߋ<Q ̯-14E;?2YƓi៳Ơa :xy3rqE߰5S{Y{s{ߨ9!O[Uu&62Eʼn!~5?c| pkl %aB0ނ4]j"CLqOF{ճMҜFCW%$ٍzU9% O/Ҕ LhLo$۷.0BqǺ^0!vK!wgxLnߓ 1㤕a>a)̲)#Lw vH!^WZ }hCf iSSpX|wSb ᡍBIPsU{L{^oSXњv5p)md`QLQCKʸ6 6E96zN[bvcK+Z]ᎎX=WBJ&=-q-`œZҜS3|w͐6-c5&%IHw%ۡf@G+oY=%2dR6[h?3ñ3%&#T*.l -f1n)Ѣ p6t|#vnު$`'Li~1%1sw0Mi1n5B)I:9No3sȇٱakAKbpAy{7q HrǃypCׄCN~3S p1ψH CTPj&s*AOLCОT0dLeTr)~Z<x@i֠&W\䪮5:?xbxz彩ԚϱE}Eߦ !E ꮪ9&[ cur0>q7) ߗ/ۍVx%nI<=@ߢӂT,+/'uY{ya8`Ko :ڤVy"'_^ {'y9--*G>\*^Ai<}y袟C^1%(|΄Z,rpW{ iw&a;\r ,U27|F8aSrpn.Hcq^,d|3J?RndHDCydCteyŰD򁆽})*9)f#7$o>rb`9Zkw)ǖ?]y}?9O+MqR7ަ,9)/1G3NyY*1e'YVdO~ɼ^cE"̫%̳ a8B*kd鸇rYu)KiZnNI K_#hV6N0`S,m\frtd2޲Y 6EyM^0nHe e)+\NA-G[{\I_?aδ[uh߲}12/èӑ̧kd>hzGn׎x C { Ƥ %򳾓'$ Λfk0^,BM/%K6BdzMBAC^rQ;;&O-&;X$RiVP̶:aGu`~lO8>!qgYdФuÚSiTPy04 O:: MG&OǎEx~48FڇR ۧƎ]Aħ[ԃ˽,lD6^Z /%;$'Y(yr5IU0x~Q":&95ͅE1XwN" x)ǃEɟ4+H\A晎}4 F#9`aQV4' aO!$F\2 d2P"Y+xNcsK)tL jo,m:›v`l~Ns|ӆo ףʴh"!`V4êb3s $#j-jg[VR>Së88ϐ85gȚ\.߸]|M;W, 8!cѨyt<ֹe!kLaFcb-fQ1ȍBga,Xt<99b:dgiRI:W?#q4n/jB#Qozc 1٧>3}ק<˰Se~r1BXc|(Jl,͗!g~~'Eo#M+Ľ;?-++;@ۿI}t8~i Kj. 9W^Z4ֆ<\9qt~cNed&^)#Oy1c7?aqLU`0=02r T;^vQ`fc :(35h;\ߑ@ u$X>~@< G[^r't-0Y:v,ׂqwB1^J$`2/*K+xS֘cN##/aD8OF9EgcE-BV0Q6 Td u!|lo[\EmhyAn!hǐAk։bXNMߢu{>}h6&m///߫eN/~tZƪs\TҝS' G!Zo х ~~Nw0jy +d}5ŠCS6ڎV,Nsn\),^D9Tpkuʵ3*69PjwN!gdm ad~7Nfa_B&p9b}Үϻ۫a:+Gǫ$Y? 
lZN _!k>Oa>ͻKnn2DDh2/oU%N4/a>5&e5,pQ~=Ma̩wƫ=moz-|0M5l+OɬTit|?_0>/IKԖ߰icSWoj 0><ڌqÝxX_.LhE%04 Ns="{9N\[w_0㜷Zܯ:)eCv*ص] ,y/CO4\ow`Pў{ɑX@:E#U!>FLz/)Y" Vߵ4 oq${:ުXOW ][/}!@ml$)-SmXVV}nGy}R8/}Ppy^6PBΓ $& R!$ 9ee2lUᛶoqΉ$Iz>Gk|AbLe .=Rl#NӢz@Aʑ|Cˤvpt,4 ì D%1/oU X.,\Z-Aس*.5W~p[ZIHq|&oZ-3 R UT$Y 5W.)Wr$ 4W2o7o=QNrcǢu6\R_hHo4eHag7u*f|w`OOuy2z6jm\,Z6U܌'*<,ui]>(l==ݧ?<;߿ٲEZ/ TL50WrJ_hh;B= /~S--]>.|l\*!-j2ƷC[W<4~k QN^Nw ;Y=#Cx.}wvһ=UPٲ)e~y\Ǩ}-};sNSZ]Cw9s L+ ^A)S=Xq[w]^qh>2K|ھ8vo܁m6aRi FHQnmktn:ˋv攭ZIUڵ߸rmgvU0)NDՃ ^U13ԌY]L#**VcMN<'heVR=F䊶/ey+Vv fYZS5e=Yy)3)7nFE!SWVq)$\(hih߳/P1%Vx ,gi:)$4oE꾔t{40Dn\ چл1Wx!GoT e](`laaBM곔d C/eヤZfﯗ>Z<'H6c#(rMjgK^܏ؠ^snnl!08 de7It*\h+]?3\j.|ɏlνa|n(( iv媻 Gq>CQG+'! iaϑ|.ɤ͋)÷BQ yulY_DF_xo7llnOXL`Tů e4, 4=-~h'P=|}WA {@\P1 _G QB%7k;c?+웁$2nJàn[N{=XZqieX G= S0v\q\lK&Y ew T!((ltîp5E{ \X[XvQ;׆[ơ!b":E;.yi~K4ִO+IPcT,'wE&9o+c($2K}B|3Q9].a(J`j:{1p;oO%.K {Rn؃`ƹRʞ-'bչRo^hCċsJ;#uYLpnD6^1gNYÆmSvrǶ)@ er7 6xXu}y e@vO|onjY%Rqb>d8xJ뒙nRn4:p>#۲ J_='VdI{c3c,C։ҵ+2hّiVgr:D]v (.oMxt,Ycc%,}OѼ6@q&ftڪ-= <{EJ[=L4h&i鹪$ӛfWq z 8]Nѽ_h@n|m=Nk"h(}y AVV,mP*_e3!R> C#hsZ.GRrGsq^]È7_ ʲ-f }Fc20(Džsr7ob_6^asiw鰄@L S جC+kmHx) Y lk6}Dt!ၯd)u\cO*Il!u$т p"cMZdqgF 8VXs)WgKbTehKӒH/Ӿ}ONfH qr!ynϜKo FdiU8BobQ͔m@7>QZ+ j惈])1jsv=A@B[yiɗZxz v0f9:֝ Q=u[mۗޣǃ\q?hpuIJa#K!ߟo0Z!42jRƔQ>^nqЄ~,>F7˧4<`0E]Oӿ\,n_4h|ߺq1Ll^fTK0j|n?{z)x4-~0iV/~Y:8biHUZbLUmR%h$tj[rEtCq%+H/FpDž(R· +O_ԃp-EL-C vƀvwTjp-XcDQCYS1uP= MwӿQT&=W㡸BAo!7(A8uNkTF4؁RN0}4y("'ϳȱ0{00d?*Aރ^}~aR\uEM{;d:2|A(T;5;Y9rKtxKH W~eэ]]zP)v$@"t{Ȏ?m^ .[<=Qɹi.qUrOL<޺TJeV4ʃ_Gwo[IJic3Qg1ňߛ(|ׁ@J[7crQzQp7tٵST-QooŢV-SxGm:( κnBP.+ʁb҅ϸv8`]*h- _=w&;]{/z 1_l,WrVn`9it MmݧF}[FO|6|=WA(X\ɴ9 YBɦښaCy%PUp58e.u?we~JpQV? @~ nZ)HCRLSInZnRMx#iu"dظ#yFA! 
Fx8"pۣL8c頨 'N[pIMG]4 S$h@g:<ˀo2:ݺ4ew1Bthd?>]19LE8iB͜d+1bTcj# [F>Ⱥq :xt,rQV|WLwfV8E\/;{q6;$8ɥ&GM pHk;6}Xp3>$DK>Dˆg~,Rth5(kTS%2.]|_pNZ0-֨4@h Ijn?"tzr4P>]O\j(A9$wX Ecq]u+ڨ`ш{Z/E޾B!0d; ߸=MOư\2$TPS@nJ­pU 1%R` h)Ek[A2u@T*= L0VSJ߭,f΍w^}[PZN0[A04tZE=tU 38&+2ihvJEShW n'cÔ|cf%4V9?B d.h&c`C$̊ZS5^'y~yOKB(yp 뚟rD֓bYhk9b^TOd}%]Z$t6˓as2J)cy@N0\ wBgpQɴV26߻PhV3< ض"6Խ! %TdiCl0kbi[T?*' gtTM!%V!mex4zU5J %V,d(.\I 1[” }ih ;:TF3"C;$6ӱ_X#;iG9jX1)T \RbYT$'R|/?sIA}+"Ҋ^B}- OWN.MVG.lxpH FR-4vn>< g e4.gQj|`8&SxXQ趥%]ey 3Dl3{d+sg^٘ʜ6G!mDo#_RFwXy{yIHyyDK0s$=[y vRPĉ2H!YTSv9W2 귌#^ǽvٝ*ԧHO1Sj>A3t@:`9Xy;<}T~+krlFoX6\&yndIQGqy8Hob EB)@E{5tX & xj0GK';}”]D~=>_i/#u;M5SkI7SQtv5FRz+dp6Ny“DJ j%}.GKd~Wl;b%n+m)/H؁7ǯY'LӮia4:ǫF)á?_31){A.ZIY&ԚsHRMgiHhB#?ǣ;BF:C;5b1v %6< kW?OthO'氫"WB˛YiʮoN~t[lS_NbϹ4.:Il[<&dAVZ (3t[vbx KXq[mFrt8NxC=&R Ή bCqF.z4^\R/Rv`n4Eru:d9֡N ]|Zs<xuIo#f@T@KgkWq mxZP Js"˕$U"6Ev3d5+]B5B Yب,CHdM~u!B_])u k\F<(Ƚ۲ǯÔ>ѝT0`wn, txuR'8hIױHG(E5=sVyD.6υ ^Xc,#7ι1\σ`T,XN/{{@T3T[j:*uD7jORVL:裐f^5`EmvG( 6XA tS2."r)kiV5(RSdu!E` &9ǚLU4Z򉎋.eժ; 7cZҋDai :.wQă dJh\)[-i\:, !CɰTX[~yRAs1őz)V%o]9,([WS֟qbw>0 Z xЖ* h\`u@1`y^zM DڶCncqA2ӑʙIQaHz8^{Dx'Fq%?3F #HS1D ͂o>{5K9d%2@yL1p&ۚ`]CH_j/( JQWO 0ͻfEBNWIA?YXB:1ZT.cW4?X|VZOe_vPT6^I{R*6;$%}ѻyޯV)ɧYTnjIm4m].M|GCk E{WLI]M/*{922$"tϲ^j8#F6Y<5uƣo֑%QSUwo{6CNV|zxhܖ5x,쪻`:pHHZec4"GAr-cذtj]5G,o@4Ŷ)S7CNwp嘢휼Z  HQj9`q ;reȈ"UزlcͳR;']y gm:~yR+df-h$oć#@Fq^cVRBEF# Zº=t KՔX6KR>}`?l48Wln#[5?&•kx6>9%؃"h Ru:k-XQ 1_3mD`3SR!Q)p pK1="IzZ _ԳOG1BF 3p!Y0†jXn˱lՕ/vp);rj%[>C#09[VnޕalL0#rk3\n[(T9*Q9=Oyy[aeI|8Ds>:4ݲ="\#{ q\QA;cNCڇ{`aRςp^R%Zxc?sdx82hb&@Ir<(r,ٗnmئK[|m6!鲞FOt[^h:$\Qi?q@^^S }€Y!X0"oZ7ė+Q<0lLрgU!_VD6Y$gf(LEǥKfja/DDjgH[ J[dQ8go^+4G_uV_HW0=$l}yuJp)="WXR!ɾ+$ZxڜxUdT6Jbe(˯9BER"qïz9 I0]&GZo.]Ig(Ir&Cs9{')>ER'碷zqڋ3h/²El{NDbf>'L@/y(y hzKq+d\=w`q6Z\AZdyˤ3_N : W B&Z U,o0 zB|@_k8yDM0 8%+۬@n/a<M{v_׽- '~BR#D_!N٤OyDiN9HF^xpdf =&x01_&bP,BF)e]V̇KF_#Z5U!DZ#K'poϪxXf")L<Ȉ !R ?hЏzVn`{{S Iw/z2)c.'y` )q1JQҜJWwy d.c`&cYNQA4T5tS^ERN 'RJ1Bnٱ+kSoVSNh9NMxV VYwM [FFT1ej{>R^@<-%Ye$qk +~XiKM={Qtn?)Ax/kT$UTcTm5yFkV5 r2{GV۞/ 
ґw8_9EdJc'hِ$આT^x~65J*d\#BĶɼd5F49a|sL[!Vڴ&)n* pU@[4l{ _ 1zRzìMS1hpx|2%jonY@H2 YPa{kTp!bk 5Ap̏ߡ١ u]2nm 1 ? 54?\BU7&{_|Qxx; 'ꪉm pZYҵ&f\ݧ#VlBX YmI }#&%W}US{š&tx17Rl@v+iT>`El%(Z@%ꔿf*b0!#JuKT1 nlդ(1vl4Γ't'oe FlQKz+ MhKU閪f r7I7>@'0m.,xLݸb6pӨe[ gxܹqѨ5r"AǣSgPp Rxb52r8h(OCl1eD{,yf/lNOl@P,BzbbI˒q L6Ey==ы,fg*7 oVe U`+Q6Ie.h/7 P7.pKIJĽw|GB_ 9vĐ# E}2I K&8 3`Y%:݆#+z(E8cZᐘjK)X~%jtp6ΰBXױbLn*yJ%D("I6leGœ,+5@>ڌVn<.lRz<? 9d3IP2"$B@Yp;#BFn+"7- EWBK66.rDmOE%|E_(0DvtgTMKYʌSZ9KR{b6a(1N0N&#2 NhXJB*E'irFWց[!]mcjpiJ}Gf{Q//\hP.GE+4bd}bA Ѹ)ol@yjj*;YI<4щ%ɑފbr2[|"$B |j#rOC">/aLB00uq%TZƛb2\|#GeR TpP9чURC7d0zi:1I3V,-D:1`8i~XwCUCRKSRl*+qޞ6M> ҬG!qJ0U$DI1r{qTۡ7y6s,_E#1%N!MZNL ޔŸ,TNE16݌nXAgĵZh5u8$X9fkYs$d_-s^Ի#6H0*Lq:x=18}yXCoPj9 `-'cQDZX;>Kd4/n+HGڼxjp3d9fMmK-\ți2~)>4E24J?/J4*Clp< ( ^X d!޿§~ot1 =] 2֘^0tư]՟K)j*N/M늩n߾obvMYSV*2o ⳵r =2">6}4Mrӗ)[ݖ#Mk U*m67Ke Jh |*uȸۇQUb,5EV8Y6kMRJ<.dEۼT1=-\m "7߂4y+݋m_19!qSރV_ZLz6wK[K7Bo8/zR3-Zt~n$ Kkҍ\9<΀ARU3 ]a&eGN^.PqXtq6Ⱥ %!Pn*DN9hGҨی.EF&MvDU5nt%}-s-WC0ʭ9`Yz"VkV^8.Pjn.^zurAA1Mm:Xϕ}Y ˬ3eқ۱O& ξ^V_*C8&ʇ,޿'{x^+fAi/&MݧPeF;mr7Qe3t;m{ʪ `Vt3b7_z6 = ꆢ#Rub$3_Uq7&5k$qXµ#!AH0`UT&,E< 0`DQmVڤRhPbEg8SɝzJR!99!Ꚉ0Ug:2('EUp"nf%YDm11N x$҆3d4 \.6 ;+A|E(oTF30ň9l{lx8!(sӍ2kq 12sǬi{e@UC_dD2*5%Hg/Am8+ fJ 8Y[#nGcHz݇Cb4F܍;ls+ y]܍1(DP@7Sʛȫ V 2Q]Dڔ0\hr 1>>WgArLQ!o${pfxpz9緯-7Y-ygCx?l0?Z $⏔1CK>%ISl0AAiƽ04Qr \ٔw B^¼gzHr_2;'0~]qJRx}RUIYJ֝  ^QЅTlcZʿv/3n˭fa8nJ&N~c[eJM4o8ll]ـ` ׫u﴿ {6HykQ֗WzNVe?.G֋?;_%-yly1g{Vo{ٳ5=*|h2e4ۘJWAVއIRIpwzF2Ȩs5iX]}(AgPWu!l>y&#`xPj~XQ!J@bx훆܌xp;${AlG %|oM4 %(?"o-VQdIX Imb =csOۊރB-Z;  ǂ;PA{Fwosbt^ \:(ؠdouңq(i%^CuCGPɄrMEs&͘)\//9k~d~?NhbkZ1脗(ch6{ ٓ8naem]GׯD8 P3W?z> @n 7Gzz/#uR< 6)C&uk~dә}jѤ0 OqQVirTוircVLVX 1޷QZ3^Oy+-ɒ~dZS^˫d3h0yH"tZXzk#<>d{(MHIiG @5uL:M寧_,;vBPƪ0a{-QD V(^GV,d9(Z{$@w]YX:V F걭SVx^5= 4T^P > V"k@>{ gu|]*5VNZTBSdr QRzw%4p(+8PGgSCU%4ŕ8TyTV[PD I۳8Mv 0W8׾?H6H6iM~}vtwÓӳ/\_BM @ >ZS1_O ё\;\=壟+H7, X6+@B=}BRn4q+$A(CEk(?t~v zykkrKS= ;Q$;I[Ʌh>}6@F/|= B/%4^= vHTUm4=QZi4c{;v5|B+}&-Gݕ潫FFz|gEʨE- q} 2-:G fRp@X]X8dϒݪۿ\U<'ITj;(0*i%QEƈZ4ԪȎ ~0tfڼRI Xc %@d 3s)a \6Y\2%1~&B)A T88 
ROv@:kקk{/\jmThE( 7Tv&Ug*B.d#9o"BQRC2^crm0l^!{yxA ER>H;?o7rXB`O"A^JMxUZoiEl8d$`5{ 0`% ǗfkX{(]0XBXWF4@>Dyq/8HpAw.>4\k+4Nhs^dB"<-2q5}5g\m{T!e':rƝTW-OyиSz8ȹMZckހtjN =v;w@6q̈U|?k O[M-,ݫQC.5K-eJFj&9Zת)KI|UŘG,\y;,-&xVdc %? ʻȜ4qo5uy Ky{,~/VECX|ӍS`>ZASCB5B2<r98[]΋=wxm )VD^!Rm ؇B6NhKJ<4ŗ a14 ZP(c!KZ5 @:xH.K^jmZ3CԤK%% IΕL.0Rkh͵戨WM,N1fAl](9^]d,I[_tHa%f,5LW^n)᎜W,<'bcqVX=<}ؗxmHH Z}fYeϊDU 0C+},TQ|PH9gMIU)αOIעMfZc$ب+عvH0t, U,Zie1ްlB!NaHxde|Q :JTA":4|cF@.VҬY E쒥`dh>q}{9{SL`=/Szfz+)>Dc%[Dv~CLpp-׉w9gXaE(bj^S]xTU<qg!NУUr HCgb!ہy}{89wmѶ.D+@yAaRGAutPn U>7ŗn]x=9Z;ʎ^|&JRٱ}}`%C++L>/иɲgd{ q/(sw& 764ӋiiESuO>it +,O ˼ܿtCUOW/$n. +>.hV2Wa8b-01%|+ٌƵ|@elm᧮Ճ֪{i846 9WW#Hv 9` xK%4/\euR&tQAxr4ID@Z[a Sԭ{VEB:U^h\)lC*|IӴ#?tgCxdeƲBJ6,*SdA9ѮW+$[w:g!Nv.O}8Qy}cƭ8]MvE/4j/lYP-=#h/3WV&)wy^}h-Է[v%7@:4=JZ:Ic7{qV 1( /-J$ Ѵ[,qZ|S[rwz't>?CmBԇK5==G^ +XIMrZNc^aqtn.L+Js{Ƴ\\YxKP7ZY{617}2૫ΉIׇD=ɺok kǮ2}9XKxXa1 Lj%3ofI> $# |"(Q:x^=hd(GIU)$ B"rȴ@6D:YSEN!;D喾yO0\,ïP-2ɛB& A k@GD\ԜѸU[VC\>~wR:OYJzK|\uŒta}>\1FpW~bFKQm4b7͇<9mrTGZg`|J^{,w!\BAr!A=d)ĶmIm.HOGa֧k# >^ruY >땑1B>ߖh9N$D+R y96E{LAKe^:})G8O?8Nj (n R6T U68>%%#'cIķUVՖP>$wt%Y퇽WCvz 0=$`l<*')Yx.yϾ|vt~Qvt~ځ~>~;vߎccCdlVv`Zz jk*j0ȯ2qv~Qv~H@s t89uExv Kxf1C䛂|k'M[<&ۛ_IK- 3~aN{ SP%y^1@y?*-=/R5JaS"Nը2>00H(UDT)A0H N6ɌS1Rd]ۓib: IN:vA O:cـ.1pn1nж*Vj7ZX࿼ $Yb5+Ϭn9F!ATn hNE .hmbUR}x,99u]s։C<8ݒ2eSUSE1$BpYwEcr-\sJ2ed^fEPP *JT"y)dl@oQu7(9x˪.D-K.1P]25h5|X/_ \9}eZoC&Y+zmZX7!n5YB̼諫*i*9&2u2ɫ\Xr4Aޞ}6+Y^TvmO_n(UVuSw=>-A] J i"քL,&ƬS,kfW]kؿ̾b>QYWR֠/e'c,oH(e)YȦT6n*x-јMWFyJT,ё۞,7I_cڛ%T"ڢ>H,ΓՒ`]&Ap$o-cdBX@ 5;F7*Dbri$e8yopvF[dAN u b#=hjjGçVKr;pA*T/茋r=L]oZyUWiᢥ.ZZ]&zPEܫ3(>o Rb{0>/|C)L`XlF A)R%/^44.)ׂY|meSU*ydQU2T&cY E 6rj>^b,Gsj,ɷi3< Ձ_l\ߴD.4jvRY%,c^w?X|sQ^؋bw=޽}ɬIo{wlfǒ׿4z^Ŭ*-8"z@߈Q։.զ{_-6 2U5Ċlʌ{bޚcTfv3S۞XbCvoTKߙEb.4\ID66HOZm|`9+ϲ1lv]kI;JpvM0}}N4V >IAf|MlC`֘)p>|)}5-N9Y;5{fw,o^rih^? @9:D }l-T햋ahvͮ2pNiWlkjˤo@zAQ2έtWY'6/-3N Qde. 
/c7Cu_e)=H{s[,~rq98l=qhEJ9gg:T< t ?>O]CаPCN?"G -/]^:fkjK;iv~sFɊz`Йme~v~Gx.ztzJNya;Fr54̈́^͔y*82}}W YR[MRE٢U?~'c8Ȁ_cV9#VRqBw\- I%JEj󰝩JXdϷuLU*g!9o+AIWmS'Z0ngE]L{C;|gv1~d+͖0F#qC(R[X*JdR;ޫJf/kGuiGod xWƌ4`hJSq߂1^Ek+aAUPZ=,&x y̞["rBɍ֏/ҏ/Ÿ\:3Hhke1Ӟ^3#] f7 Tr6rRG}kkVR4s?+eҿT*cÅ]i|2y׻gQp|%^ik?U ) ~@oo>}~Zro5S =Vo$.uCA~Ǵ>tXjh԰?iR0{V"iy}^o`d^v;:z9} ciMp;V]<^9mBd+Qn]U(!i 5I*l )֒Eڝ_agCYnӓFi|; R ,H-r>?NC xU# ԖÒz 7}|>"EqU2$ŬS6"b`J*O-c#U+[Tb<ZfxTɱjm]s?rY oc nvVW jd^z6o91U\5:)|V@>12?{WW[T^S$SOIU[j)ɞd}@YnonmilVMy6u$MC jupS%]L-& 5eU%iK"gϹ>FqaٱR~XCL;ӺO^)xߌy6jpFHs*S*\D:}*,_}OB@ 8;i}*hgrW PkGi^%m$ۨO[HMB)o,f`O CZ"` V.#.u Ig-$deyQ$#() jd4c%zHm &AO*_Kw;v .ŔazզT]Tlq.BZ`B 7N4f( x6,T"1 㧍m/c6dG͘G >`py':HY5ϖfJFJe<]rJ $~ESéL:$V~/jrzW;&n׻d9qGcFi!CYtlW[T7uCԧah+F^>hLeQn. Pwk'Jw,ۺÒ~](;#''7z}{;i*`ų;TNs蝧ɉܲ,XIX5.17bΌN~fSS`Ծڧ2_VjߍT)h*:D+oM7tZvu@zǪn<[T_D r!89ڱ3qЃy=yWONe[iL/>"cƖb:W(>G`GRo*u 0Eu5߹T::L6٤d*%thd-ԛC(^sGk!߾:"tJ oP(($: >b:^0 {j񷹟o${$>d!c&J5,p{jM${-kN鼍YIbHt*f-tbP82S};I?3FUR\’>hW׻uc*'M5Z((%m37<^kRSQ-!Zypޅ%҅/iFAq?{*M ZIlt9&mP]G-`1c3{`w_VF=Yj%&@Ag7I|Qjd4Bݓlbrz.Kb]IV)Uj)Xh?_/5b;kz"]#$ELg+m ү#rAuh]q+V.jcNOXb\k5 K-!`qQvKdҾvjOԻ3O&cu(W_?VZ54qګ޲YQWuVsLEZ6)):bq"$ޢÂvP1İxü͘gÛ3RT&~T4NIr򙺋Qx{btd&X.V źh iNBմ(«9A9ڤJˊ|H렶Ns`9,$ '/{+?rѧZ reF&?c1*3eUHݹ]ޒ>J6vuIJ`>:9-* |~۪c1&|ΪMxNn[7:! ܹImoFҌ2븛o(z}&XMܼuhL}V\~/b&G^Rj1ajMjVFlȾԺQNzJqQF1 TypsռJ+[ DZ@1 JT|+T0+Jehշ)֬PN5q|Y/51,npůguPlVAufS^;ڹ)U k;΋v QRςSVyY6ydm {^ }^ eb+V'tyu'gꮃTwﯷ(+b?]ﻜ1.ws&%5cUL8+o Q~۠.w(RŦH54=Fb䊬U=^G5A%-='l9/4f77Ɉ8<ۥ?C[窩ޔб?}Ǎ~3k7e0($s M.I 3{f6_rEk%"@R9Wu `-g(-yTx>X&$?|a̱H{L[>!r d9oS}׻:z9F杼^G9;y<7pҽ0@Qˉ,taǛ[kX͘gR23ef NIrꪁk+uv}D.YwZmO~Iͧ_qh 3%:{I/zs49 jȅջP7I.Et?*$Aqf9恚0*%F1}R ,16$@4ɧ: S¨Kk-wY4UOBpl*PP62FlzsaIZ.;(`r6+n.. aY?rf=}bvaI?A,ܣB<ܬ҂erPεt{za79ZztTTg+^g;LXrH^*UA9ԅd0;גf^=rD L։D`1&ǑSIhYmLwA5.h@Ú'x/'U^&"a*U<)(]_ף/¨x2IΠ@$A5?r&v!G. g0oAi}&ُp+t ,]3egf:]?xg.8N6pܢ$Dwm<u>Z?A€RةNόэ>CQMa7cmc/`Xzj5[>cVmhѵAZBz'*~U,FZZ% _yL^v#~n!CMo*),{ .Kw60*D:AMM Ԟ~Y$'V#ZȠ,aFu19X?H. 
ָ hg~8?TS\~PdDQ0pWߟ~)`&e` "*D_gsp*l:3̘ǟ> 8XQٍtTO'FdIB5qaR?`v /A/ ~s%]*ՅDjsEUƧFżbۺ`b >EoUw䷭ BȾ!9$ݣ/I>8"ӯEk.zoy?[^U93rY ٸYHuo2: b!dRpxQXeHSA*+SBZLldos޷3{;Cbb74P֛Sjj=W?`p9ҧADObzBZObR ]DOzTc9Q)[8(By0L#.8 ikY3oF<R1zD1㎀ RMU@( V[ G{!,.iDJFpw#03ȧ5`~E퀬ez+#M)|$͍AXtxI1siv_+h"VABtwI<6Phnc0\iO)o#[Kutzt\AG*=ܓiRt`2D"{~x\\UکO%R#壀 *Ʒ>[qQ@"TsѰPtHf^/줪DrۓWc/6}+;w%VitoNSXsp熳i[RݛsW]F%:V^ńӮE) G\Cd#-Y'{{Gl>oڵM3={skHV֏Jw^(Yp2N;Ēh+A jFI^fp?̺{&vמO(pZԖ>"%ɢbtUDe&2g#]9\72,ѡJKr:eOG7|.qnpb7q'HrudCl:H.uE:x="ꠇ9oAUroB,tܬn59兏_5__}\n4]N49L}TH{U< +9dR^l(M>ogҶGOb {K8o='V21+qe?jFLb,4wc[8&Gz)1SA'MYu?RC~>gᱭqreI҆wjBj8k,jI:ireמJ~FtMu,j;~Kfv$ab]%}_ݩVb#PEz7ɭ"v4;JkZab$ t  ,$O&540f^Mkq4u;OS~GJJ%[ۣuҁrPisaRah 1~^dk GXF0&1E'\EyG ".V0KBx$7cNCZCo9d!/(3 P>vYfbT3؀O;F:K`dI-][Zve=:8;!(Y@L3@뒅ݿS qB KA)DqGg~tviΰ>u; = ѝkd2qt2XeBS`dG(@(N9;kxKwop) diK2 p#dݖa4s30jp$Ri8p$h B`4C p^")4) Z3&Szl1}?COk)n@A%QF@ P4 :ˈ0`=kgfS+3.I8+{m/:ŽUv|iziZnzh4n|(us\8y8b{y3֜Ge~٘^R)bʃkmUa9WBVq6/8*."#d/:+=a˷rӬ|h>V^r~tZ|w~VŮ^M风 }&ObzyO l.sd 9*L)BP9B'!ZlP;1έH ͅM(h;V>]`gCA)帣f(%i" qp`||%RUi"#(˧`d&5E =@%qܠ)8ɊSVKG9m(&ԅ4o^ S3m76dpgY3qY@~od'FqwjoNf֪hˉRƠcHǢBǚA5F^Zd& ׂzKSn YvʜBrj+5l!ɎUC׎.P(b~yk{@ \x˒ccz }4$nvߧ!ߜF)'6Lq)4;Lʑif5@,5"jv2_:e+TיT`kg8IYiNrvW"G=A%m|}5wEҚ}^ƫpy{DNds]9{Q+2mEt[912uLʸLBݒv̫,2ox\ Au“ͧ˜x),CVOO,aTY3}}'#V| =>-4=oDQs>RSDr !+F,W /JjHYH~bg qg쭔^}j)G.į'$ Dثq #9 ۨa҆k*(8NQ&jRB}Tpa투 %%RgM毘xPHM@X\6MZiZb"0PW=l&QU9 o:*Lw8J_i?_ξC-iE*JC<\,.X7KJ-_bLBG ~`d#.:+dQOqV68hIJo9Oc|u*aY y7\QMp(iNuM,)hpŶt]1@MJi_,B#H$l}<5IK1xh⯳oH+x&)j ~"oiJr^]5!6c6mE-V5!LzZ*)OdlTJ  DkEYγ`MuыP!Jy)QC'г>ש9kpܙ 2BEѠ%ƽ(HT賃i!N( SلakQ'd`=+r?A]f$P\&J\Qf4UFk{(qѢFjTX?vP16jL&י LT? 
'EOZB aGcM"ꐋ,"MƞmfC6Z/s{nmi:mD :'JǾ[j.8+gV<%u [[Nw4n{<*@L+E}k n]pW΢+<<HEk1HȫRϩNMMsats}i(6̖I=9hoOtgEl.޽xH]OMbݯGHdZmtQip5k̥4)?4((g\$i^(k$$$zT&) (JKRin9@%L ~tuvNW] H'd_dXȜ zwW䍦B-vjXЋ,+=L&*Yp o &~bdR^@)NКs~4:y7FQ/ݐ?͝A " 8tB;jd:g؎TLis5Vcrԯa#G.CZS [k=D>rJ&YI1B$$\w+GMѦ]j`X[[>$ڬWhh|&TN7v}C6;hS??Hb+3X}/ 0q$7k%ވܢ~6\^У-7X@ PT ME֧\YGJl++ŌPdoU޽غGf1Fc=ǹҽWimobX;簜ӣk CzHȥ,R0ɀvQ+͟itR-ݾӾٚ5jJ5]t +Y ֠GWDg4!Sʦ8WP՚(J:Қхgsr2aT@ey!(I%aUfa+, V$,+աrȶ:TkBъ| ^6‚*IrS>Q 1."ǹc|_В||F@Qo3j/iRޢMUr`%ofm*{kKdh/GGL,_Ɩ/|,ڂ"+qfRܿ3[_޼iR!4Ez 8i ~v1Ypxa/14œ~.)4Y%Y u4knmd)\OWjQ}P ػˋ뫲S`bsJ=6|h -.aQɡ7$O}qZ .Eɽòڅ(.*18/UT5ɻ8}O߸tA7_I&eb{;L.[D%^_x{3ůe6ahrVoII@/gAZf4VRx;B[H%z9g6eA!Š }Riݫyv WwĂ#spR-RS sEtakW]Qù MP(hBNPi}qS$P=lYؚ&Szm5ғ:%V{SB#? bDqWN3/XZZJr ׈Y Kl_MB>BFP\U JZ_$SQ!z <#H nMUfOj`f|2)1E/'g@p\z iS0?lMH̰Z|T&D*U5T*Ź꯰ y.^s Qhۃ~$[$]a'XaXf( B4W7cwՑ]7cOMM90Kj#/ǫ|[:=E ț-_t jJپKFḁ̊ 3\vz~i'o(]1mѐjrS:}OwOw+'Wfqr2oy#D:ެUԍÏyx?,Bg> zޮ.)iKy LUwK]b5d_$E@P)/Jc^M%RuӾ?{WǍ /Ճ+"/C&8mZg-)F D&\)W"6cKԐ9 p=#r@jULd%;Z~l^2DNt7&pW7zqbVGk<+8-d͡u-+ Ƞ Ȑq"rHuL<)Xǁsd <W?hyiy~pw smqm0Ze}r7E)[OQۂaeV3ұ?6>24rAyRIY'!\^6 L3gg>J2Uz׆\G1p"ŌV?2ћOrh^-z):h7>fqn23i8ktB^TU5,&C$WmFI|hPkgQ H@dN$әclx鱡csXPy]d gJ/V a=F,`Ɯό[^Šcg%Ęg{X,O9>j/R2Njv':W?AS`љnQr@4 !P|E[$].*6}GѴ6S3N5޷Ճ.6(!hHKc.G"44$DI$ 87D* J.B@}$RK)6d>Ғ8ExC.мD.6(/[ޕH^s3^l%Erso.p}{.Ee9|[wRkWFŀIX4H%ۓhI~viobx KJWuS0pƫ?l3L)t5;6*p aёwQ{ޱLW?f'=]m(VA+vu3Hji=<7=C(_y`MA XIϓ"'E>OӗCIr(W&RX`P!JTzcv }cՊe(-q.DZ*|}.zu_ESwvEo[olL^I&dM\wZ\eGVH݌" Ղ-,sH gt)s탉>7Ή%L,Q=-"zC` OQZ9u9JaƄJLz 4;A cK߃|:䖜Vc:O/{שEu慎k{?rDƯwңs?p/Q 5i=49xTWNE9'm. 
Ѫ$s؈{lY`Nnλ7O0/uޠJMLi_MWdQ e㓑 L (I=bm_iRrr@g`g3$J!唰QfԊyM|eƨ1KQ/ pM Q?նmpO0;@* d1nLW7JbrR̅a!d*`rL#쓸B6iWCP^go.( 5(TX#{\ί}grъ ݝ(W>Tr(k,9.AI됕sa9I wm 3\P&b=~iX8W6ހv*躸 C-t;`BGuR݇/G5߹YoG-{> xHRhxud:dwa⥏3jЀ8mS99٥jes7쾬vztZ#u\%aQ5{".>fD$*F)sj_cLgb f霖R]qI]rjK,LNl]^zV;;]{ſĠ\Ej!O]GIY=vi$(f3xhŔ@[T{|X0wtQ 4 6D(ScMvN>ڵ+HsdNĜ.ӺRETΕ&(.lI2R7Qk"$ ϗcҿ_*;{@fxp֣@lvvl/jv>4u/M!>KcԮmy髂|ts{=~ [`j+ˬְ fhuF5`ϷUebꝌC6CPN͐q5-=*k`_n2nm(PO$\Ck5*o60%$z,;UоDgTjZ QmC JH / 0b<2S JvkےJ+,Ai NJvRBtCnS_l: oTj98fc A f6j_w5' 0iA^m5Wz>;J f90=#n8Y> R xHY2 ֳ :!MT+zHFA` +p"01*f5']k0\3.KYQ:VO0LAd- fXZy bt?-aEfTn=ڬ9[ 2*W. J% JL2HK~*_?{ӣY``'pn4TI3OV[zNN8TBz4+; BX*PYHYcR$6鬈Vq11!g"N%jInYV`f4vbI9\RRyձF|[qt2+9/(EKDuǝ![><Q虍$W'uL1Ŏ fb)oڿGjvso;8; } h+\o3֮<~[?mW*ٶR~L[e5'"2uZrڹ=sZ;-tCd NDP!ٴP}0'(xg CM9Hq"甼f&2 fH9Ci;!:ZCqf--Lj5S/D3`^O5[ $Hq!夎CN܁K؝mw@EA/n]//ޒWW(_)F59#LQ:F eɐ% y >|sk6܃̷]8? J JQ|yɸd(y$GDQa`N)Tœξ2k:^2ݧW\errNңmg˱\,͝'7Ս$P ܰfzВZ2TZd7+n%vR2 )i6(H5u!LK3cB)GZGW M"«΁p!ͷ2K]:g^k64z)$9B{=y ZRFU YJ, ^ec"ܒ\XHZSmPj˟w~1FE( ohH.ޓ* 4qb H=!x} }Cn*>g@s;2}='X:}v,c\,f zep bpVk!/KCDF{<uHKNR앖X Ix"P F%$ÕH T<*)P!1 LKe 1z%!l6!;It'29 L=L2=dh L3eP-ܒȉa{ brfǭ 4"f\e%C)f m(y wQ)K# L>SԞ}ɔ#2iL uf^i 9P(#OZC 䂾%ij-ۀ1p# a4 XBAw˘1zx;}NޓW< Q,)K^ٗ>ˣ4hGȖfԺFnujI:YUqEbk6OT|3y9LnG{0R_y©{z8,X–#SruWElq3ԧqz`Gp:@eb˺Ao*ӔzQ%onn^Xt`VDg ]%S.OE_ehfCJ&9lnP.6h Vٮ7,n>p!zҐ:.'Jڶ0<.KPu~Û A<ѨlЃ7XJq:]r@ H=ꁓI>⌒ZH[ Įϣj-#ޮ(] W/Oo]M~/^[|뫿#S5ku'KUs5{\?.޼S4~uXܿ5Jju?z}ja6QeQmNDž/Z_o5yY_՟s:t=Xyj(t K+,)n:sUGMQL N)\uކaOS)4CߋNVcu'l6C'φ5B#RhWؙOaqoڟ`;2θV#Sg%dФ|7hPRߍ:"~}ꈯuVc2{_Ôc95^RWϙG?y5ˎIwFLj#F6ݦܩZ)mK-1]Xw3P}5sj8Գoa޹tHuH_3VMtQu;QkR.s'8)MR y#q?##cѾ[T atGfR_}ʍMS%|r~= Y.jml8RKߍ) %RPɅzV+fJ'1n@*֖k;H>&5AFՙ3+Ϙ+Ţ!Tc%5"c%bLSw(g3I*PVwЈ4LZTTOR:]sl35z.H:"+ZulXT ÝHEO[1:Oa<$Z!PÝDI dt֛F}sӿQ:,U G ^fZ0^XZB Q'hv>Lh6 ėz j}PCT}G'}Hɵ[8ÿwK]a~z 6#~6\ *kT@=P^S}翻ޤp:RO|('Z,N&v84*sPm͙ȩ,F_Se^ /utE0xZqBGt-lX8{Hɦ&?ўE{\M;ᓖ:^.y!kLk͇z({g2O HOTh'^,HE-rg J?u: $@{.9X9g.H.\o"dD3aJd܏Rr%TzDQFzPo!D>xii![s" Y&و2.igac ]F:tԢA8J<.BiSf ΁!(MQܶ +ƷYgjtSx&M1k.݄NM(Xd" P Ƅd3FE&e>MБ ujJ Zx!AD5(J]7 a5FYMfYTַd`Vse ԧb'KI zRRwtP*U`X 
@u6L(=:XK3Df)/]P:t@ ѩ GBJ7hņ;m~jW5|jߗzi<zF3"! ^gwGY~{p?I4P{'u\$vIkzM4-=j>AOun[4A9:9zD5JDO8jgIC=wX_ax؞.Locl$X\gzo1hHPLm_6ZpatnR 1 :. tйHPc]k sA>#qܜԧ}v.#848k]'ֶ$MH,lfm-^U tq2EK,.,Zdz{VX h@(@n,%@5# A(2.|Oi^Dw't:6nN]US 31V]"7~u|; a@Qjw7#90k`Gn7Yf(DPP}WyKcoJIi i)8bОOR RJC$+W|K; $oL%%slb@ ޥ!?鎇C(PpuSW=m8hY<Q:{ >*Te s%7+^X@! *i.95Â/!ͺ!אhxPPZyC(ԚKm:\_r|Ҳ V( 邢jFD]1:`CW{<%a5n;k8Q R>PBV4Цm@žq;DGBʆf"Jm<𦧒ÓYŌLt-MɄ &'Np"cBe?,l$&<Cx^C2=#OyDX\N ^YmJ,3J!$u#)8TEo/F*7WRAh i⠐fj$N)JJ+G\H(2 Zt}"Ckm13k{L97㨸&VBhcjEQ2v{0T}:i/ځ:Nd;9qzHkWӗpg;@*㓳M3/Pbqe.<'\ ڃ/ef7)pz jDB {w48q>{E|J9pZ)CNpH-{)Q[L'Ծ .ѵ)w@#JPpr p`ӌ뾧0ZV6YN  R|x:Cu  e:^ҙa\!,Jgwt=tb-rs#2[4ZeOdgLt-50X|DjC{<PpPxH{rco|xNA_}E rӂA0SSaAɱpj!x/vC^F"dX3QA`eQbajaDZr,.>|̧]D@ti[6$2Z;vC'xkGtA؋[Է[$B~XDv} TF `_jXpWxSQ+@Z{)R&V>'R|%~@t=QzB -*FgzI_A23ejG-4Y \)lд !3 Rr Έ'\8~ء/*gc#zH*_,G':s;Juws:6!BRbj4f_;AJ ) g ;#όqB63!d2t8M/kzwz45vXtu>xH:g'Y\ud2<3Ɗ[)&[aN (կQcr.!Ntr`}>87b0|⶧M`U,y$~@qТނa>>(%y0pŗ/tBjmH}9}6(qohLmy4/:"O]#/!f78VH- }Γڽꚦo8?K%Ҋ__ACm+z'Ci kgU88ɕF1K1VC h1 X뤰Q`y@ C6I.1"t׬0,\_g4 (S^"Uz*HS)1gƸ`hB+6ԑ3k_l ^w.Wvޮ_O:1A@=A7&ղ)Ԝ};ӿ,q7?kqH-z4ĽEW#e yL|6@Bhɯܜgs}|Lo f??vnEO b6)*N'JՓOp#M`;S{]5~τIJO4-{~j#5{=?s,1JEX(Xi Vpc*! kRwݷ[S&kKy[sx=.%֬(J~{DTus;.){u\ޥ[ے(|."%R T-O&!ݽ% '|\svTH^Sɘ);s03L%#!sM]19IcL!M(m]u4mj6,2xvo͜9 '拦ּQZ ЇcNlAYP( oOF@ghн){=Ph M9' J xk ^֮Rc%绯oSзA?CD0 G+^]m0Tج- KTfkݎb6xOS)lFay>*6/Y%l3ώY"όq;>5fgxH GI<$D л>HeRsnzr*d7M4L z}Ȁf>=6v7ldTPԴ4\hзH\r!6q˂5XsR"JJŔdٜb.otY`9SsWhhT74 ?P5GUQ8iD!%bЯGwaB'CbHESi_Khh7ha. 
JZMl yĀWE|[VуTCB/уO??fwQx,Ż+<" QRRCā.;VyYlYe@Jf)K([̄6ܚ굅G> %fLڻtzz)nb65Q,o1RU;r9  ԂG|aZ>T݈Yj) UB~e?s?2zv3svKwU\&} bRP#0V@V5\G:f@V 1z=@E&)eR KwG=𱯴Gw%G# 7YQa g%T)7~x!LHCV٩>=YEM9# AHFD`%ZjKJ,g丣2Dl`ڰY0 qX)9hJB.gw^.WZˏ{eR:Fʒ\C[m PZXL-^>Ҙ FlIܧM8mۓYvl?Xq߁za}@N''/;@ hr85s\Ս]bgj 'ק"@t%2ު!/S3;Ejdtxd#sԎq՘kH:8 S׮0@EN* m12kGm3I(Z:xi3J $dI \RUcywAIѠhPR4(ij6WZޒ2ΊCi6.v*pm³e*'hF(Xc2@mɠ `L{&-EdbU "Ij@#6ìPXS^2u!\Y:aF] }3Ȼiu mU4)XDcEXmDA> ,Q_~ a>#8lnRrrR?ZGp)d]K9C'h&`D)맼O"rQk`4r E⠼҂CC(D5@:FR-md,M)aHwӿ?{i_xw۞8.wep@(Kod9q1RcjeT YBJʔ7EcbHpѝ+N4gkM,N}6RZUq޷{d (@|ƳY]sIBwq$Ą>J9NED*!{jA1p;>+굠b|X.?4{DjIڧR twz5#_Yh,FgXy}mp:v\\rK.,cxщJ1a5n3 H \yvmY*Xyc|T2w9g%]}h!ݲ5t6;(lVC{`;٧o\~[~M߾eS+ٻd ~qH2 ;- j~o؆P[}lN8~s^]]}U]7Jt2W{|5vgͻs&He{g_O83g Isd`U0Wu]M¸l8eiDFXdDr;=T6xo\xc[SQhc _kF='"2Hj> :v͵vއ<dz.隢2 lT[u"p 5*r`})`mU$LA+^uzMHiGf JJi,ɨ%Ih?TL! WMy>x6(L7nr`mxS+(Wӹ^XQU-3}Z.Z\ g)ZY*- }c3í8͟(jh nN}_R1.?'zRzlh 0-%03 H ˛zǩqF@!M8%peHGTL}Ha&%0 RY3R Ť\B7_F3%fVK6S`24HӨΞBTmoZe E "G;71Mqu})}Ly&V>)?'i NNi~H+Dٲh@<1HnMsy֨gG?Kϳ*wx6xqRvx)9&|#),` ~m=cXjR)Iu `]R8p@\SuqA*$'D3>B cL! 
[qUpN=t%K~NY?OlOb_Ez-WX,NOF-t3x Jd FCscU E0#M0 ӌJҔXb1Ʊ8ĈGB`Ĉ8p"z)Y9u 8֩nepa)]bj8N r\&@T%2*TB5z0p*HBr.,$i}M[e* 쥴TFk`ҍ'Jc"+n.2RbFH 2kFi WZP7NCtD073+ ;RLΙu %f=D46k?FhB3Un0C &)VEQE‘H*ԦlUe`x@ %B-ƔJՒ2 ew&H] V0E5y Eac,MAVYb| D 8xw1˜TbI5T jujz ]OqYQ O:@ ArAĜ5KիTx7Wh6nZVTh%td\%WbVΊ T1N\ ~ K PFTMv)!:fjsc*V 90xbU%qAeg$3WЌinˏuNyLϷ;mOhmTe֋Ts<LJcj_Ҫbct Uݿg2NT{b5) v>HtT/1H؀V!j!0/q 1v( ߞCJk:i4:"i$[w"SHy^||.F0z#$FhwZ65`q;Ng;:wj;-t?J&}qugy?Mz_)/8{띲VAO5덝˃WאywWo^߽`/ۛ7~;&F7|A^=~W}9 ̯nQ|VkѺuw;w{׮}VۺrUЯ[{-8>zW_wNisYI6 [_MӹLgƃo=v_xǼhi7)N夑N:|;d?Bl\zSlt^)#*]̖W nir+56E:N~n|C0 OJkҩ \'ZX^ۻw^=|oF^vuVWyNu]~+yQ)w/߶'xHzG/oYMeh˷n w`[govC70ޞZxwpND>넻λ<$ߦ I˭C"vP]F(ƿ/Ο~WnOR 4uwY{_ˇnH!zE4 6ހiza%߿E;s zMDWGgͽ0"^GV_7,}/Syߔ5/:ڟׯY<+xyLm/7ƺi΋dغAXg f_Oc5A_}ebםY_N.} uW7w.'t༲!rBckh'x~4(odmkz9;GqcǴ덼>`2JHQ*IүKнlrpVS݆~o:UE~ >*L*/ 71&5R-ƭ6|\߿,^D-Xyj;VFQa@ TpUG$vs#UHҊidt '1x\#B8+An&Lo3$z;6Lo3vqv: njkAp`Opg,1|Ã*UQBD´&Z3JxEx\j<}h#Q9OD'gqގZ_j}"?i&8li.`?ÇJ=]X\{QUc TC \܋ڬڴ *Je+&4]/'Ůsƨ^|(`1YHpM9҆iSˈħ me*륖`5fM@CKÝ6XηEl% L/&>Gs/B/B/,.0e%牑*/0hTÙO60uR3_b]z WUJ*-ׇŞ:ָ$SCdں8XCƃr#Š"d1G `NfnU"l6f3] (0imque"U,)>X:ՎyY\4>7?MXCyZ0텎sP -m@vj*U˘&fw B`(pkh#((C 0V ȢVD+, B撡ogy (3Mb@ bcPf@e@4Q&asЮ?'zr`3L[\B!1O 9 Nx6*(|v-B .<[y3_K q=ەah%f+~q)֎8n 8 r#M*K"HJG̈́!0\rD}wIϤ{}w}+;8s3Fiov`<>HWW_%X"V1PBEحB+)X8ڇʀUxEp y֫WRbz:cTOP 4x( Vk4V;ŭtT1A8<H`N涙n=26sϬvqvLLLeŧN3rp](u83/3v HY4ǁ{TԮ@@+WWTUrFIxrI VERyJ0(Xbִ.@Qeٻ8ndW}YXU$C8&'nxW#Ҩ%3Qb7)>i<ulV~R|oVnVn<*:;\ѹL  ]6w:vJEa3R=u"OE^5z~GRZr.r!"IgP0f)TCL!UJd>bzg-Ɬt%ꞭꞭ{n¿͙J;#b!Ng&h9tl ^cU!~g!= #My>t1UJϕs?`VGD#8x˃\,x@&nϓ.m{WE[TjPZM$j"VW&WDZv]*OVm>*mb[*>\b 3jֺ.~5ZyD5niM{B=DFzoDt2d ĘXW_4E+ً`(zDb/Pt{.oD n[97VnVnV~r!YPcdsԃ%UmcrI.@ ծ$MS8$z)׿{ zN')tP.IȺyS2IGJ>S1drrS'(zX_) V1%!5n9=v3-07_MTl'moyet',Wzx$z&*}Ͼyg_Jk O\d/e?Wޟע/zwxgϿυhzZ r8 \H5IrdKcp{/zB}fV-`FS}uC~3_MyǕjO:>^mwIK7N?[-7073Y/s{C&髃.SFu;שh}IeH 2Z_ߏFb~i߉/kGeY.mqDܮk-ރL"5̣0IW} K/ᆲ/͏dď/ă_OW|f{Gnpǒ_c"Sp{>k).).]U20)~z4"[ܲ+Xd@`#f ĬF%9$.fc -" &W*&m>=T ڝ<^[`ŀC$uID>΀&ǨJ筄e7h2NUb E}u̘5E FCbDDrxhR^O :#:`Mەɡ^ @t_Yq磒㓄Z[2› B\ V'x,9dMcD%ByuI9JpYw)RkN1ɠ; R>⌄IPt,Y\̃$b P[ (* 
Jj*/j%[E(*qZϜα,c:EZu`Ft$ w;8qp*ѥNn1Ivs" ~P(3"dm*"HtE* Bn${%\ 6]?lrZ9yɴ>r_b~yyz,x1ۥd׳"c Mώzg^_?,٫ϟ/.?m a^dO6>G ';IyWVEON/@|~#ү%MsZ Zm~89r<t_{m*]dڳz%^ỳDjjO+zCw=3"Ky֢xy%Dֈz`# 0"SðW{IhðlǛ5:1Y$ndG̷`#2,h57,J8̭Bg zAJ-haX^wQ$cX ; ,kC4#.ݨ~%<녶ߎV-R-J&djbt;"\+f3nW!؅Ayaݡ6%AMj\Ni`rJ$au8/tέ5(6~FԪBqRظAŃqYk4E Me>TBp6 rɄ 0*^Fe,grBؗʒZ 8(tY+tK7KА SA(yD(Pr 6< QhPG="`*WDuc,AFQ% hWjCEe# Vr;!pw^˧Qjo)&0tv㈄$Sqds͐:`A ؒJ!e&s7ڢ"Ok/XIRjF‹ȟc??0 H8C,Pa`^wQV9bS.{~̔sm*vnbg8N)@dH^vt ]^Z&IM| %Wo8b1,I1Z*x!뭴i XN6<(zQO-[\s#` nwЍcKD`$V/;|5/2+E$)вʺ7P޴rz8 I)mEI\AXpҏ6݊7S0gagQgfLAi kӷ~Mݝ09+1ywvW;R*v.U1/n_5Ꝇ3cBr]}hw4u碥K 4AK/_ͅlF/G;'pۼʥ:ogqS9;?:kaꢿt7ta?tn'{:hLwv:Io/IPN<ηe%a.L;BFezuimihsW}:IrٞC+/\*6J|m /FnuBv@9S6X#cYT_}󮙀G"hٹva۰<lקFN ?q*W/x&t%rL/[dz"^^L aIb"sp [#cdOx]W½]ο[obo{s[MŋV΋QB Zgte>sSv~8|1D 8,7Noy.j9Ɗ +YVԳX1j0R B4$B6dy%O=Z[ac)~H9)/v|>2%2.6QwIc|sժP@#HHqWnCIkJXr"Y 6a]p&tr"e|av>iL%FH %9'5u桤jȠqmHSK EƢ1D4ˏ8q8p'QO:,)> 5zc*kZ@J1)(T*]hf<HQz>y%Ay-qX$LBk9yvA5,tX0S 6U\A+,SuR\rԶER)h//k/xlbVPY3Bjcf)k+cfQ Um;;& Feܓ*n*e5ܚ#Al rZbE9!oȤoF~l?!X.QGm*\iݰYm-V* v = t!Xr3!T@R17*=hF ҺZ[*ډꛙ}^OA[v䅣a)=cŴ)hP" MAuEKj;`(ԋRD- By+:yudJK=r#\\q2 7n#Ȕ{ r "SrUĆA& σ dZ-K4*4&*yܢ !s[1p,mV!%S|T%Kj[ԋLi?{OGܯz [#ya`^@JEr=ԱwDS`V\Wi ЪTSPndVRj-J_M,ˀ6eMP?.&"`es<.6ʏٹGPCҜf6X*[|Ê qW46_0׽ fKTcS-/`}j<%ArsNm=? 7\a § ޫٗeKh <YhYwH'.F~L,_1m .b8y'p6y`*$jf;]1?uׁѽPti]zDmޔt㦬ֳYPmZT°ו%o[Ds^l:QE,TPaQ&EMFo'^EM()SM[Rw1ygpǙC!\BTUZ[4h oWAr q8T.9 0V֛ғ Q4uEZۨ+VH*rު*f'x@{(z%x^|?uϯ288zsEtnϨ}y_{*]vSg%;+Q@wQ:lاygj5Zz{ ]ov&NPzLT:<לrJ?]>2I:oo+P:T1 ɿrf0[?s8Y(Q<[ͼN$NS/_3g|ϖ/5o%xuf6#OՃ*zuvQ*u"?q;)^|n?<>|L)SDy֢͗mewF\Q+2+]R%"Uk RZ'Jo6nWFq CUx9v2kBG{Dbc֏h; Ø>)l% ?ޚBYI@]XmX7Zs(1VJSg|c~bƤ0r;J(Pf"&d'=\4iu P>uK=po2di[@ ^VWv_Ւ9{MbxΚ Y 3O-Ob͚o7H[ݥSKj"ʫGz(+\U?RO^xHzGzv)-ehѢ'*P%x?} }3tg; rnW/uE@ۚ~ d$VLѢV1y(]K~S ^׏,V ˞h?VbT !CRV ց;{DzKkE>^T3Ȭf²` Ωqs;jwd@,?lgou\/u xUż0UnpE|GRqZo+vpy7^(XW; ~|`F˚7C~|:QG,*)]tPG)P+Sk]}|KIWC7G_̕G)3c-l~~%NWVHƕKQZސ+&Շ`v5v2dLCOn(=e#ސZ8jcUWWΌDj7O¬`#Su}?yO``D>,ez6!NWV ocQ U[Wh_pQB2!@|s{>O_̯}A. 
Y7cad>rk&8hmE讅Z;4o iao@?Z/ncq>p,~Xs;E\܎~(> ǪP]~(ơ(s`@Տfz1Hq^3ŏ^zz+D;^?(A fI͵Tצo9?fvZ-M,=U;G竓f5?y?ٯy/I ס<89ӓίmOOO OӌZ~Op(zuGL~m8/_~Ex͍yNK:z_#~{> %BBJwWeX?/ƸUǬB;h']<8n}7Ӯ ;! ^mdR{%}&Mݬ`}P6jrPH759|{Cg{ ]g{i;XcEY@:^-1%YLNLQNL/Cӥ؟ 8d^ j)-'ߔw#6DLVh=Y\|ʜ07,Ŵ$kh~W7L1f.AI- 'ѧs)v*X[BEo Miey.c{¼,#eƧH~0\]cj ߋ"킋 r>Jo\=Yd3NȀ ~׼bkuwFj'$e ]nWcŬ=58F+[VJ,"Ɗbݔ BPWEyKbVk괴\$ڥ6)ʔI!hy9dΩYNg2%Li(3DЃ>nƿ=iY3OOr؏,.{"7ܟv#sLvqǤW =ě1^'g"$n'-JWL$+焊{cWHYٕ-ed3,䕍NJM-YEoS$q[j\B)e4o7+ZNuA2TA4@ImzLmiMVLi-%Z= OF^lowr*ȉ6ӛ"O41Vb̊T F@8d䧄UQBf:QT+_H_nE1g' S7d&ڷ!~[H'&d ̔SZ$W4YMABYEzɍ$y$IHДQ+ץ7aijpə0LKjz˜(ua8py%28.t?(%5['d\7#.C2sBCA=GtrRZև-Kw9U)ZQ՚jz]fkSn7Oy-`͑(;A) pn/_g_[!sRկV$T"#9Mxk8Xp;,Wb}՘FPPB &#e>eMS.5\ԑdztLŭع3N\\t/U2ڠ0L^]ax5gC]/%8P ߣj&]J,Қvrû as!izkYrq4ޔ3k%3"2(9 GL؂e7AdEO<">cZ w% Ɇ5NRI4'zy ڊG;~v5Ka+(VZ,rrY,7?߽HVf7%@ ,={oRrg7jKǹ}|og2ҺoyfK: LgA>'O `2PoM#9 .)kGa߻}7\/p>0Mv5Eù%FHOBts[tvs9)^1c m'ލYq(#ʬȓ/3` "H`UN>>mȐFn){UqxeǹֆDdc}/f5!j-fWܢ+H;D䀴n "LxÒ.Ų㎩R@]a֍naJ Ɠr~t"ry\7\-wI67ˤ:އt EwuG50 J!w}(ưPN `Muy޸`[j*\^#z{tog챂t9#Y1$e('\}$I jd**x4@<% i☧HT LhI^Ra* d[ F~xsgn{6цl?e{im4yC#zSb3#M89r!(DؓS?uh(ң5%ƎaDzH#&_wmĻ k^ABzNˬz ?$!~dF==wZ7ka_CF[{Z譎G4?e|~ќK'PaDPW&9R CNB:ǝbgLCNw\nr` Z$!a7!e!1FPg1 <ӎېo 3 5I㺭X/]1`-c6fa_N;<!6b,+k+@Ո𤣬04!!v`0U4mYF45awly!.|ѰކD3}WĈbk(};z߀r>XeRUZ2ETmS~ϊR ޾rh1o#mUGk\WPWqt%P3hĈ[L q)"8XPe5Z~L5^hNz_T?|_~76j(oՇ?O,'>??%w]u?~e7 5F ݻh #|3@j'/]?NP41:lt oSM6nT=}lxHFY0ˇ^U~gZ5+o(1 0&)H0QLҢI {瑋㲌` qH6&eSG Ş5ԄzhA$TqA0 ǣq ~'A?xALj"],ۜ;Kt4w9]Ϩ%Q@ê.!FG?pX(YFy>:VPSJG'w+ R8b^E? uw[Vx Oi.sPoL};%%!}D뾺ưhX 63KBr<Tw9Dp8 %t 19=M?̎.Wٮs%>z]4ӲLId H͕ s^,RJsM!{eii׹,hAQUpԽNoE?M34YЪ1.IՇuW#&X^/fV0>~i̊$KF"cs#>j&G*S)hbk&)U4B0̲RLL󬒅FD4DBҌYf+tm_%,#|-2S']z7~. 
0ѽ=&Xf5'˫穽OywǟwH K/w};v3]^bٌ&Lͯg~a$&HMT o?ڣf>]']vd:@b_l>ӢtzF .LfJ/[+$!jJE=^[ T,j*ڐVCQ"H)QI\w4<7# #i 16EʌL5鐋meWunbb87R\` !U)%IJHʂ̤2͘-6TD8W qCkpM8Ѱh}fnFt<aA\K>wm_b5CnvFhOK$5$d+{$GIfwCrj N+)vx[P-!|rw7A7tw5ԧ =$Q@a,4N{kA'T>gn %ۑm& j৘ʖogة-0uZ˝1fz1{KQ ˙_xw7I1QzFZo;0(u2#Lz[k% Mև NY( цHBIy0׆]AaN-XJ-O%^|?QlM5[~j u%| 1YJZY"Qf?G*i6w}x3ҊsYC︥b}U(r\HxVBNs{H^3K+TlJ8 _ѷ[~b!=\Y4?8]">~d)ڏ^d86q: F_=hrNuwPA.fIewLJ *ڿ?ǣqw[a0>w-zP|<Lt8f<ɖn|I)1<# g& M7"\_Ɠ8@6Rӓã! e1!$`Q5_g1,^Q2X߆?@FS(yMq%܅ٟBjON~O=~_mۃ…n]tW]B}u%R<< ]ӕp%=)2!.J ՕSv~W#t?,arߕ]6W?b?I~]\eQڗxv ]{\9e:|Kc,fFq~ o=2FTz|wޛ\?{zs_oҷ,U. k *W|yRl8?'i<,t2OgHgG {V2x/#q2. *aQUR.?0T3: 03ݐU~cQ@u9Io֋^W7(J$XV>Gg2C~l?O?qsr><)ǒw+7̧^]=`$>V[h)X b(x7~8ȵm?.<' qʷ~<q?|(~DZ'0Er@*~{:|?  KcBd!R/.·廯& vY سׯo Kag ,_eeu,r@ UP(,]ѐ-yK7(h/#;GblY TjdcQB_=Da8+_Ax6<#7i#A!.E}r7/~JEbH /ezw~HO?ɬ=NێuO:D[˫p6x8˨Q̆%NiLX qtJIϼLxM ji.;ttdpp*,:̱( کTՏ`n1@SdZkA#I|_ c\Ц~4-~4-JJf";{g8x!uxѼ+zۀ -澛7C2W$췃s"x rs= :p{#oP^CyUtal;%u` \Fn `(/X.N,8E_$ܑO4Lus9͛M9]WXb"FHߚLaa8y0eD( u2 W>P X=]g7anD|{1$[j% Tw 7BkʘC^' 7)3z\ U=EP_۶wK[]6snlgyX%r^>/x:6$_]D_dfsf !Z/GjrZž暆buDH0pB٨)Z\HjDŽ43@k4bŴuxnO2 KT NjKL#xxR; Al|;BP-@I{LN`'Z&sB$ƝQ<t cnQ]Źg ?}׎?e6Gu_>ɯ_ Kv!$K_g(&]Yn *gSaqa4`]7ڐriP_v/4po){%#u}ҏ6o^n;YVxDqN-&YM/p~urD&ls9ଢ଼I7Ndz|ٞ["stZ# Vk HՍ6kݬMԍܥn-׍ܻtLie,;yqبi]=G81fHʱ22;sZ(EÀGhZP>~o`9[ԏY;$:XsL;.ډGLILT%XF{X @h%G.(0 ?Rv$'LӂiGJhʗ}"Sv8,26%1R#0 pEXpɩrXk4(KVD#dY[;H;@;1jg)ۡDZi;0~ؒvNI0i,l8s`g{Eʉ "4PU5dXIw a7Q$Pm1( cQP& ƌܝSk|/zy?heU}>> ƛeKNA掣چ8!<")JWp9sAI V,ʇ0*pͰ6eE.:IjUj"px7ݱeeΚ(ʜU9kȚF٪Y9De"DdMO#(z#5\u\\\\s]KMpU,_.V8zPaV ) QK!ST:hL-c`L#$L n|Q"ʈƞE ٜnLQ(mTu kJP@&biI958)'g[aZeޑjAW/N9#0DRh3EaxwRJ3b5?V|Q5r:@^a嗕H Fv)a?~P'{K (s;xPO~M&ڵTbnUjNwoUdT"&4{ 5ԝhpXG:C~uƂ&OXL9+kvZkY0&qf?m!h[j̷\@6G !͵CTjw^k M@r2ē!\sؐYkLT$B$!Xo[TX:U(l.=I(51gb(|&5jGfclɱd@Ƿ)83TGHVtsF2@Ҁ** c)崹l2>Hnvn2zeWj>wDT^S:j{Inq8\ f$f6X¬,$+ G1eF EmL-mԴ`}.ڭ+cPtYNX^i56׫ S:MT*.Zar̎֌dž0f֜ 1tNX+F5q{4鵐*/e{te)EPYD06+]-D=DG}P ;VPu 6~~}cMɵ!vk}<Tƕm.&E@"P図orry{q~N^%竉dzI٫N֯믣_>|~`2"O;nRn7@nw>‹Ot_yzsv@˃o)<6i&Y!U*b i7 
Fo:|'rE9;j6tQȿ-AI#̯}PmXnour5^8!jK-p{1A#76jokXU歟b**Q"L.+ o r3SιThw%sq˕IsA%B_& k)se&&>rT,*$lY_}M/D1g|sQShr@)k#'4qK Y{R}j|\Y&#g"(@O!Du^V$FA :<]4ZRNRalNX_#"lyƱ+ "STCf6+ǩ P4ֺUAM*}!5y+P<ڲ3'״V(ۗVN~ gO2Tuy}\p]b<4ޔ~p'S|$4Aw0B JE7yG*5oVJG,Y覸 j r9Xo?}ѫq(G?I|ܸVoSu:.\̈eFjj G$ Aw:w 3NoIȇſOW++?/iR9Rws 1u8_|O!F^yahz />JB؛^9@$o9.)ol^]V+a4ikjWWo߾VỷooaE\N'sv{SƧ鐗T?g/2˚4b5)^ڬf=qqE9ƈ (0H X*)zie0ՍW|?o P=oyUM9ϓ4 \m;k_maUE֍]J_~-M^mH *)Ŕs-DwWװB\@U\5zxfYvNËhrtQ mf˯ސAvub{{ϳ)$ JB VI u&"fshe>{9w:ͭ޺1p0[N4zF 43x 'h )@[[6v^'2S" BePmnvݔ%Q7r 9RSGjx[^vk)a^mjL2KtWxĩ³52s+J9R 8^k_~tZ!p aG:%]ph}~壗jqO# kvbwA 4,Eëc+\_5\uTt%"*j`emPVV[u؈JD\VJ(Sv@>0[}nkn i_$Lq~Ņw׿4sA&u)Yj?+{wuw.!:Ļf~>T`sV'0ΉDx?P`EcPf7* gdQFFqqיxnjk$Ct1G&,枳k6O6w/gN_g, ~}B#SцK(ׁcD6Dp")POѡoy͵{% p 35bz_wIDC}3E:==~bTOvh #4|RU5!j~3 {3V~#̔)sr3juSЙ6)4qՐ{z}0{/GӺ^t 6W O97PAy,cS(L)fHmf0?86|{\ aD N|AƸv̎k5ī D(Q0LNLn s" h' 5'݌l]8rd(+GBRyޥZ"U5k͍}|f}<>⻇Uڐh{V/Mlf1Og9[%0/h\T`"+送.hxsָFEgyBəT3vz X@(\yX%8q Rջ?{㸭/ܼo>l&AI%ERv؞Lr[e[nSKAӖWEWdUSr0B(hԄ*~9 !0mZ[dLl"ccڀ%B@5%l-Ѽ/142zKmS "[u RUi#7h`3Xxg N, @_ԣo03\ҙ; MK!GO_FD "*?~??WZݾzĚI0mʱK~`rM*7j5*8|8Mr->>-U^UPrN.w1]9/'7#4_P(2COt26VB)DdeynHaYrrs5Jnn3 s/??0^>xSO+,.#l8rq|̇Ys dc:&蝾l'#YeG:'x! 9qcyAd$PʜJԡ^HqUTB0%Յ2=L"kā*8ڠr&ֻkx)Q2ZJIbM$8좤ؔx42zq2';*cs+M\m4  s@3B N;Usu,Qfh5~﬐SqgUQv칂* cQw Cc\ȴġը>GϪ]de3x+o*yDMՆ T*&5/-CsA1먲s2N{e%~ A0fس=rpה{D=rZ By kBF$qT$[8N().0:b7(Fc)0aHL3h Cb n nM2Ä_1hC1.۷Eajc)?'#qMUa'7Fkq`4 mj+xn*FlO [CL v'ڳJ̘dHw57톜/-O?* ś I-vmkS0i1@tM*2ZǞ k~|6.;|jb.W@xLg60x}ȱgM +#kKϳT6f>vxsN2%2 6oymb*U<-W7_̃r6_ϸxEiz1T;I;I5[ I IجLU0.TSutMOD,B^̱ern- *uKMKJyXWnZ]]~-=huMWy1ii2kUOX '?>v(ұ$lұU5&ك! 
@^]?Fgf9ah3HR.>vE&oXI]Ji5?>X9*)[]8m$js툚^dV ޙZK)p7Q쇋.PPBsT7Eq-[|o88#=!˥̬d$`ZЯn mbUކHmU k]jhf3.^Q ԋh(%uT# =j&]fT#AsXN^M eh$Q2O˥,vEܪ9- ΥDb̓Kso^LnuNQ18]-d+[+ 0Ƥ]-)#zeeJ~w#$d萌CLr;DjNd(y?Ics]^m ʛ 4a4\fȳ)+ܟe_OStLӵ˜</ U BϥkeC{W +8&$R_G+8'Qfv.mfDEPCZkquUstaĨQL8ݗ+w1y.&䕻rJ@Q5)\V%pV*Gk}VX + ~{5:н6cF WD})ki@3bC LV,*'h]ԅMR눩 YRd/?͌ ·I#9A.Ye Ж09N; q[p2f,;1j鍑@dNxmvf0e|pXIXW~'\Jc$Rsl-fxaorE BN,#.1WzY⯎ٙysOاI^v<Xg_5Ddѳ# ʜeàLvAYV]6\R-%S=[u+{%w;Q :{!2'ԹzJQgO-e[ssEo% қ荒5(c+I[-5/$t^54V9DZR_ڔ̔ڂ ^XZmZkE,3I^jcCȵΓ@ eAazƀ4 Zxy{-'L+matNEYQ焢WT0% zp zfҤ+e6AE޴Boon6X+J?|xVrƥ`ItK> : A}Fhvt.b 9F/]>wEsW>h&(CJv% 4FKSg@Y)8r\90Qj-NrE'bϣ`ӇOٽEbV/':TqKM{'h1@N֎5@/),`#MAlb=\s11pVHHX.pIn$(DT#eha4BOQEoJY J?hȉw+:aT 'UF', ?Ow?TvĎVQcSf) Jp!ܠm$h<L0?pr o-I/Iq/L։!ߓK̏eX7CBLI_|OYv.|o z}rB@Ѧv2k\ARG9AR!zVpd8j2g_Lis Δtq*gq|YEm4vr㲣p)o@Sq-Yv)mѕxIn-N--??>ГR:Vw5΢F{<}u|=Lĥ`t<&z+I{7`7GV DPFX8<䞷S;.)L]6qm-އD8|S2JE%QWLcKt_ RI RT Ɗii0:%tąᐶ‡b8BI[JZq1Д9=>3,1{!.*x^=y3| 8B99' X]2 {eZy`9Aȍӻ[q5i9ȍ,1v\o4Y\shÔJ%|`fngÑU贓:bJT'ȴ2kZQՉb>!tn2V$a}ast {~0~-PƋ`a.]Y9+B , Q`t=Π-w]T>f1}+u2*[T2E02!,Wbe#͂%]MÃ)c2㽱llxA:nY-7}Lؽl"n]K@ql@ٺmi~өBVVkp#f춂Z? R2 k4G>TNNwA{eV4:͏!a /H/>dp0kTkPSw9l lΔ~}ۘ}:kFoʈ_%V\44F;jc jםVҮG54#5ES\W3pi`2zүV-=rl\|޹pR4ǵll#X>^de6z;\?\8$.+z58(8 Z{mimP8dw8^hclPXTwM"g"[&7t^XH0ӏ?n.-}ήR}JZy|*Jf!6/+{2ֆAqX~./%kRe!)e$bQN%y4_˻c쇟>Fn:zufz 5[ޤ~7x zk׺O;wߨ<|sAZÅ'&'L7cr;b_2jumthzw'[TgBoobǫOUX!(yDvdpOA nKrUJDm3O3?}?r} ]'Oޜԅ~keW2 6t#q@!V:׬RMv{{=ivi^)u|_ C\TRAeU1H2ldLaM=4f2Eb1{d['4L7ۛu?ExWʖr9dn.\H 5eaVb~ @7e)!v=73a_G9N914PmEb7c _e+ 8d96 J,4#׈{ wĸ4i\>=a׵i Gb~ywK@[.Tg\[Ҳզ}ŃDUo;:G {#-ԝ[T1hb[ࠎ퍟Yr8J+r_)XmAlj8k*H"S$x 4:/ݝs(1Xa5tfƽ:u'`BY|{-P`h PxuZ[?6R}Ze+CT$G$Iι ٗLX3•j23\Y\Fk"|G3r_g ?owZӒVk{C%@:u6>XN>&+Xt !ןC,H:HҔ):#VT-$;'YId> AZG-rhY`e,ZqhIPv_n1f,ziup1@Xᓮ$+t2S\_O֒uT,}wV칓H:4-}}'ΚMS$#ֽCkDk Ftfa{JϺ6t"4} o [Pk6o>aЬZAD VzR:Φ祥84 屻e7x:פ1x|wAdfkӳdʬOZ /*`ƲyҞ+p9N'F ITNF˄+*){yt9"LuРPZÒ0/nіa&8hJ$Rq+¢]aM̓IdId[ C~ [ xR_l֊m^=ygC&{+m^Iڲ k v;tiQRr:&!R@es::DzK"#ClR`Lt6B I ؒAg^X%:)ӿVpm8kӶs-ɆxSK}3‡\aImʹ=2,+* =ǚOCR ʣUN9L\yĉ{Μ5;ZZdixi3T]cˊG.Xb,lZΐN&)nҟ7.) 
ȓ6xЩG+$59FG@K=K:N[s#"(@^DF&'bWLv,jwf4ޢ&9 o#|v`Lou+"?(dɵGdPhÉXF8mu {I JJVѴ#`9e&M<(t23 ӜfVkra,DZ1lم= *2/MJsI]>~KmeQ SJ0?C?HSkB5)A9%})=]匍OG{ m$}>Ġ%=/1yA1h=p>"ilY K٠@D"Qln2/Њiu;EeûȘ4 e"b"qSL2D?*r%˚'N "G:A&۲+Y"!}QGoeLA%J$2ORzd\sx 3cgH rʇ AQh` $'fyn}:y,. SƈM+!7Z*LVBVdH愂e: B іXA<>u_'dxG4xX>KGB%)%ө|\cȝ&3ګ6d\]y09o(dβZrq;)ʎQ0=u>lwiu4*|;H=3=ǟ\.zo*|OxkN wݏpV%FwgX?/>+ sySL)azFdcp2tv3~ ̺SH<}{t4ɥP̐p2pٟXy=Պ",7n-O%JўAU&}c܉Jo\~r]9}rS>jGS†5Ffi>Er zY=Td1ۄ>cgY+P[\h"uEKvKyDrS!"4m= ,4DG V/PIi)4Hy)݅2rB#6 Ѣ37 #ߍpK޵X'4 o-4OHyR3rg \0"4˱:qJ>זM#hN,#zDLSYN)Ur}aɰ 9`L3HIUA>ȩIJJH`̶Gx׉(?DN벤lÁ 05PR`֭w _#(*Xtbc~ 5gw9tHɘyDX2T,HQY PeϢNj! Zzf M¾` {i~m^nЈW>y[v;%zo"ZX{k;.ܱ RwmK'N7/`79j\1?owJ)^mTRąiF =ܥ[CĭgyyϝGT=4.{ޭt6:r$A&F2Y*pj}r)Zt30tsްϺ[ϱtN7" yt/8b|B=N=ybj6l_plj"`ӽTY4wdLgo@3%F]xގi/p lXUs2[Kt]([0伞+?[3C' DjdϙwW[-.hkE ,c3dc:,Uzp3k;B)4ڨ-y`|i~G|#0gXÏ:ݏnwdT @HzyvD98瀸pr+"%#B`EQ%Yh`RL.=L&xT^L./'OjIR]pI)URarj݉aXGcbjw?pGFtΉ*9C㥕eKuh)_nB]BK¡) 1#+F(%њ+$(OkiM#P򽿺Kʪ1PZ "2w ,#n*+HEsWV98RT2g+앰[+*=#4Cڑ+$dDNC)Sn+ J!rkE)(&*ʷK%{ w9j†ddbe●1Syd*pf֞gDYo8QU `X`F ޫ&LV`G29mE/N~S-dTbss\sKoUng7Onԣkoӷor^o>;=νJ*r VH.JwWAx!tY9pv Jv[p%yNFHBw5@281ǍT3{:O]\kqSJa(?eʿijtIzR77 SZZ+ r9x,8D II4G@ZܩH:eF-OAuZ=f{YOw:Sk/w2 a ƨ@t@N~DƇ0iMXȂi[Yx+S˥SE"Fc&i486Fѥ " almqTVmf!N#4n#a/~b~]֨ KA竜?BcNO#&錠~U8G ^)Z0a\BzAGq%9sLԨ3sʨ(璽3)rGOWB4Q;;u?[*Hv׫jIR@+,W=Rk==J'^Oȣi^[i_KK-%&{7x0~``F-s<gͨe':A.d T :p@3F= T탁ۜ|mvග ݁@@+_ė^D>CBIN혊c~'p@dS%4ZҠG]IM8yw)ÈM&ZX!KǕ)"CV.pJeՖ`H%ʒWKNsZ _1uҌP$<$ ?t:AN7v76Adz+ʲ5u #B{cѣy{czeQ+w"g+#L2ǃTyFM O8Rt3 v9EE- _gWj+qOX}ɎjgpN`og+Orq G¦D}SmZ7چ܌]5/c|"M;)8hbJwS(%{c =+g2$xP&XoDz+0 &2 LV )=菷wͺC)=K, [B _-B oV9gO69%OޛXӗhh)jBP"T,.B%z%V{MJK F54T,K!+Fǔ+,b 呆/xN('GGt0;cFFHX[?zo䆔e&u7T%mV2T#%pw[vg&p\Xk*SlQeK0QlESLrIW*zm\OYx4o߼yBy<^Ӆ%WJWyȾż˻շo  JoVw 77 :Lӷl50?pa@[kZyk#_@nلP9d[=J^Z Q/:$chHrŕllƔfK"uH:{ @=(Y"NCrIć$W\h䒵u KAUO0t<ɥ\q I+0IS u;܊De@",&VM,~Sr@'"7?UrId(&kɻMAU`%m2]w5!մ\VK۴ ABLDnd1QWN>ż=Z\WcFhQ TPTh3 !YxG9!|<" >v(GEɖ9 $Sd _MZ< ;j~=_~kc1&r#@l,WyiJ#xbIS rT^Iϫ73eoQ  ( u,H`(~*ɟ8^vf)̎U4hu~m+y=ey(ŰE(ب@]+5) ~ɹ)- $&(Z[+B QJX Y4 Ged5d OTz)Nj ֭8~S~oR6XNj.a 
IA]dd#X_Suzmcc 2KU8 "ⅴz1suC 3G|Ȩ:}UȪC˖9$ ق-Jvm = h#-eK ݱ޿_NTFͫh{f"^Xv1dp^Os!'3zSvZ-0F.6\Gpݥ7,fqiﰿwؽ5O߾97yy<(QW۹7O#7OϘPQ2D[/]јK?^Vz Cz1D|ދ!jax9BjNEp-sLr.'12혯׀tHn޶_vZbVz*Xs= QG҅IKv4%fIi5]d3ztb6/c (Jz.˧ XYr (XYB1UD(,%W kG[<޽Y1۝':n9/g߄vO36 8/tXʿ&B\j!,+E}B3 n+EYԞ८,,jG?& p~v\dҸNkr["o.oB΍2dϗ1*x%PГTYbOs'8/@MނiON՜pˎ~ޫ0hoL'NS5?,_//qs 'W~BAB\3mNO.ß[;+Md_qV*Zd͂_q<2' [O-Bs v jc1JQCTꨬKP)sT&(Y&=Oi3}Bw:umw  DO9Y;eD>F}7>mވv+TT0:zz(6t*-Vg>v *$ÁW;+y_LVOޗ(<_k5J][Sŕ0EAgTp܃g>Jgw/u~rO[Rކ7w>ܟN|s}=7:Efq|rq8gOPNo_~t['gmODSy2fI8 9ϝLz~SeOgLv PT`W<,L'm :q )0):VzHڷÞLcͬ KpJ|Aц<|lk*hCHſVD(ŢC)V@q෈/&_o_q εà} r$^Z`a,PzK](95siEcpn(X|(Juָev;8(lwv^B#-Ck%+sN4k@g.K6kޱ̭bQ ^יl=^WHVq\LM5cZuIBd,&HJ6:YbH%Q]6{F9q `D.P`\ 11egp5Tr*H뮗r@2ߠ h?0earԈ=;~W70.5p}r݆pv5S)ŭa+ˆzΣJ}V!{qV13XɳE Gg÷ZN EҪhChU\o)fi6lγٜg9oM.2J2E6ʐrs]O%\ Y52Z 9!1aW"LߗoʸZl*yk"ÛT.m1jzϮ䨦_U[euC/Lӷu0xj;93Z[m?9e˕YӳRpNɜj'%}٘^}>&A=d9nXREFs3n\j6!VS,mZ5"AL'­N˲",zxEh(Ӂ(oʊAOO.n2Q!=D%:zrMO`Wb1o F2i:*sڡl},P]}5CKO&"m(9N")h#jV>k0B rI"DrQQV\t)Y COcu%x'LɤGeGNhK8=1E|r&qWO= iOi`tTЋaZ:5J(mp)8iUDrIC$hwaQB@?ʤr1U,=ønȡ]侺vۛPz*9ϕ %]L0MD;2AelU{UodMCDb}TSnQ_'Evڳ \{2z@ ;NnAmX9J\2&&yC*)n.֯ m$ѫA[Jh,NQB'##n(m4n3CbuoZ׻%wkw( }VQQ$58<N0KG2N")ՕP!468ZuB`/I?$uE/  Œ^Ւw\߭֕~pD->U|*$c )3ShFp60؜ZYRtlV|3]k14џodP `dc0Znԡێ6Kk?9'Ѓ~PBns/qonpg%8W.^;L!2eÝ1P2@5M p+5Xčlǒuj&S̆pEz[ 뻽k7n[LBO!0J&KSG4ߒx⹼:xF~CtB6β^{0 j$^+?G7ok+ Th7 q,#Xq1-Ar.򞵻zP.MGqo7TFG_{58bQjnWuc**㤬G'gHѮWyg<:xu𼭃9 5r"ENF!yh$Q#XC LGEQWPԃ*ZdF<5}ٮeS}>jPACc9vӷxRj2gSֈP1nop'f7VH ߜaMV27^uGc`Dzp`C7uclct!=^S#p-K /9)4jG+bM;M(,3YƂ"E9Q'n/%gamw/PjaKp t8Dpy 0ΠB"̀J:Y'ByF$6C<6BB+0{י#y=ϥ;쳛u3gx$E#ՉF&+ vF"eJhz*gH{6!!tݯ!0lO/eJP\ZC.\.{y״eı5;fOOixЃT#ZF“%ȦŞ9f8H#4ݧuhg/Gq:Q)>U떶kK], \҆.E+piGe-O.(k2[v?lLPpj-յxtr v0T` i+/t&g3GӛAO&6MV3ڥ=1y2ɶxŇU0Q 73bG1޶iL#z3` Θ 0Lr9"#l: OOu1@AD `譇 *_.wTZl0QhnDhiZc..59 ~'\ wz y #ҒXrA2YH!f"wRqPR@`ڹeB)L؀TR( dui(pLfvrhY9Bz&aRT>(>NyL!(Qh\fY_MC gh,f]Nuiiw!JL-*%+w SEW|ڃBPݳ,rԃ`@p" 7ZDg#?>,KMg0(onO3j@,߼&~쮂̃[`l5C㚤x<("jּcyug(0a:mOW$-&ADl{p:wp{%mf094c%44@&RNK-ݞZI3QѩKG˱*tK`#e,8/3rikJE( O9= 1Erq; ͛*3kD~T45L\O`h$HnxgD^  |e<8+*$Qux] 
|];iMNh\xZ0#YflHzJZˬ4'- A>2MF$;4pPCHQJ:Qch5"&;-7{퐣jO0*E)Ctc7{>5$zL-vnYN<6l髈pH 'A7D6@]zu9_ÇL!\˓{BꌜQ.KPty}\Be\/ O%ԗf^9؉:D8y1ʄ+ r3.$I¦3hw}:T GB_)B@HꓦyF1 TɠtV ;}81҇HK,DQ^Ruj2:tH\S%Ї9dÀR02$AyBB1eX'C1;bxx4aXcn+6[m.(らM{ 3Qp^A>{xb@+P{ Nj|x-ˇog->7݅o3*KZ|qt: ,P\eC .q.  ܊P{ Rp#ܰVg1܊>D1$cIB/P}"WŘdl[MҐpf83،6jiT"e)"D_C pp\#&J5Jjq5Zq].ߺ_>ovb[%]B/:ׇ>’ dXR,8 X KD6*gb}R<5. c͉wTvcM˔!G=)ﺗ(yP 4M!+;q"μo7D.YHӈ\ j esf<"ȣaJ{d~Ig>Df:ݰVxRՏ&f!Ye>Ԑ"3QS$6oLZ%eSK+]L60wW数 3dLPhDZye[K-Vyeߎ[mDmxNٖ9Y}S#=-SxSvGvuG'WwTbc'J Dc&]`ΤxkՅ qD`}'Gl3򊋮v <..9X]g8SUYQ]XT&y)ei[1A$m9yL;CۼG)ǛnyNsq-˫mcOzO blp0W+R8{Z!.&碮.k=SRnmaj'xm9C;vD+r>pQUZpʵObU:mk լ,\~ε#S-dʎ*~Qn1JI0u J!k RhM0鹆km1:M7쨯.+͎ʑ*o ߍk{Y6%0D&8Ik#-.PD1Qp)'1BJHǑ<@^:vƽk91eZ'U$6yDJM[ux֧%g0;?0 ޻q nꝷ|*!q[)YntˠMh%!J$^x~g\@y4. ^F@KkJ8n b偍RQ HrJ-r`#8QG\q[:Qu+U"ms_j8c-_uUmaUWUPp#;Ok[ws´tz sHF.4<;MӹSuK1ڇ`&z<||=mi kv}ބx~~|hS 7|Ub%p:GwQd$ %:ZvͨsTM `ekRol5.d0!Ff!f`|k\O[ mSigl[ԠKڔJ 3l~5s_7qTc-P^,"%o΀# *%N DS22gRh.OfbvN"VDn~]] Fv=k"3"mZ8&\6^ ZmwuH=- (@ ڵX2__!P}VjNPPL[t 1spnwWidyтթx)ހDHrb[ ! k'?5R#vc4K}/Tjŕ :2jnwp%*f&4wWc)Ial`{>\D9`~b(mrwY+"cXܵ(}P$ qȈQ8cسL /2ekD XenTnlViu B(J̕`_iK,mbu,ԸgrUSp(D>hqb i2M@-Omj]?޽>( Xi3U̘0"S i*+qDLYv(T6.cJ/0.:_} +ѶH] #U8#C`eʃL ^ҁc]X]N:·%Lt.fI+f"lDb5&+hcALe{ ]'P(Յ'O\O;D "P&YXIYA/ J4Vf6۽֒lj ql%nGѢQ;Tq^p dΈ"c4L2AC 5nykY<l!vb[ڏP.9S4͘sA4c!{KOmQ v'%p],y>~n0etu ćDua"\q(3K vSs&zIud\یcB1ic(5aƬU߹̴LTf7:Ίa$٧|[9KWUɵQǐdʓ_!#J)S ~xW(ڕuт"BЯ-\Y80gkC`3]f$oi`$똧s^Bt)W‚Nm~tg5JkU sߏjT8tT+3#r02"A]3m,,s-Uhn-q-a Wx=M iXуO3T:!Z HK[VއO)=*F6捣.TZ*z{%uf!i(V]o;CMmb:5h1[4Qvs`IƧ[-t:grއ&ӆTJC +CS^/0$ʼnydjZ@p]dS6\jkTӬag|B#ki;!GcwGNĖQ2ߜ[% %L_Ҝ]k;9YO݁G9>USj.;h鉕`z~F78Zh\L(:b@ QNתIK^̺n;]G|rz":QҙϽh`g6:9g2y!+˯8]s_ϟ~x?.z>G_u볋jYT =uZͺ8SBBt]1%{E&%֩(FM038}x:7b 9#'`-|GCR9o\(  T`d^iƜ jķbxU#pKuYP,n"44ps*p3T{܁F%+3QJW0&Y[*2\ALL7~_Lfz0L@0{Q印f\i:İ|R-s)e?PLW +6rE|nC474մ=+J(y6k_o=dd5p$oi/>hpV_p~`\D_|bH ѷ%,3~}0ZyA (x7 <7gZHH圏j~4ZXOûw3(}ӛ F1jj05E54r@imĐz cOhZR賻YHcsk{QdFxk#;',&ž:2aq5asB%L틥 =oNۭvjSpT=% Q*;ӎ`I\%9RXGv\w`*ҹ{vg<K/_-?{\VFH#Eiad=}3}^fY:s젙`0\ f0QS0k Îr͝@tczRn/3.wp{)xgشp6` NoY 5{qӡzDښi)߉].Sh 
GzFTtXE%@_9ՎEc::>HM\~[](zڌ ,J]=M/\Lff9 ޟ 3&AETD2ѾLE/Eƃ, cL 읒"6Rqt0\FNb4koG$޿S~~aF5״/S3b0w0}/ZNc:k,7n|{D`4(g ދ̲S˯ݛ5?7[ci~ۄ~PT^B'{Ȭȟ qgf+Xf)Lh(pYJZ3b$`$N*l%1J*1(Ɍu f)B9-*<;;:4K~ÛYjәKe\*nI7"*JWeng=*W) qwemZ~i[6%LbA/s_ۖIN9bdbK% bQUw+bwy VrzunN0Kbz_t`& ,B.>!gePh YfPFBKxNFI2 Utat|vVp;Nټ]Og\[+5#z̿kxF9grJFDzJʷoZIFs'%s}v}`A=M`իLV3I{Dz\]Jb?aF1C,pT-m߈ET*O˱2+& T f#*~nxo/&V !eѩaWJFAzJ)չԤy,;A#K),k{׻?OsLB<#iKNڔagdS#WJ{Ѻ?vq^xɫB2 F(uvgyA{I=QUP|\z@~2R{BYվi <2 =1`*q^/:ŎAmk )P/)DOQc!\tm)Iػ'H Sp7}SO"H)h¶iƵR)F)Bxou ^S8P I{%fSpoOIuYN:Y-h8ZNy-Pgn)PB>:mhtz~r)A+ ۻQ܏E~\ZymOkYw5s(fwi_zh/G6/f'w#D"4"gÜXfS+,y0AisT.B5fLG.o_g w&0d/^)Pu-RlQ$RTCt|;$U$.O׵{~솫NEhiFG~*mم:DzN Fj_F["ZnWM@ !۔X0c {=ԷDŇO%}K$ŢVA[P[7ٛ3Ոc۝b+R&l"fDj 8%&v)كb^Q  eUwU鎻*qW;wyMĎjZ$ß4"ȸˀv& )6i"\YКТZJ+Dus=E]oe*9X4 J+Nm8XW4C)`geVЂāj)͌ 6ِ6`] *Q4S;Զt^xҜ vmX9`eqqp˃L:5 G,,`<@m zB#:cJp㒉qD_xA;lR{W25HsK1X3ǜ} "2ӭ  " SIw@L>WwuEFĆJʃw2 =h^-RIQ.N24 Ԣ#Xlᛷ zw[;<lv'Lv{ooIR^xN@R4t9ʀk[K̉SKHb \@A 0vFyZbT {k (\bwwW.j:6ڈW"$,f?4<~Z>4-aWޓ#&Mj >ux7wgh\Ckۯoq8Lp}=}Mpf@L$=fJU:ZrC53^Mg>-MlCeB Mf!$]a8knyU<(d G={S#tD\3%U20TJD&1!jlKt7>T pgol6AIrK,EJa>k^2|W)k쪞g״VR8fހ;o^ C:ూIH׎[oegJ_T]+}wL)U:G ,Fr~G+vBK9e۫VCX" *"( 5`P : 6`1YchR0a8}{%*@cF^mbpL/E8r"j%[]xaPk O6(BXs:k d'ĉ6Iŀp5wF`6D(WTD:` h.!dF14rpwIޥSl\S ^ !Y'\)^pz7\X_㜮 ` vnMNus< <&"Ǧ#T "7Dx,[.As$$WDnL 5ƑA2.QGB,FZ)P\bF%ʨf@a1&󍪟56*E=aDJ cyӰBo`%:/ M'S}6*'lA`{ 7x]UF*Ƴ4țϪzYe$EiR*Ǥb1(>8`?Og2.Ss6 ]Giu rP1D貨奘rSW"XRȎ̩X YurQZ68`^9G@C0~G羛No:J-"gN5Yq`j~!eV?lǽ덽0ꆹ-&n<} UWZ=8]k7(j:Zp|?w1[*m_:|[@~տ73Yݚ7?F)lì-0}0C_FIII؅lѽs4^#*RT20w5_X;|c F`^=c0~BH.[@CpH&831>*^=D;ogT+]RnXj8#`8J_ɓ5Yvnmq4sqw<)F8S*\[;_ &:qO|q{Nz|V^xߡmAQej;<yRȅ:VNBX9_Ŕ5hBfG/:Gg /c f{)(z0kx!T#9'_R_@3߶>4sǖwל44goc ˭2jp)[q̊EA>{1ˇ#Q3)0򎣆O[~yclY|?7H[qB퓯GG׊oz1?P`urdoh]WH!!i]EOH8Pn{GZVisZndm#j#a$Cc_0kQPQݧI_4¬5R3I(qR&^\ ڨ.0g1P#!iumr[ Zq. r5 1a=k_ސ]6ݻay]US,{hE~5iMQq08ƪGNeSt>wu4@QxM*ckg똳h4BQQ>ں"ѽ&q*#E疲):ݻy(* * pqzuޮ&DRdA4vruXD[{'Ax [wr-_|:)1 $O&AL)LɩEȼE}25K 7w?p&ӌeo21 O.m̺LdfpŌ/bs"8΂bN *%E ˟TCAcXPD|}#HH>twe=nI1! 
-D\ &|ݛD@.9Prrdt ~cԜh-50)B&8t]ШU<48ˮ9ͩk̎W]Fե0H ;zRBw[a6\}2F8L/ev -"9TtKq⻛h6+_; x>Gc`p~6?I/cpe v:V`y~$&&*Bd RJb`l~b\Wǎ+]w]tQNޛiz* +ؖy..~xn{kf~puG2B~G(msH@C{w:L OyhZ-#NJKin_nqOf^ {2w$H>gk6]yLN8VZJcdŹݤHxR\j^PC$aZnmd jU 3;FeՔaR$0)ClqٴM<FxYj1Wu&"T:5byGUQ1 Rid>c`fڨQ$)ƞHNs]w 9< Gʾ]>!tf;k&}M20$jKU'^&j=ׅ^}za!^m H",J@ *[ŬJx*VH>TY#՛[U%h?i_ }F43nq##m21GxQۚf*HO`aO\-G1@".KM:b .4HtVrAc,σib1dj?|BX??tv.cTe ,GCɃVsaY2 M%L` ,X@Sdq1xka/,26Fe,-:r_~*M1.E{~b)E^TMcrU>mQ/rUte_Y,4bhCV~4g1K.Ֆqa8v}JAz+` h\c-7l$湗\;#ֹi?@A8lMsŠ ;dV` a%\p8N0v99e QRu5ag>W /5҉偋@rmU r9AX:NkabY`uGSs.)r  vt(rtm.0!\2&0/u_e טϗ) E8ԷO?.w?0[1,E3[o97ф PXb''' T}DuhO7S]E1z<4/meWpP~8+TȇkcH.L*jm3S_He:tڶ~kb!^>Y+;+YްgB4QDE:>2Q;BxbG~Z*_D!}ujB6TXeިxsW<ɝ|a5=UJl'7T nztHOcUZU8#k u6(,=M@ϊn;wgU[qHg!ЋU:xx_.R]rA\K[UX .@^ +WF!w(zdtlnV$>F77οFu P In_hQC1{>h 52VgXRx$6i)S*hKY2B1Z~#K 1+Y`(pI9S*#zfNRפckVnh y]zTʛseNf]*^I&JNH߻$3J ڬol[NtJ92ey<r|~zw~v7/_wSn[U@ZQ.q(VBFhf R*2rIbp՝ըU<j)%*֧C  e(6ܣs82FU58?Xᬦ.y~v3kv:EjRkLnbܱ djC8 UtGG[` k}Ax-ŏu5u }58PY0!rGe3J]́CvcڇfF:VATq%PG)'#e7FPAt/Jxn0no%bedededeV2ـEԸ aRIn89Nk'6TvcvUȽ%Zd<]>ٔz`,fAgj"tVRl_;WQ+yS1*pdνVSˈV^8fW\1/6jÃ1jeCiKiUݖ|IDR0'r{%^LdJf_>N $ec:X0;23[.!%D4)ks"bi}-9y42FL(:_#GcT(֊kz })/✎o&ɍ+41l흙_>A tri (!*;V4X$ c!s8LHgyNjC5&Jm8<۟\ m')tJ 8X{N>e?ŤD̴6vѓs O#m0~p1k#a"Qx;ݖG;Ww[hZ*% oG3G`K"nתvD(tPhe+P92^S\!8zev,Cc8aic.+krH/sU=x%y13k5Q$هI-i}@6nm7LdfeQbIL羚{b,1qx@? 
LI+J :($XxLJ/2E8XbeZZQC{ԓc1CU9]?Yп߹YCL_y~51W^ȃ+ IN?fc+W)y3X$mV>cFy\xs&Gp2=O$QqxZ݂+ӈfJ#z_ J5!o$aZccNKE4G}Elv EtrDoyho-{ڭ y"#S ˣ״ v EtrDp<7햽Tֆ&4 +fJb!#k`s ט'L+DiI-;)vdXZh%m`[3pa.5FMheTkvhҊQ]* vB32 [b -S+8ÊIl"C<'X{'ؙPbW5p#T+~ǡduſ7t X5w~OX?}+rɘM@?dq[eVHդk)_CSs}' DI_=Š'ӹ76~RkN$bIJSu1pq%M (+)`on^qae+N& G%܉諎jMލ7 YcDY×X*W;k39LF& ^_ فxR/>x17M?~{ ubٮؙWte,_I\B0k1FJ\'qVqyewҿ+`OJΧ] pTc鑩c}usqib%=lnZ1+ &CJJE8YreBqءh˜5v k`[&QTʧJRVr64^3&DZ'H;$\ R$>$y&I^I^mH3,b8vM[8햋A>v둧߬vӵv^hvkCBfɔG?YncnN3h9ֵ]k셦j6$䙋2E&N׀^AĖ{%vMRryNZxpWyN$"/5 >,]T]3};,"?mhoVX䗑/§nhc+x : 8֞YVXEPs X3&+~>H"J3}7^G=TY- ½t)tkk=iUwQvjVԭNR}Fao1B"۟j!"}2r~,b-xKG-g/򵜽&+k:O%T$4ojtqj*ьRp,hȩbiTFb!9pb ڣj$FVW(pJOJ s"K˹$Z<$,THΤ }ϊٜC|upg]nCņ+$zC=2A1\`vZ1. L.Yx,grY5(2JOpC5XU9@xAa3m2%9l;cL;o@BbXy,9-:%']KqXVh`j]1 ޮRdǰ ŭ}F]֞Oы՛7%cۮ4?ꗸ+mCz볿Èe_f24(yOJkKҵx׫p=ky-~{I3 ?\WuZv In[^1{7[}#p>NZۣ,>hɇΖCCISDLi ,58KV{Ag6G쿄 ,QZVrXRSKbi:W(gdJcZM-r٫9#3)%/cŒzr|+Ni:Ƶ'@}3W|ςC6%/.\O N~BXkbZ$wd~^A0 m)4#ݛzS0<: n{ǿK{oaW|Dq`!AX85`}sK8a|xX~Qtv$GAN_0HXHQ .L b3Nw7in#܎ !0Fw&CKve2 ?֫_ I@^Zp&}du\V?RGϚ1K~Re)qAЎpv(޳V:;&OFD[c0ns̈=AkvqVGON=I9 MG#-=>nzh$׀6ɰdϷJ҂R[%Lc)ig-v*U"'9 J;;w^9Y F4o M13vR)B0}4C9UD.(`ą65Pnd\cmRujQK'C{?r+$Fly >stcē9%t‰jV9!%YCOeW1dz:JfآA';A3]UX%>W#v8l,WH*:=5&hP ]"PUb;#wKÖbM)cŧZ|)Z[ӄ<Y@:"8OB9Uk,^U/mthFPͭ9_W̻&8oC3{Z#O" ,!L+. 
uJe@ܪǻP1j gGi 0/of>αhP (Ѕů4&P)qMU4Wt\Mn<-`1.cGbQk|!22gelz^_fÊbk#rr0oa[i"ϟtV1Bmoqpޣq~YdQ"e xq0Wbe -62 nȸv+/، P7-@PػI^b8.8,)+eSr<;e *Vr)0=/!̂ ̠>?}.>Nw~ ˉ3 @ [Px8A]1>Hx) 蛉SbD؜T}@:)!B-ɔ<cc`\`gd pq´*nnr]NɀM΀"F/QSbٻqdWTz9gΌd/Þٚ=$}ɖ$X5% u$&CK4n48v)@2F!L^Ubx\͌_^[qkcƹR wNndTM GJ{ǵ<&U塌)rT7.h끫PCq.Zš0q^<<Hh.}U.HV$7V$:{+oШZ8rT)stGUT%}Ϛۮk{VeˇUƟ$2Ll6T\]WL#X8uwﺳi^ftI:,޺3NPA~&ˑMMӠD|;FJЭ{y6/Zs" <Ո$m(~$ mޭ9H.tJSS7׌Em$䕋Le 5i@QDQbPDt|(Η<uk7[EKn} Pq Av;_\4ީ['nUH+Q/Ҩ!C&jvŠQF/<&ԭonUH+QBuȞ-.\+ms<}  G%Q37>%+6՜H.V,0$YYUdyE}N&{4]2$Fk춑k9s씹ʚe'O)/7yX|>2~J cC8;+/ ŜLSՄrؕ.s.vy1Fs0<,r Vn%NdgT7̾gD9ӖM[ ;LSό3ϭ ]yvU4XZssڞ"c{ e-ERCu6<ޓ.F)#aTWwY7YThΛkxj7[.z cBRL2̌ts peymn_BB Bt!s#pd{-eI>z3$ڊBmG_gxj~!.`ͽ JE\?vV!s p4/K]*fTWi'Ҙ!FdMt_uuI ,֑j:E]j=z}:z&x_a&t_ kuj,U/0ի8Uzu;rF}V\^}E*(1¨j9~.,S8#hSLgLz쒪@7(a9=Vf~p;t;Rh0{/l S^?Y7p8 Y֭KocJD)RkElC Cd9FŁb (sa ĺ7I-m]K6足vۻ$VvJx~(Hcs&Sid R4OWOoF6p0/l-ՃWgd4X>qYO2;g=16AK w^K8ȍ^=bLS C% "o8~T'SQeH@,CJ1\F, N RH@ꢶ hʥ2ZB2x]H=@F<\NBiicBPR`ZYE*L:W @KRz/Wi},f0%Bٻ03]=14˄Enw/8\b<ܯ*Gj:O^r_}cw]-U#?Ot+x'j˝q Gu;`x;Ͻ4hn !Hw;,cxW@4R\?x8tS]їny)GG2&)#%N$D m&؀4䁥$R"Є2d`n L ۷ –?'3Xi"gt.<wen?w>Fg1|of#؅Cooo xDq~ŷA"x{*Nf-d<`l_}8g2r.}}7P tqgrk:l[ZE]$/DPP _܆*.$~77' f!.u|T]JQ(9ujU@bV7Gr Y1R5#{@ 1$^ȅ@~l1'r FX,XP a+uZh[;s9U(wޠLTFg(@6&\Y,0Qִű *)Xk!C6e0ɔI<ٶ8{Y c/G`]o\L"v0,)'-6ZlJ ^~)2ꝺGCgC$XURKJ IV0Jʩ~lWTrmNPiW}oCpKC^d !6/ 7*0>N5vk ۖˁϒP SPqt9PU}#,TrjI$;xSP39#*كj_*νӮqU(_m sTj#ΘpnO1%/W VYk%Ԇm-UF=W6"m]5-( Fokn u_;H?`v?8^E x7F)c~l`dkw~- GfC`'MAn6b~['v3pse:>ʨi=sLJ'Z-R͡*v,q;7TQ1AC'S-0ƚJSA댃'$4GM//bx7\*gO{Lt:[tjw\rM ;Ib 3WL~F}x>w@Ȝ ¹B⢸7.kEDKJgZ_ טg ŭfmGdih؎/gn'gx} L48{bmjP`TBGUa\VS-7*]S\?ajV5f vX<sÓ?Ő}t9x /h?,kaAYOvZU؋_}3+G>m/.:t޼Ti0֒g ^t_i9UM(]b׺c2!Shݦqه'*S,<0ĶTd='YlJfieyUgYMZ ⑍Vn5꟝7v.G&|I9$tz[7Yle=TUO9m T ^(-*c{ z5ڧWWwmKp|pm7Sqh,ѭ . 
%Q)U3Egfgggd< 8AW|* Z>GE-cQ=l!/0J ]yC(BHSJCzN䯅 h<#XrFlQ7L >˅wfO".Syĕ#E\gtj*:E|d}mnX)e|F궗EZ߲hN]^3|Pe6PdNQP*%UݍBN vuX~ŐTRKgTC]\"5W`qZ ;5z7n`$dT[%@dT۝Z rsX@/gQ9n 1ݚҜe"BB!{A8[ 1D3X*?\(YMaH<'Y"I&+N˟:QGH׃8Y0Oøѥ c:;wMw=QP"JH·b@u#<0ZC3A<79l"_&XDU9D@Kra,,BmbID(T)J2m3=D$y Uܑ>/1R2 y!S/.T3mnNZ-N.e2H 8"8[虯hS=?!>bHA +Se0`cAN$*.m| $E7v?Z4M|X3cgS,d-ܕl@Op3E(i;mbsW` 2yk͊GUMF0{r2x L73m"XQ7$YItD\ g?hSXceYt  `kzΉŖAXuت0  d7 6xA8?|D#%c%/eұr,} KQO7\p|VᥖlSawC&L[28.LPw'W" tnOɯWptI2}.,D O%Q{7NS6Rh/tsխf҇ ɦJEYoߟӱ9gTѲs7ct岍HiWjѯ]͵0uR.# ۵Xq LTͰJ6}x87zlVoq겒wl29r˥-je:CA~f 7.Mw\=2R„ksWS6@3rWpg\jtW CQcRZsM'pMv%^t7),OY}W´)-{ ʓFU5!3E'';)0 ;K:3g^ݽ=yɿo߽vn0zcu"5:܍urr@X u,M i.٤ML sɤh#ܳr-Fo1(%zyﱡl ȱYWjƼ-8 ʳb4 CD( e~@Z, db?tzJ*GFcLjeUák> BSyZ#sa5zL\;>+IM{mdP;J$cn SAʳzvdI?DΒB?+{d֣}C}fT49WI6+YSbb PmZ$z!KFoGldzv_SVGܴ3xlh^'.=f?Ӓ9)}YC?zJSuשO;`}7 ˌYSFMRFvJr%ݾK,=iLl"xu:qF7"A1WT$h$5,/3fl"&By}qѦfjnX ̜F[a TBFYêMxTBtnŅޝ1j ĖpkǑ.zC *jžEKɲͅ\j5j7R8>_kqGA5eN Fꋙ+*.^.n)fO2c{2Ns,l 2ރO2l٪,C}z4Te`utiC B+zB= 6 q)mI1`mP=G?vKJGz=7 /GoToTu `qO1&(S`cAN$*.mAwDX`cJq)ePі:K'L;UP.ؤlȡ(Eu} mBA\EpBz>0z>G>BWiٓ ƙF}3ǁ}IP6mT`p0ŖDӥ%",#{)e pIR)Q). @6qU5d*UZ&wg啠1z ^x1:OC+CA- ,QR$VlVӭ۫Nwu=~ܞ6=ii\y_/mT-%B,PNbrRBR"h)=X(oa10dTdk44//+UgWKTw.@ ܍kQ͕q~eWhEã5{.@s0V#C吏ŞY`ֵKwwjԱ({2K8e$MS%KR۱!dϒߍ/3AyQ $eҽ4Ճ)LIflGۀyh7ILΞĜ=sO4A*x i3yFM=e[{$r9~@@-Ih ?+«"7ыgQlqU xenO0ypTL DŽQ%b;%lS_,il ׃abjl؉blE`>lue AỦBAf\F0@J\p.ˇ5{>p|a TP`\<Ȭv Y]d>%Y) @Cf..xn`I$ e!e{"(Z^DKibP A)ѯ+)UkBYEXzm yEhy*[s6 o=̹*޺w ؗmgmsVed%9Z[mABu-Z["PPvz *!b_PJN0;n]s"+z9ȷ+"ЪUuhsahNeY*`[7<" Vb!UNץF5cכ2V;K;$8n ӱw: Gu !#T sE :_rJBL|I;frhew%+IETSƐ cml.άxǁ٧ d.6s6]8s !6<<7+7++lzg.^l[8 fPB^؊}x ̨,gf8088kx|*kqAiZ2޳g \6)yw Yڟ ro+[oMg ChLl~<,5 ArЈgr(I٦bD MHYy.$FP8~F'xۇŹ)jXagjLZeY'!!B {F+0քg>7wB(/re *+[ux[;H$8ґ@])2J`}pII%T 8$̳}E"oq&Ӈ&24vNBۇnUϾWIZ4o4Bz+mEЇ-)Ͼ_Mu`(ncJ7hIRhIvͱQ%ÙZ ꃾK_zfͥjG o|:8vdS{MyzA\77H7s0hvٕizCvwnLƃ|̀iMRUi^1$3OȴCoݖ4Fn䪺Hw=vG{d6wQ vwߚaa󿙫폇`{tz'|Fͽ~~:tI1G>ڣ_|=V]uK(BCZ[ZpQVRB[Պp_xʵ ^Np_8f?7^NʟO߼9z~qvxq Go<| }|; +qWM=]{&R`i_%xo'n} GԠtxSfh1۹n㰗7H;6zf{c_{僝\=18 \0E)U)~)`Bec847[f[ X̎ٹa. 
G`}& jziplqƗuI5DN|7t~ yNv2 O47i]{ҭnfv6dMףVKkn"2`:V0ʉ8z{V(^=M8ty&EhUo<=ކAg [oǃ0;E.{rRkJ'ߴcp׮S:+ hv-0p)F{u\ <*ws'nn6G7~D_~8vg`iw_Nbdg7ѯYW0Vh҉Y[8Nyz!ԙ@sGyX&F ߇vޠ|Ќ3 S&L˝WӷˀY¦it%.rޙ)o!Ng"m`|>wy1id@byES; 8!2͸ 2F(?4ST6fmq+*ȯdV= 'ާ0Ű^k(ŁqHG1r32b#N0 OvKQV"$HJ5W0:8* Cpr-ek-4{j2,,+BrrlA'd7":ʙ [<Ay`8o.f=˴,,_vY(0G]PQ'9k~7i5@H)Uepaa$|`Ta?&H-0pI,* /ld֨_*^5,\7]ڴa)~1m;2Ï|fBBtҒ ϓ 4S}4wQTfZŨvs>.j B8M7L+qVނ痆 |p)g1Lzpg;<8?'0N譍B*Wm O8DHF&'i7+-aW߫'(R+iEf]c>G^Z!EXD `Ut e,3(ct #AV{ЪVAK!$8(YcJB EifY`ՐĐ^58b)lNW%?ܦc#*y0gP#0Zqg6COjqFO?kA}pI ;4k0VN RDCyTg.-ypY\ck"hC#L?`E $B_W*5?.fWhqLs0EOra\}ib l:\$jBQ|+d[VG *5o:f8q⓫7AlBD+P~]:! *]/ʝN\ҩ@iyW6-(kY oFDQ_f!R7W[; { 33g =D eϝEZ=2>]%Y!PƌuwiƺތIV* WL0$r Tk13q_ (YoR@J1$w?&l_ u'^buw!?< _ j˅{-~hK~Bp1?0^ G= O U;%J]D[#*Ktp*2^ mBJn/Qr). mDJx1򗨳/#DSSr&zr jп0gBɴ!|Yy [u0S8ێrzgh@owuj`&fy!cʹv+<9 :QZѠ4>X匱yE8F;]H{6qB-x,,iO1[&ȣW²vY! '*b̓>m- rR 7TX VA5aGjjX4_l?rV BzG9ڄꔵ~gV "~ŧwVE9;?%D76ؠ +EN᱄Dhީh)#\.HB4>j`׻k7iA[ܪp~nbs}?z7Cۺ Ř&,&[h2`Pg WVN5U~Y|r%l9Rʂ ̙̂uv!^)# U"BNtbTj\@ cK´S*!d3@BGgbĖY(%F{ VDIG;gZHBK.: g{:{OYJ䒔7`Ԉ=+!"9_U׫sOK痮ݖmk?zcevkMf cу粥O7ń(yn!5 H C՞!]:CPkLy$Qw)Zѳ(y2QrNOWVxi=6cHBJ Nׄ6drw&QYe2`g>3hʢC_9> Pȶ`aNnsojۛLa&INϼA^oxI\駒~*'VMOJ>7S9EvӽR:j?Sd*G46y7y|klORmW8g.̯^dE&.ɜ '8].77Ò+zbXLpw\{p,8`sИas0P& +cH"ܞ1NMm|pBK&I>E%ǁ}|4INꄕq(1b991X<XݶXlaRxmYm,8m\ߌrOj+/wl뚄L 4:K;6 N-.ApJDLb*ad]%h5-樲ƩGGkc|oQLbGkĹc|PšYޗa}nW;Y)HY%ŖK+#1'1HxPL`6&HݩfX- J0u*UN5>1.P.衷XXKbbdIrAz3}Sa= |౲a-~]]wx^nznGaZntv2^ })m¾K$u*|9~7˅gj{}u0@9_/ؑ\)&Otn[[Ng4ݎhNf͝ݺ eRqCˮHԆv/wh,7ǏLJŃc WiT<<- w~ vw^hڹM-𓪆0%f:|||_Gb8à`4{R Q R9jiWkM6& C]|,tE~j!| |4RErV3Oe$eh=NBt1%-.X\'t] oM"B,c}\FnA[\bvW ~I)h_l^2Wlٛ#$iT0yB 5/@2eegLsʝ1΄϶gu?Wg~iO/fN,͍*;¿ӴDs\2/ [1Sg!MN:3Ov8pl6'LfL[Ab= ;-%4b/F.+͚6Dua*[sEo=C\3G[~ԗ-^B1#Z\ܠGhO`\CEŧO8ASm%xIϿ cˆg-@rkPLu4# z(7(Tf?^Ahcufoc@w>'zvK53.Vk+: HlkG읽kN< BONa| .HYؘ xe6pÂn&lQlƥ`C9<߅'vʼn,Z8G%az$JQ6SKb\XY@LR2팜,p@wMwoGC##NE>@Y:6m.}^_L`]gAcqť]`):}M0^ Osa(BP[7̿9rV{.{]C(,Ҹ$Stj }LS+]h \(tMc"9ĥF'SRb"vP, ŧ]/P.tܮ6MȦ,Uy F~^'AJB#JDKsL=g!f. ϲG (Ix. 
I:wPE )T%lPFi`.9NaC2L fB3#X15wbK"4[;$Gn# J؞C3 ll(v~/_\Os}??}wfH9w_?aS($I(?!cw7Q Z,Fn9y{ӋtwOcwD_jAKL{JVʛLg,t%cOa׌H%7т?=>½K0z?K*Otڛ*x-6(N#GjYOVH͂ss2-@0t2h2Ttaqaw٧7p`,&N9Bt8M[#)Jwa; tO6OK1Oky0lHAex;'S(,f  By "JZeP,RK]`#d1T!bq)XKʀ1$WVhK^[߄feľV^j{  ~/)gs⋺ XS-Y;]B h/NE @)WD$͖lgܔO*u,SX?m8z3O&XBAz%k.r[eG sX _H'\kDp*^'`mN=&j!tP-Oz2w a5!O}rC~_uȫp-\Pm]0uJ 3lgD.j>z/w }P]mxdIw>]NȖӵO'2*c̟-:sX~,i?0L`1GČ.q(@,@H1Fx NR&kTW[l&ht,r>S>p? cxu&.oJP#<͏7}Jkj#`o'f)|cً.(@SЮҒ'[r֥~Rӏ7d{37=_yzа?׾;ԛR1EK< ; 4Eu [=یJ졤./48_45w`0$;a;1Or6^M՘gbI%i7W;ո 4W KSñmܚOR1MW[' .h JzDP}3ɔkU017;SQrrwZǣ5(!(XmXx u Ƃ0sbݩ9V4ZSfcLpiiڇHb@ `!y(Qvhlw%vPmpHz=T0 Ńet`$[GlY SE^zHNDmCJbƉi2RR@= EOk]۸̤erHEcL=\kqFEbn˼_ aa; = $fr4D8.?Ŗf%uKd72{Mgm/1e5C8<ıU7l[VRPw<=myuU#:lh8f<}:V݇j@/:۷ߋtyuP!Srѣ'䍊zp]֋qIը  JS>uTO:$lRt ѨۙC+j>gkpOV~wY NE͋),^/Ԑ~SDZ)2X䗟K#»S\32PؗBDZ"jt#͡)"-HhnD׫Svo߸PyoGѿ^tDΘfkL/" + yaY`aB19f_syG3O~}þ̽G3)WZnYZ1 (U)*ytY CP!Fa3!Gg Diș(spma\9"6'.xQ>1\Gv /ɺoLjhοpNe<0ĤǟF x`aR@7 OzVzY"Z=VE YoU (j2L3^W{v;koJ3E9KpA<1K03Xږpk<[(Ee֔ƛA #w\[FQIT.$3aϕ v+u,g!^m$_/mꪰ,~,wSY؄Rަ_x9ݼ\.ë_yxw3^!yKiڳUU OoW ävk쌭t{ qss ;`ʊ[(=bUlҗnd , M쿅$c,SiY5?{_$0zXtGDn3*Uoo# pva JdLUt:Va$7jsEp(2t^ﺣa5U{oKvī2KpYiW=a X'XXu& ︿~mU=^W^zQD6jPg_e+妲p_n3NuI7Lr <ǘ )G_ (JEI( $&}RHp ]qU %3yf+ [ldi%tFSCmMr;rW0C>G2wYg|!82k`Q  $K%Ɔ7d4eRw4Npȥ˥h4ٺP<ӡCHŘľg,kѮ鱠;X F%; -*:datMc%BdR3tM{\+&l<BcoMaaS<͏͕CqJ< PkqP)͟'; Ƅ'b\AaI#0Zլq߹AcrZtVl/ﶛfzyaco}TV~jz0(+KzO ׶e1K&Oz[o\a`#T$akweA&?}OR^'yؗ0n/F¸ ū+X;Ah/b=*8N4 {&47镻٢̢qٞVyڻ.guВs!C^$Ml%=s(B*J>03Ǭ^-gs#t]v޵"]Vk9ѯIn j/u1MS{QJJ։}ǭ d=FRbXso묽YBX(e.RJe;@06z2jΒWmoRk`).ק>2$L&Q,`a*'/t~٣4#W#6[*;|R o[mV7yG xRJ ږGEv.`K$EX: 'g^o`ɬW 8ۦf>O'PTm~?:Vd٠ݟ?y\O HGqr n랯{fH剼#,%?L 09" %&&4"c $(E&ǚa/3P yy) 1X3UISRMB˝iڻŠm$Eᘈ2Z˯uO% >mTybAk)W K "0閅<+$5]ICwO[Pby흮w^C^_!a߼ҽJC4FVjlZ"N"SUnI0B!UC ތ*%"z.\px ;lzDvX#n =G:~^uct.!K(St|'n@H+?Մ)O>y#V(w}?\*Nn,J5uHQ4`ni}!ubIx6]# h"l]^x!vmuưd9^L [ZH%qj0J2+xꪻ[WSI+a!d1S]! 5QcF%?PJ+岦joORQvwV}]^}[[UΉ;^:97*"xk<_m#ζɌwSNߍ.}. 
itmw>3#NLvZhUTOd!pmoS+N>1V+T]\) RŘfq|ZhHJžd͕Tm[9'0sTqv1^'g('N\|IOK~Nðbd__Vg^7cʐJll+cW^4ee)Z۬7ĺtqXȡ[hqONa^7_'Ƒx~N ?=ZňQGRJU{Iǫ=B055@n9Kb[,0$5覤Wp t"Ig_W//9TL ̨d04"g )-2+Rj2*"ϥM<azܻ.z/'5RG;[oOƜdJv_ 750@4niKښxaX'YuvL7B'K!TYDZ~LW?Ymfif)u aKn Џs맬غBcn9:&b$cW2?C<lHn~ /x}ND {8J=fMHĹPVb !daR(%Wr KaC 9 G/_Z](C/6㏡P&z]-7"NqotϮs852 ~'9L|qh,c(͖R3A?5A0S9Z 2 kUٜ)ԣ1m~Mw ޺)+4+Eyuh].2]owY{g{h!,IR\![hj\ccC r$,ܭU%iN2mLY8s쟮?J}Bj̺p!vrvݹE'=M !R,BAi6Jx *O$-2Z*D9/̱Ts@H\+Sά.6]0cZp}Qz.D~ۤ`a1e١Uӳn<`@ ѻկ\!NT\=-y/X&^ol7W/,hDf?|xvwPηw%xπkQ?=/XqT Sˋ2>\lOK@ ״' FS_ܿ[-ױ@*BԵ%̨!hon޺F5 $9rbìOCuFIMԔi*975$M1s0 噩1=g#&15To)QSe &`AY{b($ 6{x`l<+c]f^ " m fO m]y9MaIM{)1R*"DpYA7h^xki)c`NEgϛX;ӇeQZ./:άlB\ahƞ6C՜.=麴cQ*5 []X hP+}9_-T$7=S,SA&HsIgR +3#Sd!qg ( =to0+^B2͘<"ENќU+jֹ 8F:EEEN2= ;ʥIn7_8/^H$7aU8m4(!fBWciH 5`M1@A^#95Ss-k 7b: 2]rCn J!@* 4s} 'r}I)'ƽt7^v1#ɥaRahs1n`j`G(&+ W:COLO8h(Op(;h_o(QDFO-%v) D]ĮY):4$WcW[ܻ.c?/N1Dxf4nyIo9ē 'd`*=uhJv,ݹ"IBOEfZ4;W~^9P91EDQuxMBrm{#RZeD)-eLO5Dv*HDz r*/RD haBUƚSFu2[XIyk-cˠNtG)quRe;`j^it^Jfٻ6W|z ΉЃ:YM̱O3V)a 4jvY l +<(D/2R8l:wG%IRpvV|ZSiGʼ3S0>ktD_P#jez)шw j*:F3S^_'ѪcD׉@wl*R(36!ZRU@榟я &ʵ _3wfv(:q0VyleI$tfܴu/7AX0.q>%%Pؘ؀ O"RK(qfݛμ+#O)ĶAu5c}SL 55׍6*ɥkc-ڨc3!xrq^%IZ%hH)|aMtRB< D&ٞa(Wp)ΦU%z@5d23j 4g"©4 B0*N*yq)~izU:DBLT`A{}5[VuF:N^c/ߌFʂ7dAs|Ppݱo"g-^m:nm${ІZfa!ֺgoMRޝ,MM4$iQt'՝]k/%8.{OVE3ر꼘Et =P$cLg[acȐa<ܯv2tcCƖK}]%hB7qA&[-(46CCS:o<{̱#Kq# -p 6gPbw>.OF?%m{ܮ4WprӽqmܨQV^Wep+E+5idlC9p|Zo2ookŪ| luxs/-j(qBcĊFjE:+5t$aLj|ibX"v A~5^W&kEG7Yb(SgXJT{|MP<ݟ &,J@ps| JAc Õ=,2AKq1{g4 h'f?߼O?}fut[QbWww^O&&fVjdi4tmsbK4u;o}_JnpPW֏oo-&A̖9sE"ScWAPJڽW%Vj7 JgTOj.L^kCi$|{Xji`B0L3 LC-×^1䦴Iӥֺ|tLS9x,8IO(xi0PS/#R Q6yFߘj69]6=o_n]#Kd'݂d>>h]yZeUx1.L׫G]!igMk1B"h+[u|}a\6r]/l7ʯnq@"WycQ6ĥ~ֵnSsc|6=amy4}SoraG-Wñ7 Dsq-}SS"v"AhgmJ٣GgڱDGv!ݚbPtw;c:U7ָU[6]Kois6@džsXk |t K 4Z@"ڣN9W K=gaɡJe1Ƭ.+Ү_~q){6e%c51 .0j7zvvFJO9OVI{.~0jQ75^q2op^&»m1NCmЕ DtT&jǠ]Uh~)@G_-TWh_WJ@Kºr$%ma}]?cy]|{Q:sʲ+?s]lkt]E]ӎPTnc-e˒hqƊ h㫯>fDGoc,K?l;595cò[R@&\>(pFȗ =K?`K,[,Q&8DP`3 8 <FhE2Pk%|di 1$f ";1kG[TSL`H(AQЀ;y譩 JD+rwl Pg5v>ɖTOWcUM(]0%EeR0 U ww>V;&*mjԣ9 
F0-beJ`k}Hn:Czy(YN>.V6Dmf`=c޳8Dl9awƆw4ʲ;~1k&VH}zPJ Ew ~C]U2b]KkS:p%{WË(YCC= D!inÌ'qcAQ'&`<`o$BDt_Kk 7޺j""vot=(`TYr{w() e2P$1y:zNR%Dp07.u.Z"h\ph3~i dOh a^XGy.y#"b̃s Jq(U9+T0̴W9J -D/͛mi4+m&PrIu `լt;}ɾG \C iTqR5qB dL/ı>od҃]@)';Bf*~n8>p&=>Z逄+/bgK;#]*9δ2H34 K$ڜ{$4dThT0T&:)!1 &i subM.I ~U/KM~ LvAEYbΩ_{w?'p P$Hs7EOreq!jo"Dv+fhf 9 N@4OSLAȸ¨S\-~| 8ǸojW@pNMi(˜Zĕ?,%"q^sZƹSLŃkJG񪴸bYcs9O

]A9Y=,tEN>\S s tFtϑERvU _s B+ODTA~ DYFfuJDC_O d`Jͮu0:5g-!\쿞g* Af͞)SCZj{/Ui-y>Rqn.PTU. c (y**0Y㨈 8R I$f8޹CQ' Rp2&c>9@1/(sG2jGD*/ .#Q4)K''ԖhH[I}‚6ZZTR'PP`ePI(bQf[j%J+wb=o T36ZnhQūd1m-7R{Co>Y?xz_ ?Dަ'?)'pY|e0Ӵo=QDqa*cgdU^4/w4%Z /ַh 25Iɼ8kVkAM`A(A`ؑ,+$'*q.Dd6pu`#?!>BLD N"NzK'P !fZ˹]iB☕`D\nJ2J莕 #'g~gwwͧ'1˷6O:w8~\t?}xU>\Z> {󇇿B@IJh"q);Ownf1cr|{ӏOgԾ`0.9?}{V|¤ZoF3\qB{j1 ; \r.f4ߠVؾnv3Ӗ]oAwo̖W窚uq8J"xI7RcZ,E 5a4=J.@DŽ$eIBR,`3&kO:ߝ otB$B8]jXx3@BHeBcʁS5 " =I>8( ]x/NF&5!,g̀(,ZAd>&d0#IB/AQy Y,vzlm?n#n(&)Y$uXY5ѰbVqefQNqezK BmA NƄq^"YfG ~)&Aq5UhX!@5h{J#.meƥ*)bשHJ6ծxTϏN6 eeUsCpeMvUeU;wiY-R23C9ScJz'Ʊ(<c-ج2pE=D QhT'!㌬@6m =Kh`63ݿ۶]Z VEԮFoD W =?S3joЛſg;>/,lk$+AYzA 4ƆLI͏pF9:S~j/=9:RSK1 N!(ZG-AC\uv᝙aCB!ZMOo~>=J"ds9 7`G^s'1B"芊E+R@ccPjx952Rqp.rgApNk)J,V6ڀ`l9y+Z/f_RLρ!Kڙ>U1|WVr|Wd`f4_ji#}3/ n2hH!0K!"n x6,uD&tbL˝swR0Lh֒EL8Lgnsŭۋ~~RI}{1Nn] Z9rrXStgu#/>8KQ?r;2BH[ 57n}x{|sܪϸ0ØȖ`}`6MMou)ouZ/۟c˲MjzoN^4>kfS|̥YRe$aG(l_{8V涯gEbpYF՟98/Q&pهٯl5 62Px^!$/+p`D4StgaMsn Zd*5)$I :\kT˗wcCH-Mw[BHw+Zeug ^vpuf:zlN(aL&t!ŢF`"QÙB;Eڙ͝& tn~QZ{J__i„b tz/j瀼(9PSouPζp)IVaӚTV庇i(v$|zn0ߒ{5#-}1=m80 b\J a7v.c୍ Hb? :Ł^?K%PKSLr^#q;]kCYKfhkY{3ojM*ZW){bI~LcI;ybU2_7!^֪KFogV;ǜXme?) 
2p@n`-qOXL.u_П&Ɉ>*/>l`-QOXRY޵)Ͽ}I)ߙ7MD S38Xv1l{vR9ϢMuaO& Se cZ5WXo!)CNƒ#֗t*( YSvO:*&jGm*}(S&QF .*QEՎdyLGH}~V炻H42h\a!|ń9:L!=\ϣ5}/VSĂ2;qׄ`Vڅ9 ]wwoܕ⮝HWߍI7o&te}k=$ oؤe9Iɐj~h&mtHB׍Y<[}\渟J!=7ހ(.F9ri0XCRRl㙐!aֽȤq6N !`$b2A ٨Q(UX3G):B YznJ8O^Ԝr)֤7ИU~uMVOEI(B\λ̀<-e픱MNN 3S S?WWjt:VNen`&h}Mʙ;-9݉3VfA5b/w$?(XB8S\WP18VlEU;]hќ׭QAT2tģxR=' L |~Μ|9"}W}s"csʽ#8xI" ype`F*g7|~#r'Qk"c$hϴs R,qC(Hcm0Q,jcE(ݬIkf)yۧ${Y(.be避Y-z h~|=ycgy -Fz7 ?|q« <i/?;ċl~ϙoF>i"̴`Ӄ k`dfKxv< |EP^KĶHp!8zc<:<'ӳf=[eHRH4"-)ja˯%EK4:j߅ڌgU :x6SHR.`Gy!gLvƆ۟+Jd=Wb;EK(:U_],m+ѿ)ժc4^y**Xxpq>o'dt]ɑlNM3Eҿ6-HjZaP~ݴ& ^GgD cJW:4#}~>g0)9y1Z(v `6d{L+Tn\*we&^r(hDaHf)Tݜa$2bt+O" xd] K}LCY ׷gTlf`[޶=َPlY٬Z%ШH0-$Sg'rsgYrg)[W- zW؎dF0LW!M'%]Lր2(e!k2u`=D`)> %~*Mi%^NGd淡IeFYK7wt=,؄銁Ѩt4ŽxģOÂ#4)$sӏ!R6u(!df# e3ηc)pe22.F 5ٺwwb3iGx X P_v!)D }D;⻐({W)u.u'\ 4z['J r4WJRwC$L6GLV,^)ac؁џϛLzP)Nz)<;E Db;BJB<7JL3]<鄤3RKA*bJ(r(9iH:ePbBQ]xV_1/Te ,IDQJ,??JL^jUzΙWpc6-2!9#DJ,!vQ l?̵ /M6 J Rr}ǻ((Q>R=)3!ܿ iBиIţD+s*ĕaG3FhR(m&iS5AJb.v2*`$ K+)1BUvI/)uc6zCh[.GICF#\[n؅rIHy>AWivg虹OB^JyehcP%'m{~дL^JU:Pj,&yX 3| 9"5S{u=1vC8ېȐ"=l3rbm*K(IDq+f xɈ@D#ѐ1Bens6{=DAU+!/7ւtG W+L]z%Ǵ+rn5XCI6o_=jU{' u` !Mc ),ON]Khq@k a%p "tA0^+%%.Gaߪ $\ꊔ?ժxa(⚠"LYM.]H тٚx\%VvXd 'H Wxq4DcmtI΍( !sNx~G.1"@hm3k", XiH0j#bQP`?Պl~T;Z )71N!GvNunIO D$݆vT".RC.lhaInO'fZ\=",][o[ɑ+6CWwWWw" nA&0jsBgjR$R:7ʚ`@>u몯RL#UvsbrO@p,üOWe,.'dxb9~#L¼' 8ooQsJ.M4RC1IU`ޒ ;6Yo KPDʏ3e)wnE'/)M}&` .#L So{I!laX S$l0ުO0a)8y] %xZMFQ$sxbS0/L)8Vz4ʉP8j#Fyk e+YBeY-HK1L$'20ҔS*⊺Ƙ3@` La0,&41ŞR!|.`j ɺ^H(cm2b$Ju61^u )-E#c#4é+f(~hBQNI*FY1l,X {e`/ ҫYAbe-t,l)go Ϥdk(xS0pd&ǹB(af8R$)`AzLc!IqJ0^6. 
MF4ec B8oHpjIJq|dżT3][}4ֵ߬v~)Kp0f!|۹_?_\|~qv\,^;۳l|kf j=?2Feԉ9{R%2/*WzjwNjՒh$PA8g+1t51;kȀ3TBC&ߞ%Ǽe=҇E|jZcP722t2j5mS/꫼*ʫj'ٱzQ}EpvȐGan&ˋ&/EŽ I$TEѲ=3ZNTo%*'+|l8c.<5qԪ{5qޢ9:٣N)֠/Z<[ySҝ|/YU`kjߺD.p`HMM`SZyҗM'/=܄;?}]\ .ŵdLR\Fe{Djj_zv>Jc:iZ%) uT}1ϭ]"%iϮW?FapzݧQA_ X3}għ#-^O('RX,8傦X4Ne2zrY+6u^QA"JaDAm8r, BFs79BG2Q~GybX؟t"/]=/_B7~EenyP=^:-[G*7/ž~&CUlXVB^(@g=8NG2LG߀j m_[Jߊ~e]~4/ (j1yCq|[]b˂=hPđ zyU>;yV֫UiLsCĦUJˇ&GYnМKNyXZ\?՛5_gchoA;2gE>g&/?'uM76yk5 ڳ{!{,X!=ǩMR1zvqv؇Kp)l+5,diʄnCwBnZPBnq}VN+4zÿ@eZ,X<6O x"'PŸoVy4ݑbh3_/o[SaOWzU)z4^$ZZBJV'P|%,hct{ރYŶu&I.y:MVGB\MP{wˇF 7+T:[0t1C@1d 2ib Cjy_H46.C}L*C,m%6!Ч?,Al[ G?݁ X%j͂noggB^o J0PX]>T3[~qng5̗ۼfUME 6+UjhO/۟o?udtfge{rVg3ck׻W|WW˫[˷o?gkIc!5m<~撑r6զ^͵fUJ#dڌ~{[`ο =c$G`.hjţ7*`'%ok͜?j[%Ղ^=k.M=XB*9 G_r%(%hC$!Ba͹xA(&QJ'71, Uvv3@IXbNh(ȤDɰвDϵⷍP[fшT:U]içqo -Y)OP\GZ/n No3$BFZx"#Cfh-lZ'ZϪɝx-NB d)a \eZI<N608_MTW Qmr+:jevahXnq(Ot%2‘r+7RO]݅ZJFFt%Q4hD>Q;0_q,I?2CfeY7jjf{׫ڄ^#'ǟhqwvֈVKWp1 ZM}L7e]gqE,i3;ޑƷ]?5''5<^Z-sYy&7<]3Vh$IMQ6 %C!%7Gu)i: vɫI<٫nH*!NIWlB"]Q_rYgeW#^O=I/nz};Ա_m8s\nu:]62PBe.]") *4 q%2r蕍R*ZIbq7Z(e Q (R@¢*A)cQvʱp"zWtS{y8v"ΰ^eS᲻BV}kt pVp!v.M7x]] ɘX/$uYuL3@k-DɑK/!VS8iwV0Vbzx Vڂ?jp`GvcQH uń-A{2d2*2éFHCod#J3#ik#E>,1v%q&A^0x%m} CVKVXEVu[c\un$}'CֆpgIQdn"@zp+V]jO\> 9*PY%*w,u\s:]j\RA5K! 
Nr捗k6mI|Ƃ @i@ǽZ|"ԩOBYr*Lj'1 (5.vii!Ua܊)aU>a#~ZlW{7㒾ٕ#+iV2@jA5#?j '+{?8V/W#KS]Ȝ ,S@Dqi7vpHNUg*s\QyܝnNÓ /7ut]{{ K;wyIKLc{0B9ʄ&^>=u]mVӖ xXOFȭFl8n&E0uއ?\ Q*Jqe RH" (; w*_G Ԓ7 AUpOp+CSJ) dӻQR 8Qſ~HE{[N6z1WO\ԟz-̽N7OOՈγ"Mbn޽URI+Bm(pl:K͢ڶTD?@ݘ]QI8!<䐲=ɏ}1ڞ>~v!'<'bn?&Dc#+^=oc,]h&h6M|qT=q$LZIj+͟f Se-esHaJ#8m~r i}A3&:V싑k2|2}v51 M?+XcdE Y.D=ЧubkqA ?›] 80Fm\ki*g߃v߁Z]tDn۴oO|Z.yJ8ϯNY賫\w\BvC }ۗ×Jlͻ?ߓܤQtܰ 矫L"9 @Ѩ咋ThMr4ckNϥUFu tt{ ,ƒǣ|}Xz6,]sw~Tku8r辞#"gƏf&b&21|p2ˀ]ʧL>}ItҤ5q {hjdce)\ė\6}F'ȹ ;ێ@NFT[w 9ɿB=L.붺:;u;6!&?oK 9~0֧zͭ j蜗 23zA!jKrk:=DyNwk'o닾 T'>N'@ OQ Rۦ}m@t?'FW(9H޼&^̃%2b=MJkM3אcBIӜsTJ-BLF7Q3*rnG8ٸq[c\-٠W[^BѕBd|[@ r4_?|BB.MҕRh%H)/K4:D"ܭ`49<:e!6 }a~vBDˈ2A]t9-(c#eJ ) 6^43/_Q7]N*٦D:G^H&aESj^KJVC4dM.N &ygb83s2L S1V>uUa%rn }SL@zJco?-!i5cyec m`A䕫 }zP~XF1ayn҄xaqP2*LWfWj*d)36_WXo7H77jamK/rrObÕKv2nsW^Z"\r, ɧr 3-Dn ]qWig=הfMYC*ǺFߟPQn6( ow!#gj٭A!ZS>71>؁pvAI~cv]XnvoY8 Zu)%R퇪(j?>9TD529֊w+ٓ}x 8I5v.εqHDT}RÕ) I `s:x'i5>C."ڜqFFJL[+UAI#1A -+˵ƧfM&s!%O7b1(y5bljFl=\ Jފ<% =*VDq`e(V(#st=֮F9TorZ E;݇W_m=e2m=9߷~(*X)WVE :u)mSi0HPxeU)*SR Ls3t.}ț Ucy2_exڕnNM^XTyۙ%CSީNw'GCYm@̕LF;wi!&Gby\7Q"L$(18U9X)⁾Ti=Xghm9˓teiݪ(Pَ`PY'Gjl/;:@ c4Rr6/&*Fܱ,>V>}}X\+=\zѱzvF^>owc<- iᘼm'TcqaNY&LvOgf6 H%vwܶX˲A;9nQ_iJD`gsv3QScՀl!yL^@᧎Rt¹Wi:J JmP`;V໘ud2 , ׄ٦ 矫U4ָ(9_>3\85(敻3W,<Fz?]vNj/(J'(3Pҁ@| H)M7xD[b6N-s">pCzR5C{Zc|9 +#z!LGXG_Vڗt>iژSe$ԥȤjM2r%zd7KxixXO1-5:Q/dhd%n+{^%:u2=CCNүNjstvvz_]o9rW} QŇeo ${]YJ#ׇSyZ~MKR^,VU;;@(>W!H`Z]7-5G#$_făp~v_MoӜsƌ%UEҍphF@f6lI\hTm>=KYsgL_/wgP3 f$IYu-ni@= ;˻"@k3]9UtNOX6]Rw!fw,4G戬s;d ԱTQEqLa{Zd,IO#uM˱; R3X" +kI)sAM9Өalݤh1^kn q<-"7yJՉ5PJ3#O ="5J{&^h/=Mg(܎\2U]rȑzc9m޸&i& ŦA}TC> 9(W"* S Zm!1 9&J|.`R1iev[]Hlk`dVٍIKsev+ރ/VqrC&-ȅLZFSLP/e#]?R5UUE_,?o/?uvM1mu,tswqKZ {E}.aOqIÇ;.􂔷F[CtW?_KwugOXu\Sd>VZuTM}!VV/+n?2}K5Z`TUm]]ƁHu7ok"Ml4s>j3Ox>@18e擫|b[Z)^j ڣ{×Eg1C;0mUUi|%X mA+ G03jFJ81CMse4U@,2.Olf6Ukh,`Nb@ԡ5٨ffe/XVoz /'Xk]%V;|LC^hנj(AERq B|ݨ6DM# & B K?d C? 
;@qWekQ8X(5P{_ (xѪV ȄoKToET[<yC AӀ)lҲr͂ q) FŦN2 ݩFe-Qi Hi0:*g1`@QEGu5֔PbrK̤Z.t87/qy{B>'ߪ/;mjmMd*,XJ 'QSHP tQjSr?w?.h|'L U`W@Dei=WVkZ wx{Lrۜ=HAVM.~`մ+×W#Y :/_Y鼓5͟kŮxs^00Qқ*ӌCU|Pj;F ߱gLÂq-#hsŴaBς'I.ZAF1 yT*S;&4kkQG ȰqQu"0>u{Z%{ =DhpEV3<@^orZՍALǥVwDa2ҨۚK&#kUՂ3@ )FϹCVx֪QJByQ(l46r@A߭e#\Su5!J~*֛LF|6L\i*֢h@GbuTJI5*=rդ++1 _{kko97"b{{r=\Z#0bUWX?-۫S0ҝȫ[9)i j6zV>c5Z=Ji9Za`'i5_U)VW\W(d[@*ozÖꑌ3OrgsdZ1/&ߟQkulL|vy$i~zߧfA6 `j p'/<3j!->yJVsf|-.tyK(-w m5Y^:ڔ&v~2mB7KH8vIND_9'"l1;sc3"{ DyW7},7 .g)r|ªV1jq WIlFx؈Fl?[ _9~'L[Zw{lӓk0iH)fv6㣺}nr-0hz#rL\p 9T%A G™$`M > fUE3 n 0̛V>Rd9UK>Y{i3B 1;_? 6aùA|PY5xc_'q:Վ X]A)v;@N!!T ds0l蜤NY0*dsЦP8,4Vlo`YhczH :,PV瓎pݛiǺZ&+ 28)ȕ1_]S?Ÿw ~iNsjgY~= X_pՉYaaO^*'QIwJԽ*%::PKaAL_͝C:F }r'NGŎV1<R>B3R-I \Y_эR 4~ hN7_~V"6XN[)sutM;Q|Ef76s8|6%* ߗQ$,f|?ZF83~[9j@o+J -QD$rvo*1ojZySE:LBƤL>MzygByT=>&؂Y4 N|Y@@+WMڞ8VZyb425*JB0My#{%CмޛJ"x\9hxl.c:ᄎ/[Dmׯ??8yHf8Gyүt9 62L[p 2j:֒)`(FL_XxRʶOUFirŐ e868|8U!Mjk؂ tW{!5g$os& 6'dHn8ݍ{Z 9~:=U]_OGCt*BEH* b%u-wWG>kq[RyHe]ԥ5cyjf),"有 ~$@C^ngvR&O)+\4cd(#߀"(U YpFgq"87>˧Wr9gqzcZi#0D|$NCCm$GjmBdp08w k#VAЂR3=K;rB*eP5AlrPQ)9ASVi0$$۸DܼHK+3WR-%S5 IDa&r2xW3tkͬj_FXk@ 3-]3H%7L)!^SwLq!y !jMV1i} uTȴ #&GB Sa~CO,0 g+ARV|rBI0v*dҹ' jm,_Z'|*Aq}B6e1u>V=RP 漳)|Sek'g!36bQ' Av#R߄%|dz d,]o&3~_1Y49 ͔8S5^p`0X͏OЛ HC5(ÿ;ö6,Qv}rF_oQͲ_$f˴ @y_N##1gKP!ى(&+\F_V d}%R'M;OOd QQ%i]k֓ӫш)_rޑ855EH:1Rc͚1PуjyȂHȂEM GI􊣊Q`ؓ$0rV!2eiM!FCTXgxS z8j- l?Q`R@OeYoN=^ ƻw>.e?W !{pֈ ᏽe: Io-EzyoϗٜNd\N{Ӥhwqpf4<Ȭ.-m},qm=K֟ⲥ%@h/Rfw5ޭ rLgx%DIFNz3݆DMq{70W93J)T9whwBslSe.*pu/6\>@kfSa߷zTw@T Us]zeyke:vmb'y8姘j%œ?Ѣry*ф^0/ݺ 9tWgK{Ǣد$*bpvn #dNjs pnȾm!fV@+33*6NUL|3Uc.lݫȩx/WP'}شęX5Ivi1yZO#;IOBEfA?{WIn/<ZO=`NWtJFۘdRVE&Y)V)+_0$Pُ)꼢5ԹHj!5e"\Lc B_]Ffj̶ ПB9P.g W1Q!4[AÂ,Ͱax|U n_;jazv#RK^˧t_>_V W=_K^ݽG=ޑF^$47'>z__>A:=FgL0p,q_"@BH`hoUzITXe%QY8PVJVԺ'P27k],I%GS+B*m*bRFBu;lDbp(˵{-ڃGшNü3.4jUYJ,R8>o8<}:Fuj^[O礖PP@Jzz<έ3`$~kr׹|L0Vې0">Gteۉ6i!>7+NR `YEʚQZfxm_ytR62s}F[&1 C"ޱ~>ֻ?r[5Dn ֋;lj%xHҸM+\V$y3R *(Rف16xj5mp{]5Pdj쫇eAN}tY}ad&H2##L4ؠ~2]Tp͡zFH@@SM7lƯFɐbJnM󪪠(ܿ.[OxUDP Y.$TSOch{# Dّ%XuiY墷?Y:6&ٴ4/?fY}rjXkdhD!B)ߗ 
LDHTRQC9V??D3*enjZynZa!մU *%B{ԏsk9 9h7AqMAkjKQC)JR4WҡF-Q- [* cVDV]`)eIui폯vw6ws:2@ka(3yYU5jQU\e$ L*/L摲6քEVYM> H{_AhhBW~1[Z~}O|d/@%Rz愡g̷`SbR!(o 4Ij|kF5[RS;)|eO„YAפ;g'5b9H:F8XO3\Q(a#K&EA=8ܠ`u{Uݞ ?֜ ‘4AAw`a L 񥻭zzpOF|Ǘ`D>MAz2)fR+m[{Gnw#M8p 5,@Hkګ"I4wQ)ƶN &̃jv.PTsHW,Lh!%W 5*3(re(8\/8݂C+'2Otj MVKXpz&.FEWF=-\iB*\i'|v=],/_Z|/ۑ+|ɅªfЀ&ka.VPi]9%BjQW8YNbWi領ގ |w"Nٗ^(b%c_6BJ)Nym-Li*3<$d!JwKPVrUIjjW&ȍ k회ki@`Vc? 9a"C\7]X=<~4cLi^rqyō݅$+@^EN4z7ңXq󖕲 eݦAvCXHQ=c:~Y|n}{;0!75n߹ }hߗ&`1\3LgȆKq ψ.rI "޻E:)IUe~!x_ȭ^mJ7:^^ȭHt!hws5~ہ5^""cxDq!]Y>1D|:ŵjNRQtģ=|1~;cyѵ;3L؝qB}]Ga!S./_~v;}}X<_>=q7{^/{FSW-ځEQҗuv#C lI'%uC?{RqM Xq05ui{s!i5.bqߛ,_L]''پfyH>}k/ ~|Rs)N|ҋ,Ljsɤ]ԈaoKRv>x9-ˈ!aB.fSDͅ7>#kwa.-FHS٬?C:ɀMmo{/2M۲%pqeb*'L]bʼn45P*=w38zFKrS8daaR eo:*' 󘔹?7Qs ~;rn9h y}Ӑ:{ct_馚&Mf <[N@''rAztR<"Q3'ѫ # )=d96ʣcu\>vHH8f OFS嬮LŒg)˴ײkpWCqCxEV,cVۻoܼ`=0ƇTh^!岷 P)0Pe! C!xS.^Jb1$_.,DcqYzΚfÕwpr99Xu*,[,oeo9Buu"f~7}rڪtv5 zQZ5,Ʉ"sE\~^Y?na]?oګN2x9[L]ȕqK Pwj]֙4π ifH--{7xuT>퍲~巪lI}{a*+xNB/؏AQ:LJla? >*5{< Q? hN'!>(89-T+>0Gluw"-yjb>5sg]~¡>(T bmɒ\>>D-p#!g>-?ue:kvxt/Rnw&eϧp>}Pܑs쓇:cꂐT)զKh 5j.zw yD-ɢ]q7 RWyK0Y=F?{ܶ J/9ZPs9u6Ǯx$1ErIȲO*  ؔ@LO====}¹)4TxIhKy,d,3š-g\th!>U(e6jy'eSO8#Dt3 (% ˳} ],9oG;FrYk4V{M 2Mi[֛UPe{RlA*(qfH이'I8pkp>rucwtpU \P]"gIton>j] Mu(1n_hDF4FJ$Ѩbh@b{= 8.8<#`v 6.fY,FfAX8#s&#(\B M"BMPI1 c,˸z^c1Q aM}1Is.06b:PQĹ %1Qnu+/Op|)ь#ͮ'Bc&͉T01xdƮVƊª;#-0A{6IK_R p uTo}l:dȄڥq=UԹ(Î/C,h{&\bAt-[n+<+bU01[>NapvQRD>VC_4 A3}_+cyچ,Zլŏ 3K""\!9cHj^K;o #(m(j1 XJ\*,67U߭WQ9ͽ*{%NB@yIÞ~(g:/w^I=!8Z`;4թU34ݵWĒTHl$@ bFkb%l, S$"d%|M5A2 }fvv6I]k{xp:OW\Tj~Jjqb#N{?2Ƀvf6qW8#ʣ#`9؊or_ko/r^d$}fÅh66PQķ^嬌]&}ON 7(H{nW5NO< `?/6WRI9Ǹ8AO'I,Y}8L4mp]]pAPz~q%A !oe>IJ]dܰw30""'Pq, LhqAK<{ƸlhSVR>6R &s{q V~( C8[&Onʂ@iݹ5~a;&9p*BqN.Ǵhm]3 Sڋٱ)[$71G Ω88B!`tK%,O)S?0WSm ʳ_IdߠJ,,͜BI7&W'#HG:T b%)hdkdpB0K,f"$0FEf301FX14, fBĉ`Xi ج,1F,:DRTƌ~q<ĚT+ib Lʀ$%b€(RHN&g uKLw26/u`_E.a'v[nr}qp"tv:VG?,>wns{{{ l8-R#q# HXS[M@j8 aCU2Ԋx|Ly\~"L]oHU@$Os!G ȅa4زI1fMè㟛1֜wo*"VXB2 SzCCY,(yIְau$'q ll8QeGȅsU]? 
pm7QUȁE$""4!!h$Cð)C7yX_j)(U^zCՌdˠ%h>Rk-n|2j k@.3}0vZGqW$VrrL`Tm5j j4'g/P;/ż$ ~R꾩P u(CNsUՙlšX& TK/AV|_;ZQ |vz%fӤrөHDu6ՊdZ"$^րgGo7\̡}MEqgI}ӎ6N`O6v&(ޚ$/.dJvn"ڭ*N1hY_L[[J2%h-]2͊ϨݪTv;)BiܞV>fAVEtK VtUmy[i]‰ۑ5Nzd6}H\G=;*75Ûfv=ˡK{z9 +6y#!y'Q}M5»,&_[oSnZlD8 qh݊pPQ5#PbMަ9M3TcrñkKzEDϼ%'29qɉ+Wr,_X׏֥|eS\h:\sj:y3F`S O(QUb ";UQ[k]Ƈgg;ja2?*!ysQ˼-/zNꢖ à+.eo=|RYek\ʱ}J\XJj4-2\=aao I0`<[n$'@Aë8>oT7a˜DŽ^Ծ~;rgMogλ^˽ey%XPry%Ė#d"[2Ǎ}Ϸ>B΀U$[;Z q54('.C^ 8!?HM^eJCww%^ Psh pEr?>tc!ZD' q/@G_ ~d:?ngD9ZG09c,2:~^|ut>Z srJmyNG6oL[(Gd*ω; a,9)zӭ`M+Cm8w {;ֱݻ]ORr.B3$'&2>=_hgʖ~n$f']k^R.{D$We#퇟>Z|x?~L/i L_ c8~@B* ߠ1)|t'Wt6WDPRV'GfYk/̄ɪf%@> 9} 9ddݣ(RS& KxV,% /fTzUwO̍d5,,5)[Q)Ry.]&:Sy6X遢類sC9˻\GMD-d4e3*s_kn֏X%yfW3O}wjmX6ďߵ/ qR .j[?q2u̡xnl}Z6% _;츿|ܣ^H`aXETPDZ*\Q)-*2xZL(4 oE;'wNhB|C8N|DՑ1qMӍ0+QdD` A!F摀F,_|>`ܠ?o .6@7y{ 18%O."ϦhczɃ )9]F͟.l껹ڿg])_ܥ?lECeO|'կ7\cT]K?g5Mesݬ ȷ3% AO/.3K!q??5s$@Ukj_?e?rsUdP[_~ퟀ)Zit?33Qyv& Ύ\[qo_noyѧmľ?^gaA!XeVaj՟*efq!y 1#JSW8E7nd2ð? fbqǮEGK2Iz],];1'[]RvS}j|rg)PK @߃_4݉]M/6<"joA=9e3G0~yă{!XT!3p첞͠0P+V sXCf+JbYEz;4e;jV;>&ՠIjކ$?!}FC _fLM>>$n3Sx=Eރ;<ZHF?ܟGѫ#2wudY4rI@G$э$Vyn~pncDWPSU$Av]}i1$$ĘSZ;bv1^SSUF\r^}`9Jky➃݊{:1q[L3~bcLCh1Sc@k!quâteUMgKQZC @@ݝolnOو4 &?+s , Wq`zIʶk9=R aݩw[$-LZR"9vVQH/ԤdqU~`߬k\m.S<{6ٽDˣ5>ĦPo9(x 4}1Qѓk|N) #gx"64Ϯ`,-SͶ^W>16&Y`]`GI($;)BZS. PZ9r$2%g]BL-\5d$I&/iC:` ZˤL@X&V%TjO gnؑsfvKCŤcbK؏si,}@ Wu!Œ_*q8 $6wE3骬=f(Xݩ'CFzs;(lBh^ܺwzL]T_mR-:6B%"dnNBKwQ}Il9k k)v^E5ټ%cj 2a_(M2&gD`<[itL&c}vϵMH^ɠ6B*(O{eHBy^ZԆXHU1dlj eTez;M -I[eZ  ϓReZ8B-pl}]{[;`NHz ͸$ <@N3y~Z ^}$+@{=EZ9_Oge3C ʂňӫLO*vswaݗWe hp)A;J48j*ܒ\%D*$uCVg R߹m^ֻ951~8 QN$'q;Jh[lAXV[m%ZW;} .Hݸh$1kpmBI1y@"0&Oz1*s,G3(.B9:M|ƃBXiĨ'znYtV 7HS~=߻M3­Cw,Y"JH= kpzVop2K1- |$9 Htdt]unGB|T `` :Bf&ւP0i茮( ,Y@>:AKUVy:L/yh;nS=E{HxJE-pd< ;i$e3}RRErE,>ggPHB:;8cLb.rwv] F} 泟2C DCB]CS橠r6ʰ쇌.>ƨrrSjhR3Uio2Vb@Wo L$@kd D9W$cu)qM5PGϊϰ8$RhI9S[td#R:H%7߇ }F@݃/ʙ6ڒ^f.Jt)¶v]%1H௜?؀y2!Zm=ghoR%#h6= QU}P P馭7hL6*X?&&.6vҘl\s)R.ķ\P-9k ki[t7FV Msw>4*Co.u5vb 1WACfǼyTyj.4:UNYeǔ1Dj;^f;TDAL5׀ԁN2OFMd#gVϲK6IV'Xڣo6Az]dI&1gjP^Y9kZ!լ45 .υ5Eb=4,)#8GxJ^čV:HHA k! 
26ǁԒEG1APwep~û7;7 3hZL<^c7WT7‰g8:hN:ƯcSTh&%kê?sXuwg9JCph3-w9zV܊.-DgRCttV hp7ggrZ>Np*<J6EZIr׫4Y܁`mX-~le%$]۪*nJY |S_N}6Ĝp;$.<:÷_ݼ+\e.WP/Y8y3P(&jj#0eZ bnSKijYKOQK0'\ oSKW"iap^ZsTBA4( Oj+$2=DP+6&d4xJ{\x Ќd}FQVR%6]L5.;ΆhŒZ㈲ ]SUxآe$FQ?um7WZ8$Jdǃ%]ָ'nJ[a8#LL ('cd Mة# Z+f Z݈UY|SA01ǫUz= Rbj j1 ק`ډ8#=,k昋_ڸ ԢMVf"nl4Fy )5tВQdB 4 lJs5.J# J]ZsS ~ ǹ[vK* d$:Y9r6Qѫ2Ӹ*\Z$cNl;:ꠋKuT'A{q\1Qqή䲵vʅS@[Q=г=UG @URz[]e U7ﮎ*y‹Q0ԋ@}SZaYP מ rAH޵5rnhT}:kl^:`#%$ul'7 ù<䈢fCH+B0Ltʸ &5<jO}]TRRf| ?/{N^*?jq_m|;D}`IP-[O;SB#8@S-9Jz :Ul j@fTv#!e۽Q>۶ƣk28wWJ]ur}1YBszovE,՚ԚV[h8{\BqpC3-PV[f̵RH1#^AVXP!eE5wiGggA:e[M9%Qf|] mru~2?}J^fҶ7ϋ`&і6ZQN|9'/1eg[v~55A&?!7b৑aPC>ddap+[~}mMD7[!| T_hjyYmnUNeku+ʮ/ԭx~lGAUQ[vpKF~!g*'8Mzݪ.wck$''gy);Ѓ3 OMG zs$) x`b&8"DW% =!}Ajp/x^_aU%.!L)Iݏ <ߺGNEml&O5iA,VV"kbj祂~g^ґ$j+zX9WZ\/ԗ_8)  _ &h9M

"ʍ(R gxJD lyrāHq~ ~Iwc}Gn}hQhkfcb&RޮZJDv{E\Xܲ7_Mj՗ܬ:[5ʿؚ[gc.гgg+BLGFM~v_%mGhE2͇ΌtHX%d>6ͅ4 Apþ YDaѝ5CL޲2H5M; iByz ) =u0#i 6a#HFhch6hg|1T! Kr&-et p*a_f\;IZTx>a<"<\ovS0 x@i%vU;0uP+jji_zr[nrQ#BmWOVy5zM wvP.+ %J3!;PރٴG$$8 )/g '‚Jp縄nwA=Tl\ k ת@U{jTP'LQ*)|Ԓ!L[3$πiQ6~)k tL=RStvХ&o5(AXij(~kC4z_-t#G?*se.,,bbɢ"ףs.Z}$P8 H )JZծH[ X3g2]L-`bsx2IN_ 3l+fO.:?f e<~iHĊUΐhO/~)Ⱥ#ٯX&&O#NJIM|IewrTm Q(˂=ut[kμ2h"HFP9:sQ ~uD8“hg|!LA 65MOM)EzS{S$6Ԑ|\ru8Q>oezNl>d(f_TWbKz\1M5X@ .8!`}|;YwY&V,a(+V+ #VƝ``5FӸdH+F1}؄?#xKgyWW{GWylĥ֨Էq"+i\̴>N՟H^~̷kDUc1VBmGtk }Nڄo,%UB֗wQ%<%zNP--Ycz^lL=:JRY?_٬o>G DEΦ;=#kΉާlJ8lBsMe%qwvsW= ve$Pv˵z Tuݼ]SJFRW=L`H4O HQk"z݇[az!ћ 9<-zS鱢˳KH'N\.{7+şäXǍϾ X]K%x潻H/(@l Ssj깍iV:g=\(fTr{uunMu3uu;1[.L Sќ~uV kclZTX[c΁\߭ڜd68m3_b6yVfc}AYT t\%ʪ uWih;Zfa[ RG XX<)n쎏ˍ t˲^ˊdȼW5?efw?z枇 :$} zd!lDY%m;@뤕ApRn1׉Pϼ?I$6uqUTk)y6=D){q)55R -V5s^KK܊`)|Y0Ɯ&QgAX cJmD4iG %خxW<4$mZu90FA1rx wOgNU Vӂ{ogi=a(0) I#+%.7[Rk{ ;Rř BDk(<BdJ8-+$B ObQ@@0Qs84A`hQjͫ-zNLԉ!VF^@qX&Cj6.7P ikD &4cm,OHZȍH*D q $Z$D5XfGm41Aغ'.2'Q**t>RU񨟩 ~x a3g &BKy-кø兡6H5-)S]@γ_t0", UYXq4=/Tj;\wg$Z@UmGR&, T$>٫zB[du_ElfZu\<٥Zp4wDA=V G@$Ti>hZ=H,ӛJRdPNR G8>LP0*n*.ŕ4ZEaJ4Tz/HQh,rWXCAp- ,M;\/W\ym .˳Փibگ_?;>;4)|yJV=9Fg )ӆ;S}P*_`@NB/YF{#H( ŞZ*LtS j"KwU{k:՛FPcf{'R8jt@A8)t0 -Q^8 ,(BZbь?{6/ӔqpGx5ݽ/DܒLfk/@6%QH mZ HRJ؋.X#LܙF1(0F@oZu* 8b*`ǰeL!BBiq#%Ddt(pN9#*Wըi6[bD< h%hx}ҵ3G(\YAT#$T{(jӎ@ͣe.hU_i>RApͣ{Qr4ɨhhJȳ?R8;T?`CA( 4X:2T:7E8D<_ \7h) gWjsnm̐W>E) (r)Scm\ps̔c6S!25S8$D ws (y99Lq"pB$ݜW:'1lz+)i  ? rt. 
Mar 20 13:21:18 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 13:21:18 crc restorecon[4687]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20
13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 13:21:18 crc 
restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 13:21:18 crc 
restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818 Mar 20 
13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c168,c522 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 
13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 
13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:21:18 crc 
restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc 
restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 
20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc 
restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc 
restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 
20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:18 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:18 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc 
restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc 
restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 
crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 13:21:19 crc restorecon[4687]: 
/var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 13:21:19 crc restorecon[4687]: 
/var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 20 13:21:19 crc restorecon[4687]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 13:21:19 crc restorecon[4687]: /var/lib/kubelet/plugins_registry not reset as customized 
by admin to system_u:object_r:container_file_t:s0 Mar 20 13:21:19 crc restorecon[4687]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 13:21:19 crc kubenswrapper[4973]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 13:21:19 crc kubenswrapper[4973]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 13:21:19 crc kubenswrapper[4973]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 13:21:19 crc kubenswrapper[4973]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 13:21:19 crc kubenswrapper[4973]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 13:21:19 crc kubenswrapper[4973]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.691464 4973 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.703772 4973 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704669 4973 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704697 4973 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704708 4973 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704716 4973 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704729 4973 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704740 4973 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704748 4973 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704756 4973 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704764 4973 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704772 4973 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704780 4973 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704788 4973 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704795 4973 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704804 4973 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704812 4973 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704820 4973 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704828 4973 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704836 4973 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704844 4973 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704851 4973 feature_gate.go:330] 
unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704859 4973 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704868 4973 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704875 4973 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704883 4973 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704891 4973 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704899 4973 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704906 4973 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704914 4973 feature_gate.go:330] unrecognized feature gate: Example Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704923 4973 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704931 4973 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704939 4973 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704947 4973 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704954 4973 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704962 4973 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 13:21:19 crc 
kubenswrapper[4973]: W0320 13:21:19.704969 4973 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704979 4973 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704987 4973 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.704994 4973 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705002 4973 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705009 4973 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705017 4973 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705024 4973 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705032 4973 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705039 4973 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705047 4973 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705055 4973 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705063 4973 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705070 4973 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705078 4973 feature_gate.go:330] 
unrecognized feature gate: MinimumKubeletVersion Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705086 4973 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705093 4973 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705101 4973 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705108 4973 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705116 4973 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705124 4973 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705134 4973 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705146 4973 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705154 4973 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705163 4973 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705171 4973 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705180 4973 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705188 4973 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705200 4973 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705209 4973 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705218 4973 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705226 4973 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705234 4973 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705244 4973 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705252 4973 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.705261 4973 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706231 4973 flags.go:64] FLAG: --address="0.0.0.0"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706259 4973 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706273 4973 flags.go:64] FLAG: --anonymous-auth="true"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706285 4973 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706296 4973 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706305 4973 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706317 4973 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706328 4973 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706365 4973 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706374 4973 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706384 4973 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706397 4973 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706407 4973 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706416 4973 flags.go:64] FLAG: --cgroup-root=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706424 4973 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706434 4973 flags.go:64] FLAG: --client-ca-file=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706443 4973 flags.go:64] FLAG: --cloud-config=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706452 4973 flags.go:64] FLAG: --cloud-provider=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706461 4973 flags.go:64] FLAG: --cluster-dns="[]"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706472 4973 flags.go:64] FLAG: --cluster-domain=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706481 4973 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706490 4973 flags.go:64] FLAG: --config-dir=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706498 4973 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706508 4973 flags.go:64] FLAG: --container-log-max-files="5"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706519 4973 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706528 4973 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706537 4973 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706546 4973 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706556 4973 flags.go:64] FLAG: --contention-profiling="false"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706565 4973 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706574 4973 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706583 4973 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706592 4973 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706603 4973 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706612 4973 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706621 4973 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706629 4973 flags.go:64] FLAG: --enable-load-reader="false"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706638 4973 flags.go:64] FLAG: --enable-server="true"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706647 4973 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706658 4973 flags.go:64] FLAG: --event-burst="100"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706667 4973 flags.go:64] FLAG: --event-qps="50"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706676 4973 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706684 4973 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706693 4973 flags.go:64] FLAG: --eviction-hard=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706704 4973 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706714 4973 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706723 4973 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706736 4973 flags.go:64] FLAG: --eviction-soft=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706744 4973 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706754 4973 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706763 4973 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706772 4973 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706780 4973 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706789 4973 flags.go:64] FLAG: --fail-swap-on="true"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706798 4973 flags.go:64] FLAG: --feature-gates=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706820 4973 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706829 4973 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706838 4973 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706847 4973 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706857 4973 flags.go:64] FLAG: --healthz-port="10248"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706866 4973 flags.go:64] FLAG: --help="false"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706875 4973 flags.go:64] FLAG: --hostname-override=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706884 4973 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706893 4973 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706902 4973 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706911 4973 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706920 4973 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706929 4973 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706937 4973 flags.go:64] FLAG: --image-service-endpoint=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706946 4973 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706955 4973 flags.go:64] FLAG: --kube-api-burst="100"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706964 4973 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706973 4973 flags.go:64] FLAG: --kube-api-qps="50"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706982 4973 flags.go:64] FLAG: --kube-reserved=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706991 4973 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.706999 4973 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707009 4973 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707017 4973 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707026 4973 flags.go:64] FLAG: --lock-file=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707037 4973 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707047 4973 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707055 4973 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707068 4973 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707079 4973 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707087 4973 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707097 4973 flags.go:64] FLAG: --logging-format="text"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707105 4973 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707115 4973 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707124 4973 flags.go:64] FLAG: --manifest-url=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707133 4973 flags.go:64] FLAG: --manifest-url-header=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707144 4973 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707154 4973 flags.go:64] FLAG: --max-open-files="1000000"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707165 4973 flags.go:64] FLAG: --max-pods="110"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707174 4973 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707183 4973 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707192 4973 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707201 4973 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707210 4973 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707219 4973 flags.go:64] FLAG: --node-ip="192.168.126.11"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707228 4973 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707248 4973 flags.go:64] FLAG: --node-status-max-images="50"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707257 4973 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707267 4973 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707277 4973 flags.go:64] FLAG: --pod-cidr=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707286 4973 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707299 4973 flags.go:64] FLAG: --pod-manifest-path=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707308 4973 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707317 4973 flags.go:64] FLAG: --pods-per-core="0"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707326 4973 flags.go:64] FLAG: --port="10250"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707356 4973 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707366 4973 flags.go:64] FLAG: --provider-id=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707375 4973 flags.go:64] FLAG: --qos-reserved=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707384 4973 flags.go:64] FLAG: --read-only-port="10255"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707393 4973 flags.go:64] FLAG: --register-node="true"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707402 4973 flags.go:64] FLAG: --register-schedulable="true"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707410 4973 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707425 4973 flags.go:64] FLAG: --registry-burst="10"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707433 4973 flags.go:64] FLAG: --registry-qps="5"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707442 4973 flags.go:64] FLAG: --reserved-cpus=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707452 4973 flags.go:64] FLAG: --reserved-memory=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707463 4973 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707473 4973 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707482 4973 flags.go:64] FLAG: --rotate-certificates="false"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707490 4973 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707499 4973 flags.go:64] FLAG: --runonce="false"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707508 4973 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707518 4973 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707527 4973 flags.go:64] FLAG: --seccomp-default="false"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707536 4973 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707545 4973 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707554 4973 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707563 4973 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707572 4973 flags.go:64] FLAG: --storage-driver-password="root"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707581 4973 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707590 4973 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707599 4973 flags.go:64] FLAG: --storage-driver-user="root"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707607 4973 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707617 4973 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707626 4973 flags.go:64] FLAG: --system-cgroups=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707635 4973 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707649 4973 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707657 4973 flags.go:64] FLAG: --tls-cert-file=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707666 4973 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707678 4973 flags.go:64] FLAG: --tls-min-version=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707687 4973 flags.go:64] FLAG: --tls-private-key-file=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707696 4973 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707705 4973 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707714 4973 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707723 4973 flags.go:64] FLAG: --v="2"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707743 4973 flags.go:64] FLAG: --version="false"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707753 4973 flags.go:64] FLAG: --vmodule=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707764 4973 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.707773 4973 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708010 4973 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708021 4973 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708032 4973 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708041 4973 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708049 4973 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708058 4973 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708066 4973 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708073 4973 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708081 4973 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708089 4973 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708097 4973 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708104 4973 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708112 4973 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708120 4973 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708128 4973 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708136 4973 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708143 4973 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708151 4973 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708159 4973 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708166 4973 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708175 4973 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708183 4973 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708192 4973 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708201 4973 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708209 4973 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708216 4973 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708224 4973 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708232 4973 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708239 4973 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708248 4973 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708256 4973 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708264 4973 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708272 4973 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708279 4973 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708287 4973 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708295 4973 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708303 4973 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708310 4973 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708319 4973 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708329 4973 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708362 4973 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708372 4973 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708380 4973 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708389 4973 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708398 4973 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708406 4973 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708414 4973 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708423 4973 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708431 4973 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708438 4973 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708449 4973 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708458 4973 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708467 4973 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708475 4973 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708484 4973 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708492 4973 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708499 4973 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708508 4973 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708515 4973 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708523 4973 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708530 4973 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708538 4973 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708546 4973 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708554 4973 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708564 4973 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708573 4973 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708583 4973 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708593 4973 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708603 4973 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708612 4973 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.708619 4973 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.708633 4973 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.721522 4973 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.722068 4973 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722211 4973 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722233 4973 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722242 4973 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722252 4973 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722262 4973 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722270 4973 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722281 4973 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722293 4973 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722303 4973 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722312 4973 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722322 4973 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722332 4973 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722396 4973 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722411 4973 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722421 4973 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722429 4973 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722438 4973 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722445 4973 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722453 4973 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722464 4973 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722474 4973 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722483 4973 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722491 4973 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722499 4973 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722507 4973 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722515 4973 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722523 4973 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722530 4973 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722538 4973 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722545 4973 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722553 4973 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722562 4973 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722569 4973 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722577 4973 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722587 4973 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722596 4973 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722604 4973 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722612 4973 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722619 4973 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722629 4973 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722640 4973 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722649 4973 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722661 4973 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722680 4973 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722697 4973 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722709 4973 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722721 4973 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722730 4973 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722738 4973 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722746 4973 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722754 4973 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722762 4973 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722769 4973 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722777 4973 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722785 4973 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722793 4973 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722801 4973 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722809 4973 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722817 4973 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722824 4973 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722832 4973 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722840 4973 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722848 4973 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722855 4973 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722863 4973 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722874 4973 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722882 4973 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722890 4973 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722898 4973 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722905 4973 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.722918 4973 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.722932 4973 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723157 4973 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723169 4973 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723178 4973 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723186 4973 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723196 4973 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723204 4973 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723212 4973 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723220 4973 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723228 4973 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723236 4973 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723244 4973 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723252 4973 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723261 4973 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723270 4973 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723280 4973 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723290 4973 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723299 4973 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723310 4973 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723319 4973 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723328 4973 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723377 4973 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723397 4973 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723409 4973 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723420 4973 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723430 4973 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723442 4973 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723452 4973 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723461 4973 feature_gate.go:330] unrecognized feature gate: Example
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723471 4973 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723480 4973 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723490 4973 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723500 4973 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723509 4973 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723516 4973 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723527 4973 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723535 4973 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723543 4973 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723551 4973 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723558 4973 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723566 4973 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723574 4973 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723582 4973 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723591 4973 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723600 4973 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723611 4973 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723621 4973 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723631 4973 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723640 4973 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723650 4973 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.723659 4973 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724595 4973 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724619 4973 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724636 4973 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724649 4973 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724660 4973 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724669 4973 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724678 4973 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724687 4973 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724695 4973 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724715 4973 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724724 4973 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724732 4973 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724740 4973 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724748 4973 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724759 4973 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724769 4973 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724777 4973 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724786 4973 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724795 4973 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724803 4973 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.724823 4973 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.724836 4973 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.725178 4973 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 20 13:21:19 crc kubenswrapper[4973]: E0320 13:21:19.730366 4973 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.734804 4973 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.734938 4973 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.736925 4973 server.go:997] "Starting client certificate rotation"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.736972 4973 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.737180 4973 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.765041 4973 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 13:21:19 crc kubenswrapper[4973]: E0320 13:21:19.765576 4973 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.767841 4973 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.786464 4973 log.go:25] "Validated CRI v1 runtime API"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.829548 4973 log.go:25] "Validated CRI v1 image API"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.831780 4973 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.839489 4973 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-13-16-29-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3]
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.839542 4973 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}]
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.860726 4973 manager.go:217] Machine: {Timestamp:2026-03-20 13:21:19.85725174 +0000 UTC m=+0.600921554 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:bbb3e27f-a5bd-49db-8577-6a161b4912bb BootID:b12408c6-1427-4b07-880e-3523bdf11c11 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:19:ea:5e Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:19:ea:5e Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:a6:a8:08 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:be:4c:a6 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:15:ea:76 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:97:23:f8 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:e2:88:bb:66:35:7b Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:72:d1:a4:98:57:0e Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.861111 4973 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.861438 4973 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.861870 4973 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.862187 4973 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.862247 4973 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.862652 4973 topology_manager.go:138] "Creating topology manager with none policy"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.862670 4973 container_manager_linux.go:303] "Creating device plugin manager"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.863233 4973 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.863285 4973 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.864741 4973 state_mem.go:36] "Initialized new in-memory state store"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.864886 4973 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.868542 4973 kubelet.go:418] "Attempting to sync node with API server"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.868588 4973 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.868699 4973 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.868722 4973 kubelet.go:324] "Adding apiserver pod source"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.868741 4973 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.874001 4973 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1"
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.874432 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Mar 20 13:21:19 crc kubenswrapper[4973]: E0320 13:21:19.874513 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.875319 4973 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.875363 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Mar 20 13:21:19 crc kubenswrapper[4973]: E0320 13:21:19.875463 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.878684 4973 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.880875 4973 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.880929 4973 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.880945 4973 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.880958 4973 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.880979 4973 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.880995 4973 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.881010 4973 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.881194 4973 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.881227 4973 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.881245 4973 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.881269 4973 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.881295 4973 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.883784 4973 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.884529 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.884716 4973 server.go:1280] "Started kubelet"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.884878 4973 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.884981 4973 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.885783 4973 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.889964 4973 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.890008 4973 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.890206 4973 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.890249 4973 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.890883 4973 desired_state_of_world_populator.go:146] "Desired state populator starts to run"
Mar 20 13:21:19 crc systemd[1]: Started Kubernetes Kubelet.
Mar 20 13:21:19 crc kubenswrapper[4973]: E0320 13:21:19.891668 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.891674 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Mar 20 13:21:19 crc kubenswrapper[4973]: E0320 13:21:19.891616 4973 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="200ms"
Mar 20 13:21:19 crc kubenswrapper[4973]: E0320 13:21:19.891757 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.895189 4973 factory.go:55] Registering systemd factory
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.895226 4973 factory.go:221] Registration of the systemd container factory successfully
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.895609 4973 factory.go:153] Registering CRI-O factory
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.895884 4973 factory.go:221] Registration of the crio container factory successfully
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.896200 4973 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.896256 4973 factory.go:103] Registering Raw factory
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.896287 4973 manager.go:1196] Started watching for new ooms in manager
Mar 20 13:21:19 crc kubenswrapper[4973]: E0320 13:21:19.897218 4973 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e8f50ac7e2e59 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.884660313 +0000 UTC m=+0.628330097,LastTimestamp:2026-03-20 13:21:19.884660313 +0000 UTC m=+0.628330097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.899381 4973 manager.go:319] Starting recovery of all containers
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.899784 4973 server.go:460] "Adding debug handlers to kubelet server"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.906760 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.907080 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.907177 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.907258 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.907369 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.907465 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.907544 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.907623 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.907702 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.907793 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.907880 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.907958 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.908032 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" 
seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.908111 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.908198 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.908295 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.908457 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.908556 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.908634 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.908742 
4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.908832 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.908941 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.909044 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.909172 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.909289 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.909456 4973 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.909573 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.909690 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.909796 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.909927 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.910047 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.910150 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.910246 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.910503 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.910632 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.910749 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.910862 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.910972 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.911077 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.911195 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.911314 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.911473 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.911588 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.911700 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.911813 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.911931 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.912055 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.912148 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.912232 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.912317 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 20 13:21:19 
crc kubenswrapper[4973]: I0320 13:21:19.912428 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.912519 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.912604 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.912686 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.912762 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.912838 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.912913 4973 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.913078 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.913162 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.913240 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.913317 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.913420 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.913510 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.913594 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.913671 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.913746 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.913823 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.913930 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.914016 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.914098 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.914182 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.914280 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.914429 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.914545 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.914634 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" 
seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.914716 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.914792 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.914870 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.914953 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.915031 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.915113 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 20 13:21:19 crc kubenswrapper[4973]: 
I0320 13:21:19.915193 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.915276 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.915381 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.915465 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.915540 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.915615 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.915690 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.915779 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.915866 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.915957 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916041 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916194 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916246 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916265 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916280 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916294 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916308 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916320 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916330 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916370 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916386 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916400 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916413 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916435 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916450 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916465 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916477 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916490 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916502 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916515 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916527 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916538 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916548 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916559 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916569 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916579 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916589 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916600 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916611 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916620 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916630 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916641 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916654 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916667 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916710 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916727 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916741 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916754 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916766 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916777 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.916791 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918322 4973 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918371 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918389 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918406 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918422 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918436 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918449 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918463 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918478 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918493 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918509 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918521 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918534 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918544 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918553 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918587 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918597 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918606 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918616 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918625 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918634 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918644 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918653 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918663 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918673 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918682 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918854 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918875 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918903 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918918 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918932 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918953 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918966 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.918995 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919011 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919024 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919045 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919066 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919084 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919098 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919111 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919131 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919146 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919164 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919177 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919189 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919202 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919216 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919235 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919249 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919265 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919286 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919302 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919325 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919370 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919384 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919553 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919586 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919612 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919655 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919679 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919717 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919744 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919768 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919804 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919828 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919864 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919887 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919910 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919945 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.919972 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.920005 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.920027 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.920048 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.920083 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.920105 4973 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext=""
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.920126 4973 reconstruct.go:97] "Volume reconstruction finished"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.920139 4973 reconciler.go:26] "Reconciler: start to sync state"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.924250 4973 manager.go:324] Recovery completed
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.938520 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.940510 4973 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.940559 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.940576 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.941917 4973 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.941949 4973 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.941971 4973 state_mem.go:36] "Initialized new in-memory state store" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.947749 4973 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.949223 4973 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.949262 4973 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.949296 4973 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 13:21:19 crc kubenswrapper[4973]: E0320 13:21:19.949742 4973 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 13:21:19 crc kubenswrapper[4973]: W0320 13:21:19.951548 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:21:19 crc kubenswrapper[4973]: E0320 13:21:19.951626 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.959236 4973 policy_none.go:49] "None policy: Start" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.960202 4973 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 13:21:19 crc kubenswrapper[4973]: I0320 13:21:19.960239 4973 state_mem.go:35] "Initializing new in-memory state store" Mar 20 13:21:19 crc kubenswrapper[4973]: E0320 13:21:19.991932 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.015470 4973 manager.go:334] "Starting Device Plugin manager" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.015528 4973 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.015544 4973 server.go:79] "Starting device plugin registration server" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.016049 4973 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.016075 4973 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.016662 4973 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.016819 4973 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.016838 4973 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 13:21:20 crc kubenswrapper[4973]: E0320 13:21:20.025451 4973 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.050720 4973 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.050890 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.052035 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.052082 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.052092 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.052263 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.052620 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.052677 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.053112 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.053190 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.053211 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.053477 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.053495 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.053654 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.053706 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.053813 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.053899 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.054564 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.054590 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.054602 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.054669 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.054688 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.054698 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.054701 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.054833 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.054859 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.055475 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.055530 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.055545 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.056468 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.056550 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.056561 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.056692 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.056880 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.056929 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.057699 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.057717 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.057724 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.057856 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.057883 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.058188 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.058204 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.058211 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.058681 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.058698 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.058705 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:20 crc kubenswrapper[4973]: E0320 13:21:20.092411 4973 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="400ms" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.116418 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.119947 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.119996 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.120008 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.120037 4973 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:21:20 crc kubenswrapper[4973]: E0320 13:21:20.120589 4973 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.122114 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:20 crc 
kubenswrapper[4973]: I0320 13:21:20.122164 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.122187 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.122208 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.122243 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.122258 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.122276 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.122308 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.122330 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.122365 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.122381 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.122399 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.122430 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.122448 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.122469 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223323 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223406 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223443 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223477 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223507 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223531 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223553 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: 
I0320 13:21:20.223574 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223598 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223619 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223641 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223644 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223657 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod 
\"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223723 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223665 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223767 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223801 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223761 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" 
Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223754 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223810 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223877 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223777 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223809 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.224008 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223780 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223807 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223801 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.223972 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.224090 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 
20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.224141 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.321532 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.322921 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.322986 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.323003 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.323034 4973 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:21:20 crc kubenswrapper[4973]: E0320 13:21:20.323640 4973 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.378909 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.384623 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.404591 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.418837 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: W0320 13:21:20.425954 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-33eedacc12b7e238dd2c7c102eab59957fb1baf4f7f0c600736e3bd5f122198d WatchSource:0}: Error finding container 33eedacc12b7e238dd2c7c102eab59957fb1baf4f7f0c600736e3bd5f122198d: Status 404 returned error can't find the container with id 33eedacc12b7e238dd2c7c102eab59957fb1baf4f7f0c600736e3bd5f122198d Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.428774 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 13:21:20 crc kubenswrapper[4973]: W0320 13:21:20.429844 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-34ab0240095ca0e61d1dad3d9f8c514ea7e741e8cb71cf56f4339f3967ed74d3 WatchSource:0}: Error finding container 34ab0240095ca0e61d1dad3d9f8c514ea7e741e8cb71cf56f4339f3967ed74d3: Status 404 returned error can't find the container with id 34ab0240095ca0e61d1dad3d9f8c514ea7e741e8cb71cf56f4339f3967ed74d3 Mar 20 13:21:20 crc kubenswrapper[4973]: W0320 13:21:20.431988 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-e50098ff4a5f4bf6d3f717da8bfaf4d3a1de5e9e51562db4d60b0b0694aae1e9 WatchSource:0}: Error finding container e50098ff4a5f4bf6d3f717da8bfaf4d3a1de5e9e51562db4d60b0b0694aae1e9: Status 404 returned 
error can't find the container with id e50098ff4a5f4bf6d3f717da8bfaf4d3a1de5e9e51562db4d60b0b0694aae1e9 Mar 20 13:21:20 crc kubenswrapper[4973]: W0320 13:21:20.437886 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-befcb56ddd5d7c61e8ef34038117af787c64db1667d03f3fb8d59c69a80f5c4a WatchSource:0}: Error finding container befcb56ddd5d7c61e8ef34038117af787c64db1667d03f3fb8d59c69a80f5c4a: Status 404 returned error can't find the container with id befcb56ddd5d7c61e8ef34038117af787c64db1667d03f3fb8d59c69a80f5c4a Mar 20 13:21:20 crc kubenswrapper[4973]: W0320 13:21:20.450727 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-08f40b2a5dfb43d7f8601931683cad899e99e84642dbe4150ca33deb87364fa9 WatchSource:0}: Error finding container 08f40b2a5dfb43d7f8601931683cad899e99e84642dbe4150ca33deb87364fa9: Status 404 returned error can't find the container with id 08f40b2a5dfb43d7f8601931683cad899e99e84642dbe4150ca33deb87364fa9 Mar 20 13:21:20 crc kubenswrapper[4973]: E0320 13:21:20.494309 4973 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="800ms" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.724717 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.726423 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.726455 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.726464 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.726486 4973 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:21:20 crc kubenswrapper[4973]: E0320 13:21:20.726984 4973 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Mar 20 13:21:20 crc kubenswrapper[4973]: W0320 13:21:20.806046 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:21:20 crc kubenswrapper[4973]: E0320 13:21:20.806154 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.885440 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.952774 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"08f40b2a5dfb43d7f8601931683cad899e99e84642dbe4150ca33deb87364fa9"} Mar 20 13:21:20 crc 
kubenswrapper[4973]: I0320 13:21:20.953789 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"befcb56ddd5d7c61e8ef34038117af787c64db1667d03f3fb8d59c69a80f5c4a"} Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.954713 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e50098ff4a5f4bf6d3f717da8bfaf4d3a1de5e9e51562db4d60b0b0694aae1e9"} Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.955434 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"34ab0240095ca0e61d1dad3d9f8c514ea7e741e8cb71cf56f4339f3967ed74d3"} Mar 20 13:21:20 crc kubenswrapper[4973]: I0320 13:21:20.956494 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"33eedacc12b7e238dd2c7c102eab59957fb1baf4f7f0c600736e3bd5f122198d"} Mar 20 13:21:20 crc kubenswrapper[4973]: W0320 13:21:20.989911 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:21:20 crc kubenswrapper[4973]: E0320 13:21:20.990012 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" 
logger="UnhandledError" Mar 20 13:21:21 crc kubenswrapper[4973]: W0320 13:21:21.217230 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:21:21 crc kubenswrapper[4973]: E0320 13:21:21.217434 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:21:21 crc kubenswrapper[4973]: E0320 13:21:21.295364 4973 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="1.6s" Mar 20 13:21:21 crc kubenswrapper[4973]: W0320 13:21:21.390798 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:21:21 crc kubenswrapper[4973]: E0320 13:21:21.390909 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.527300 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:21 crc kubenswrapper[4973]: 
I0320 13:21:21.528905 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.528942 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.528954 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.528976 4973 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:21:21 crc kubenswrapper[4973]: E0320 13:21:21.529431 4973 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.841588 4973 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:21:21 crc kubenswrapper[4973]: E0320 13:21:21.849051 4973 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.885664 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.960934 4973 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989" exitCode=0 Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.961032 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989"} Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.961092 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.962034 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.962067 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.962080 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.962500 4973 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9" exitCode=0 Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.962560 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9"} Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.962681 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.963602 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:21 crc 
kubenswrapper[4973]: I0320 13:21:21.963639 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.963651 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.963666 4973 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003" exitCode=0 Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.963710 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.963720 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003"} Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.964399 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.965204 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.965232 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.965242 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.965245 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.965260 4973 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.965384 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.965688 4973 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="58875c22f02f77b2e02162cc91ff1ccf330bae82634752c372e67bdfb78e06ae" exitCode=0 Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.965734 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"58875c22f02f77b2e02162cc91ff1ccf330bae82634752c372e67bdfb78e06ae"} Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.965794 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.966528 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.966563 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.966574 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.968892 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"49fb5f33c67890448955d4851c677b66d07065ef5711938d3c96aae65082166d"} Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.968916 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"49856b1129ff41ad2951ef95db8bc9cc9faff376293559651f05b9905fb3c11c"} Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.968930 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0da83b4a45a3b65af0d17acc529fb5991668c1603f5f6b88b1476f579ad029e0"} Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.968941 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"787de60574f16b2026ae38a99d60b46b6329107ce49531d094fb400a8b010e67"} Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.968959 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.969971 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.970013 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:21 crc kubenswrapper[4973]: I0320 13:21:21.970025 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:22 crc kubenswrapper[4973]: E0320 13:21:22.415075 4973 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e8f50ac7e2e59 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.884660313 +0000 UTC m=+0.628330097,LastTimestamp:2026-03-20 13:21:19.884660313 +0000 UTC m=+0.628330097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.886082 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:21:22 crc kubenswrapper[4973]: E0320 13:21:22.896448 4973 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="3.2s" Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.972899 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b73942ac042b1d677693f81c2a365d5c6922c4843073c5754aee0c6007a87814"} Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.972940 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0b6a3b9c865bf4d2dbabc06a94edba659a61fe90590cb2ced9f0284d6b204bd1"} Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.972950 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2599b537b19ffc88d8144a6613b7812f3ab5e582d3ae0706db4919c06465c1e1"} Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.973032 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.973717 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.973741 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.973749 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.978544 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a"} Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.978574 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168"} Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.978587 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b"} Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.978611 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d"} Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.979663 4973 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e" exitCode=0 Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.979719 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e"} Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.979814 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.981699 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.981721 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.981729 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.988128 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.988532 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.988565 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da"} Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.989207 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.989230 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.989239 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.989637 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.989655 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:22 crc kubenswrapper[4973]: I0320 13:21:22.989663 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.130399 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.131662 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.131689 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.131698 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.131720 4973 kubelet_node_status.go:76] "Attempting to register node" node="crc" 
Mar 20 13:21:23 crc kubenswrapper[4973]: E0320 13:21:23.132077 4973 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.75:6443: connect: connection refused" node="crc"
Mar 20 13:21:23 crc kubenswrapper[4973]: W0320 13:21:23.143063 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Mar 20 13:21:23 crc kubenswrapper[4973]: E0320 13:21:23.143133 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:21:23 crc kubenswrapper[4973]: W0320 13:21:23.364689 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.75:6443: connect: connection refused
Mar 20 13:21:23 crc kubenswrapper[4973]: E0320 13:21:23.364774 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.75:6443: connect: connection refused" logger="UnhandledError"
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.993896 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4857f3b032270079a55b8896c07f96b3a2381dc6d3af74c97bc29a68bc46c3e2"}
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.994030 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.994821 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.994843 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.994853 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.996974 4973 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3" exitCode=0
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.996998 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3"}
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.997067 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.997109 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.997149 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.997188 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.997701 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.997749 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.997766 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.998111 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.998141 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.998139 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.998153 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.998174 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:23 crc kubenswrapper[4973]: I0320 13:21:23.998188 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:25 crc kubenswrapper[4973]: I0320 13:21:25.003301 4973 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 13:21:25 crc kubenswrapper[4973]: I0320 13:21:25.003371 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:25 crc kubenswrapper[4973]: I0320 13:21:25.003730 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ed276b5c92b41fc0f7761442a51338cd71a913e99cc605598fc379cbc1451651"}
Mar 20 13:21:25 crc kubenswrapper[4973]: I0320 13:21:25.003788 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"15cc75da411a896570582ba77693c099b590adc36f8a41d5f43e0b94c5d6e97d"}
Mar 20 13:21:25 crc kubenswrapper[4973]: I0320 13:21:25.003802 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f147336d21c520b769cc2412032e87f34909104d9249047521ad2e72f489564b"}
Mar 20 13:21:25 crc kubenswrapper[4973]: I0320 13:21:25.003814 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3dba5b3c826c4891871a963069a09e7b3dceac4295aa4173204d50da0a23b2e3"}
Mar 20 13:21:25 crc kubenswrapper[4973]: I0320 13:21:25.003981 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:25 crc kubenswrapper[4973]: I0320 13:21:25.004097 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:25 crc kubenswrapper[4973]: I0320 13:21:25.004117 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:25 crc kubenswrapper[4973]: I0320 13:21:25.004127 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:25 crc kubenswrapper[4973]: I0320 13:21:25.004883 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:25 crc kubenswrapper[4973]: I0320 13:21:25.004909 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:25 crc kubenswrapper[4973]: I0320 13:21:25.004920 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:25 crc kubenswrapper[4973]: I0320 13:21:25.301948 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:21:25 crc kubenswrapper[4973]: I0320 13:21:25.831550 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:21:25 crc kubenswrapper[4973]: I0320 13:21:25.860053 4973 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.012732 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ab4c87775e72595ceaeaf9ec3c2f6136b579c1f0b2f7b1542525e14da6b424ce"}
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.012800 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.012800 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.013727 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.013758 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.013767 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.013988 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.014022 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.014035 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.104995 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.332492 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.334136 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.334194 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.334216 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.334258 4973 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.929477 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.929691 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.930909 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.930940 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:26 crc kubenswrapper[4973]: I0320 13:21:26.930952 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:27 crc kubenswrapper[4973]: I0320 13:21:27.015164 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:27 crc kubenswrapper[4973]: I0320 13:21:27.015371 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:27 crc kubenswrapper[4973]: I0320 13:21:27.016250 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:27 crc kubenswrapper[4973]: I0320 13:21:27.016319 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:27 crc kubenswrapper[4973]: I0320 13:21:27.016370 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:27 crc kubenswrapper[4973]: I0320 13:21:27.016556 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:27 crc kubenswrapper[4973]: I0320 13:21:27.016608 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:27 crc kubenswrapper[4973]: I0320 13:21:27.016626 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:27 crc kubenswrapper[4973]: I0320 13:21:27.591549 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Mar 20 13:21:27 crc kubenswrapper[4973]: I0320 13:21:27.600045 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:21:27 crc kubenswrapper[4973]: I0320 13:21:27.600478 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:27 crc kubenswrapper[4973]: I0320 13:21:27.602419 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:27 crc kubenswrapper[4973]: I0320 13:21:27.602511 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:27 crc kubenswrapper[4973]: I0320 13:21:27.602538 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:28 crc kubenswrapper[4973]: I0320 13:21:28.017911 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:28 crc kubenswrapper[4973]: I0320 13:21:28.018094 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:28 crc kubenswrapper[4973]: I0320 13:21:28.019122 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:28 crc kubenswrapper[4973]: I0320 13:21:28.019212 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:28 crc kubenswrapper[4973]: I0320 13:21:28.019271 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:28 crc kubenswrapper[4973]: I0320 13:21:28.019702 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:28 crc kubenswrapper[4973]: I0320 13:21:28.019778 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:28 crc kubenswrapper[4973]: I0320 13:21:28.019807 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:29 crc kubenswrapper[4973]: I0320 13:21:29.918646 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:21:29 crc kubenswrapper[4973]: I0320 13:21:29.918897 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:29 crc kubenswrapper[4973]: I0320 13:21:29.921647 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:29 crc kubenswrapper[4973]: I0320 13:21:29.921777 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:29 crc kubenswrapper[4973]: I0320 13:21:29.921799 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:29 crc kubenswrapper[4973]: I0320 13:21:29.928071 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:21:30 crc kubenswrapper[4973]: I0320 13:21:30.011130 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Mar 20 13:21:30 crc kubenswrapper[4973]: I0320 13:21:30.011435 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:30 crc kubenswrapper[4973]: I0320 13:21:30.013172 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:30 crc kubenswrapper[4973]: I0320 13:21:30.013239 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:30 crc kubenswrapper[4973]: I0320 13:21:30.013251 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:30 crc kubenswrapper[4973]: I0320 13:21:30.018130 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:21:30 crc kubenswrapper[4973]: I0320 13:21:30.022861 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:30 crc kubenswrapper[4973]: I0320 13:21:30.025611 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:30 crc kubenswrapper[4973]: I0320 13:21:30.025640 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:30 crc kubenswrapper[4973]: I0320 13:21:30.025653 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:30 crc kubenswrapper[4973]: E0320 13:21:30.025683 4973 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 13:21:30 crc kubenswrapper[4973]: I0320 13:21:30.600580 4973 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 13:21:30 crc kubenswrapper[4973]: I0320 13:21:30.600742 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 13:21:31 crc kubenswrapper[4973]: I0320 13:21:31.025490 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:31 crc kubenswrapper[4973]: I0320 13:21:31.027153 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:31 crc kubenswrapper[4973]: I0320 13:21:31.027241 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:31 crc kubenswrapper[4973]: I0320 13:21:31.027263 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:31 crc kubenswrapper[4973]: I0320 13:21:31.032235 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:21:32 crc kubenswrapper[4973]: I0320 13:21:32.028884 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:32 crc kubenswrapper[4973]: I0320 13:21:32.029886 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:32 crc kubenswrapper[4973]: I0320 13:21:32.029931 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:32 crc kubenswrapper[4973]: I0320 13:21:32.029945 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:33 crc kubenswrapper[4973]: W0320 13:21:33.702773 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 20 13:21:33 crc kubenswrapper[4973]: I0320 13:21:33.702857 4973 trace.go:236] Trace[287112533]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 13:21:23.701) (total time: 10001ms):
Mar 20 13:21:33 crc kubenswrapper[4973]: Trace[287112533]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:21:33.702)
Mar 20 13:21:33 crc kubenswrapper[4973]: Trace[287112533]: [10.001473864s] [10.001473864s] END
Mar 20 13:21:33 crc kubenswrapper[4973]: E0320 13:21:33.702877 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 20 13:21:33 crc kubenswrapper[4973]: I0320 13:21:33.887176 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Mar 20 13:21:34 crc kubenswrapper[4973]: W0320 13:21:34.120649 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Mar 20 13:21:34 crc kubenswrapper[4973]: I0320 13:21:34.120750 4973 trace.go:236] Trace[1647331973]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 13:21:24.119) (total time: 10001ms):
Mar 20 13:21:34 crc kubenswrapper[4973]: Trace[1647331973]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (13:21:34.120)
Mar 20 13:21:34 crc kubenswrapper[4973]: Trace[1647331973]: [10.001230233s] [10.001230233s] END
Mar 20 13:21:34 crc kubenswrapper[4973]: E0320 13:21:34.120780 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Mar 20 13:21:34 crc kubenswrapper[4973]: E0320 13:21:34.148235 4973 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:34Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 13:21:34 crc kubenswrapper[4973]: W0320 13:21:34.149401 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:34Z is after 2026-02-23T05:33:13Z
Mar 20 13:21:34 crc kubenswrapper[4973]: E0320 13:21:34.149467 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:21:34 crc kubenswrapper[4973]: E0320 13:21:34.152365 4973 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:21:34 crc kubenswrapper[4973]: I0320 13:21:34.153394 4973 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 13:21:34 crc kubenswrapper[4973]: I0320 13:21:34.153460 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 20 13:21:34 crc kubenswrapper[4973]: W0320 13:21:34.154157 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:34Z is after 2026-02-23T05:33:13Z
Mar 20 13:21:34 crc kubenswrapper[4973]: E0320 13:21:34.154228 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:34Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:21:34 crc kubenswrapper[4973]: I0320 13:21:34.158048 4973 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 13:21:34 crc kubenswrapper[4973]: I0320 13:21:34.158104 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Mar 20 13:21:34 crc kubenswrapper[4973]: E0320 13:21:34.158838 4973 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:34Z is after 2026-02-23T05:33:13Z" interval="6.4s"
Mar 20 13:21:34 crc kubenswrapper[4973]: E0320 13:21:34.160645 4973 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:34Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8f50ac7e2e59 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.884660313 +0000 UTC m=+0.628330097,LastTimestamp:2026-03-20 13:21:19.884660313 +0000 UTC m=+0.628330097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:34 crc kubenswrapper[4973]: I0320 13:21:34.888139 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:34Z is after 2026-02-23T05:33:13Z
Mar 20 13:21:35 crc kubenswrapper[4973]: I0320 13:21:35.042545 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 13:21:35 crc kubenswrapper[4973]: I0320 13:21:35.044172 4973 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4857f3b032270079a55b8896c07f96b3a2381dc6d3af74c97bc29a68bc46c3e2" exitCode=255
Mar 20 13:21:35 crc kubenswrapper[4973]: I0320 13:21:35.044211 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4857f3b032270079a55b8896c07f96b3a2381dc6d3af74c97bc29a68bc46c3e2"}
Mar 20 13:21:35 crc kubenswrapper[4973]: I0320 13:21:35.044883 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:35 crc kubenswrapper[4973]: I0320 13:21:35.045865 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:35 crc kubenswrapper[4973]: I0320 13:21:35.045927 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:35 crc kubenswrapper[4973]: I0320 13:21:35.045949 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:35 crc kubenswrapper[4973]: I0320 13:21:35.046846 4973 scope.go:117] "RemoveContainer" containerID="4857f3b032270079a55b8896c07f96b3a2381dc6d3af74c97bc29a68bc46c3e2"
Mar 20 13:21:35 crc kubenswrapper[4973]: I0320 13:21:35.839543 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:21:35 crc kubenswrapper[4973]: I0320 13:21:35.888431 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:35Z is after 2026-02-23T05:33:13Z
Mar 20 13:21:36 crc kubenswrapper[4973]: I0320 13:21:36.051742 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 13:21:36 crc kubenswrapper[4973]: I0320 13:21:36.054277 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"94176631c69d8505fe02473ca8e06875ef2349c06a0bde5552c912558ade6cd0"}
Mar 20 13:21:36 crc kubenswrapper[4973]: I0320 13:21:36.054499 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:36 crc kubenswrapper[4973]: I0320 13:21:36.055750 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:36 crc kubenswrapper[4973]: I0320 13:21:36.055779 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:36 crc kubenswrapper[4973]: I0320 13:21:36.055787 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:36 crc kubenswrapper[4973]: I0320 13:21:36.058454 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:21:36 crc kubenswrapper[4973]: I0320 13:21:36.888086 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:36Z is after 2026-02-23T05:33:13Z
Mar 20 13:21:37 crc kubenswrapper[4973]: I0320 13:21:37.057638 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 13:21:37 crc kubenswrapper[4973]: I0320 13:21:37.058593 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log"
Mar 20 13:21:37 crc kubenswrapper[4973]: I0320 13:21:37.060188 4973 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="94176631c69d8505fe02473ca8e06875ef2349c06a0bde5552c912558ade6cd0" exitCode=255
Mar 20 13:21:37 crc kubenswrapper[4973]: I0320 13:21:37.060219 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"94176631c69d8505fe02473ca8e06875ef2349c06a0bde5552c912558ade6cd0"}
Mar 20 13:21:37 crc kubenswrapper[4973]: I0320 13:21:37.060250 4973 scope.go:117] "RemoveContainer" containerID="4857f3b032270079a55b8896c07f96b3a2381dc6d3af74c97bc29a68bc46c3e2"
Mar 20 13:21:37 crc kubenswrapper[4973]: I0320 13:21:37.060316 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:37 crc kubenswrapper[4973]: I0320 13:21:37.061045 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:37 crc kubenswrapper[4973]: I0320 13:21:37.061086 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:37 crc kubenswrapper[4973]: I0320 13:21:37.061098 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:37 crc kubenswrapper[4973]: I0320 13:21:37.062222 4973 scope.go:117] "RemoveContainer" containerID="94176631c69d8505fe02473ca8e06875ef2349c06a0bde5552c912558ade6cd0"
Mar 20 13:21:37 crc kubenswrapper[4973]: E0320 13:21:37.063076 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 13:21:37 crc kubenswrapper[4973]: W0320 13:21:37.404490 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:37Z is after 2026-02-23T05:33:13Z
Mar 20 13:21:37 crc kubenswrapper[4973]: E0320 13:21:37.404960 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:37Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 13:21:37 crc kubenswrapper[4973]: I0320 13:21:37.615925 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Mar 20 13:21:37 crc kubenswrapper[4973]: I0320 13:21:37.616106 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:37 crc kubenswrapper[4973]: I0320 13:21:37.617105 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:37 crc kubenswrapper[4973]: I0320 13:21:37.617141 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:37 crc kubenswrapper[4973]: I0320 13:21:37.617153 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:37 crc kubenswrapper[4973]: I0320 13:21:37.628155 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Mar 20 13:21:37 crc kubenswrapper[4973]: I0320 13:21:37.887681 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:37Z is after 2026-02-23T05:33:13Z
Mar 20 13:21:38 crc kubenswrapper[4973]: I0320 13:21:38.063439 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log"
Mar 20 13:21:38 crc kubenswrapper[4973]: I0320 13:21:38.065398 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:38 crc kubenswrapper[4973]: I0320 13:21:38.065518 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:38 crc kubenswrapper[4973]: I0320 13:21:38.066201 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:38 crc kubenswrapper[4973]: I0320 13:21:38.066229 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:38 crc kubenswrapper[4973]: I0320 13:21:38.066254 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:38 crc kubenswrapper[4973]: I0320 13:21:38.066265 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:38 crc kubenswrapper[4973]: I0320 13:21:38.066234 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:38 crc kubenswrapper[4973]: I0320 13:21:38.066348 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:38 crc kubenswrapper[4973]: I0320 13:21:38.066994 4973 scope.go:117] "RemoveContainer" containerID="94176631c69d8505fe02473ca8e06875ef2349c06a0bde5552c912558ade6cd0"
Mar 20 13:21:38 crc kubenswrapper[4973]: E0320 13:21:38.067163 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 13:21:38 crc kubenswrapper[4973]: I0320 13:21:38.887825 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:38Z is after 2026-02-23T05:33:13Z Mar 20 13:21:38 crc kubenswrapper[4973]: W0320 13:21:38.968368 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:38Z is after 2026-02-23T05:33:13Z Mar 20 13:21:38 crc kubenswrapper[4973]: E0320 13:21:38.968450 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:21:39 crc kubenswrapper[4973]: I0320 13:21:39.888368 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:39Z is after 2026-02-23T05:33:13Z Mar 20 13:21:40 crc kubenswrapper[4973]: E0320 13:21:40.025760 4973 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:21:40 crc kubenswrapper[4973]: I0320 13:21:40.038811 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:40 crc 
kubenswrapper[4973]: I0320 13:21:40.038956 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:40 crc kubenswrapper[4973]: I0320 13:21:40.039925 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:40 crc kubenswrapper[4973]: I0320 13:21:40.039954 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:40 crc kubenswrapper[4973]: I0320 13:21:40.039964 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:40 crc kubenswrapper[4973]: I0320 13:21:40.040503 4973 scope.go:117] "RemoveContainer" containerID="94176631c69d8505fe02473ca8e06875ef2349c06a0bde5552c912558ade6cd0" Mar 20 13:21:40 crc kubenswrapper[4973]: E0320 13:21:40.040695 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:21:40 crc kubenswrapper[4973]: I0320 13:21:40.548438 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:40 crc kubenswrapper[4973]: I0320 13:21:40.549756 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:40 crc kubenswrapper[4973]: I0320 13:21:40.549794 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:40 crc kubenswrapper[4973]: I0320 13:21:40.549804 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 
13:21:40 crc kubenswrapper[4973]: I0320 13:21:40.549831 4973 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:21:40 crc kubenswrapper[4973]: E0320 13:21:40.553628 4973 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:40Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 13:21:40 crc kubenswrapper[4973]: E0320 13:21:40.562892 4973 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:40Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 13:21:40 crc kubenswrapper[4973]: I0320 13:21:40.600479 4973 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:21:40 crc kubenswrapper[4973]: I0320 13:21:40.600596 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:21:40 crc kubenswrapper[4973]: I0320 13:21:40.887974 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:40Z is after 2026-02-23T05:33:13Z Mar 20 13:21:41 crc kubenswrapper[4973]: I0320 13:21:41.887882 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:41Z is after 2026-02-23T05:33:13Z Mar 20 13:21:42 crc kubenswrapper[4973]: W0320 13:21:42.441182 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:42Z is after 2026-02-23T05:33:13Z Mar 20 13:21:42 crc kubenswrapper[4973]: E0320 13:21:42.441261 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:42Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:21:42 crc kubenswrapper[4973]: I0320 13:21:42.810619 4973 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:21:42 crc kubenswrapper[4973]: E0320 13:21:42.813169 4973 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:42Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:21:42 crc kubenswrapper[4973]: I0320 13:21:42.887642 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:42Z is after 2026-02-23T05:33:13Z Mar 20 13:21:43 crc kubenswrapper[4973]: I0320 13:21:43.890271 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:43Z is after 2026-02-23T05:33:13Z Mar 20 13:21:44 crc kubenswrapper[4973]: E0320 13:21:44.163975 4973 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:44Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8f50ac7e2e59 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.884660313 +0000 UTC m=+0.628330097,LastTimestamp:2026-03-20 13:21:19.884660313 +0000 UTC m=+0.628330097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:44 crc kubenswrapper[4973]: I0320 13:21:44.889757 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:44Z is after 2026-02-23T05:33:13Z Mar 20 13:21:45 crc kubenswrapper[4973]: I0320 13:21:45.302608 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:21:45 crc kubenswrapper[4973]: I0320 13:21:45.302944 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:45 crc kubenswrapper[4973]: I0320 13:21:45.304310 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:45 crc kubenswrapper[4973]: I0320 13:21:45.304659 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:45 crc kubenswrapper[4973]: I0320 13:21:45.304692 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:45 crc kubenswrapper[4973]: I0320 13:21:45.305852 4973 scope.go:117] "RemoveContainer" containerID="94176631c69d8505fe02473ca8e06875ef2349c06a0bde5552c912558ade6cd0" Mar 20 13:21:45 crc kubenswrapper[4973]: E0320 13:21:45.306163 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 
13:21:45 crc kubenswrapper[4973]: W0320 13:21:45.785975 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:45Z is after 2026-02-23T05:33:13Z Mar 20 13:21:45 crc kubenswrapper[4973]: E0320 13:21:45.786050 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:45Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:21:45 crc kubenswrapper[4973]: I0320 13:21:45.887644 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:45Z is after 2026-02-23T05:33:13Z Mar 20 13:21:46 crc kubenswrapper[4973]: I0320 13:21:46.889299 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:46Z is after 2026-02-23T05:33:13Z Mar 20 13:21:47 crc kubenswrapper[4973]: I0320 13:21:47.554381 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:47 crc kubenswrapper[4973]: I0320 13:21:47.556385 4973 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:47 crc kubenswrapper[4973]: I0320 13:21:47.556446 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:47 crc kubenswrapper[4973]: I0320 13:21:47.556468 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:47 crc kubenswrapper[4973]: I0320 13:21:47.556514 4973 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:21:47 crc kubenswrapper[4973]: E0320 13:21:47.561401 4973 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:47Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 13:21:47 crc kubenswrapper[4973]: E0320 13:21:47.568573 4973 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:47Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 13:21:47 crc kubenswrapper[4973]: W0320 13:21:47.682543 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:47Z is after 2026-02-23T05:33:13Z Mar 20 13:21:47 crc kubenswrapper[4973]: E0320 13:21:47.682692 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:47Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 13:21:47 crc kubenswrapper[4973]: I0320 13:21:47.892486 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:47Z is after 2026-02-23T05:33:13Z Mar 20 13:21:48 crc kubenswrapper[4973]: I0320 13:21:48.888130 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:48Z is after 2026-02-23T05:33:13Z Mar 20 13:21:49 crc kubenswrapper[4973]: W0320 13:21:49.078793 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:49Z is after 2026-02-23T05:33:13Z Mar 20 13:21:49 crc kubenswrapper[4973]: E0320 13:21:49.078894 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:49Z is after 2026-02-23T05:33:13Z" 
logger="UnhandledError" Mar 20 13:21:49 crc kubenswrapper[4973]: I0320 13:21:49.890189 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:49Z is after 2026-02-23T05:33:13Z Mar 20 13:21:50 crc kubenswrapper[4973]: E0320 13:21:50.025958 4973 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:21:50 crc kubenswrapper[4973]: I0320 13:21:50.600860 4973 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:21:50 crc kubenswrapper[4973]: I0320 13:21:50.600975 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:21:50 crc kubenswrapper[4973]: I0320 13:21:50.601067 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:50 crc kubenswrapper[4973]: I0320 13:21:50.601333 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:50 crc kubenswrapper[4973]: I0320 13:21:50.603093 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 13:21:50 crc kubenswrapper[4973]: I0320 13:21:50.603145 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:50 crc kubenswrapper[4973]: I0320 13:21:50.603159 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:50 crc kubenswrapper[4973]: I0320 13:21:50.603789 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"0da83b4a45a3b65af0d17acc529fb5991668c1603f5f6b88b1476f579ad029e0"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 13:21:50 crc kubenswrapper[4973]: I0320 13:21:50.603979 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://0da83b4a45a3b65af0d17acc529fb5991668c1603f5f6b88b1476f579ad029e0" gracePeriod=30 Mar 20 13:21:50 crc kubenswrapper[4973]: I0320 13:21:50.887863 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:50Z is after 2026-02-23T05:33:13Z Mar 20 13:21:51 crc kubenswrapper[4973]: I0320 13:21:51.104920 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 13:21:51 crc kubenswrapper[4973]: I0320 13:21:51.105523 4973 generic.go:334] "Generic (PLEG): container finished" 
podID="f614b9022728cf315e60c057852e563e" containerID="0da83b4a45a3b65af0d17acc529fb5991668c1603f5f6b88b1476f579ad029e0" exitCode=255 Mar 20 13:21:51 crc kubenswrapper[4973]: I0320 13:21:51.105583 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"0da83b4a45a3b65af0d17acc529fb5991668c1603f5f6b88b1476f579ad029e0"} Mar 20 13:21:51 crc kubenswrapper[4973]: I0320 13:21:51.105623 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c0e67ab19704e074f39eba08b8d61e69ddca313f486f07b8f8aa307aa35f3931"} Mar 20 13:21:51 crc kubenswrapper[4973]: I0320 13:21:51.105739 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:51 crc kubenswrapper[4973]: I0320 13:21:51.106664 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:51 crc kubenswrapper[4973]: I0320 13:21:51.106702 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:51 crc kubenswrapper[4973]: I0320 13:21:51.106713 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:51 crc kubenswrapper[4973]: I0320 13:21:51.891218 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:51Z is after 2026-02-23T05:33:13Z Mar 20 13:21:52 crc kubenswrapper[4973]: I0320 13:21:52.107982 4973 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 20 13:21:52 crc kubenswrapper[4973]: I0320 13:21:52.108853 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:52 crc kubenswrapper[4973]: I0320 13:21:52.109005 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:52 crc kubenswrapper[4973]: I0320 13:21:52.109037 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:52 crc kubenswrapper[4973]: I0320 13:21:52.889231 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:21:53 crc kubenswrapper[4973]: I0320 13:21:53.888924 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.171538 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50ac7e2e59 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.884660313 +0000 UTC m=+0.628330097,LastTimestamp:2026-03-20 13:21:19.884660313 +0000 UTC m=+0.628330097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.175502 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd2d9c4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940540868 +0000 UTC m=+0.684210622,LastTimestamp:2026-03-20 13:21:19.940540868 +0000 UTC m=+0.684210622,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.181602 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd34986 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940569478 +0000 UTC m=+0.684239242,LastTimestamp:2026-03-20 13:21:19.940569478 +0000 UTC m=+0.684239242,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.188513 4973 event.go:359] "Server rejected event (will not retry!)" 
err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd382cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940584139 +0000 UTC m=+0.684253903,LastTimestamp:2026-03-20 13:21:19.940584139 +0000 UTC m=+0.684253903,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.194802 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50b4847ccf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:20.019291343 +0000 UTC m=+0.762961097,LastTimestamp:2026-03-20 13:21:20.019291343 +0000 UTC m=+0.762961097,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.202421 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd2d9c4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd2d9c4 default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940540868 +0000 UTC m=+0.684210622,LastTimestamp:2026-03-20 13:21:20.052053318 +0000 UTC m=+0.795723052,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.208580 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd34986\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd34986 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940569478 +0000 UTC m=+0.684239242,LastTimestamp:2026-03-20 13:21:20.052088709 +0000 UTC m=+0.795758453,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.214985 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd382cb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd382cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940584139 +0000 UTC m=+0.684253903,LastTimestamp:2026-03-20 13:21:20.052096839 +0000 UTC m=+0.795766583,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.221671 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd2d9c4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd2d9c4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940540868 +0000 UTC m=+0.684210622,LastTimestamp:2026-03-20 13:21:20.053174928 +0000 UTC m=+0.796844682,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.225611 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd34986\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd34986 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc 
status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940569478 +0000 UTC m=+0.684239242,LastTimestamp:2026-03-20 13:21:20.053205619 +0000 UTC m=+0.796875373,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.231127 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd382cb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd382cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940584139 +0000 UTC m=+0.684253903,LastTimestamp:2026-03-20 13:21:20.05322015 +0000 UTC m=+0.796889904,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.236136 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd2d9c4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd2d9c4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940540868 +0000 UTC 
m=+0.684210622,LastTimestamp:2026-03-20 13:21:20.053792766 +0000 UTC m=+0.797462540,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.242987 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd34986\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd34986 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940569478 +0000 UTC m=+0.684239242,LastTimestamp:2026-03-20 13:21:20.053887429 +0000 UTC m=+0.797557193,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.249761 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd382cb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd382cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940584139 +0000 UTC m=+0.684253903,LastTimestamp:2026-03-20 13:21:20.053977751 +0000 UTC m=+0.797647505,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.256724 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd2d9c4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd2d9c4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940540868 +0000 UTC m=+0.684210622,LastTimestamp:2026-03-20 13:21:20.054581607 +0000 UTC m=+0.798251351,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.263408 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd34986\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd34986 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940569478 +0000 UTC m=+0.684239242,LastTimestamp:2026-03-20 13:21:20.054597398 +0000 UTC m=+0.798267142,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.269886 4973 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd382cb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd382cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940584139 +0000 UTC m=+0.684253903,LastTimestamp:2026-03-20 13:21:20.054608198 +0000 UTC m=+0.798277942,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.276242 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd2d9c4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd2d9c4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940540868 +0000 UTC m=+0.684210622,LastTimestamp:2026-03-20 13:21:20.0546818 +0000 UTC m=+0.798351544,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.282935 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd34986\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" 
in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd34986 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940569478 +0000 UTC m=+0.684239242,LastTimestamp:2026-03-20 13:21:20.05469467 +0000 UTC m=+0.798364414,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.292631 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd382cb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd382cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940584139 +0000 UTC m=+0.684253903,LastTimestamp:2026-03-20 13:21:20.054705451 +0000 UTC m=+0.798375195,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.300188 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd2d9c4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd2d9c4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940540868 +0000 UTC m=+0.684210622,LastTimestamp:2026-03-20 13:21:20.055493593 +0000 UTC m=+0.799163337,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.305919 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd34986\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd34986 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940569478 +0000 UTC m=+0.684239242,LastTimestamp:2026-03-20 13:21:20.055540214 +0000 UTC m=+0.799209958,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.311926 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd382cb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd382cb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc 
status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940584139 +0000 UTC m=+0.684253903,LastTimestamp:2026-03-20 13:21:20.055551624 +0000 UTC m=+0.799221368,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.315684 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd2d9c4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd2d9c4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940540868 +0000 UTC m=+0.684210622,LastTimestamp:2026-03-20 13:21:20.056538752 +0000 UTC m=+0.800208486,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.322303 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8f50afd34986\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8f50afd34986 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:19.940569478 +0000 UTC 
m=+0.684239242,LastTimestamp:2026-03-20 13:21:20.056558022 +0000 UTC m=+0.800227766,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.326283 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f50cd2a0ef9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:20.432795385 +0000 UTC m=+1.176465129,LastTimestamp:2026-03-20 13:21:20.432795385 +0000 UTC m=+1.176465129,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.328777 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f50cd338f44 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:20.433418052 +0000 UTC m=+1.177087796,LastTimestamp:2026-03-20 13:21:20.433418052 +0000 UTC m=+1.177087796,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.330963 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f50cd374400 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:20.433660928 +0000 UTC m=+1.177330672,LastTimestamp:2026-03-20 13:21:20.433660928 +0000 UTC m=+1.177330672,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.334698 4973 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f50cda13452 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:20.44060373 +0000 UTC m=+1.184273474,LastTimestamp:2026-03-20 13:21:20.44060373 +0000 UTC m=+1.184273474,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.341090 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f50ce86a7f4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:20.455641076 +0000 UTC m=+1.199310820,LastTimestamp:2026-03-20 13:21:20.455641076 +0000 UTC m=+1.199310820,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.347873 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f50ef3c44dd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.004414173 +0000 UTC m=+1.748083917,LastTimestamp:2026-03-20 13:21:21.004414173 +0000 UTC m=+1.748083917,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.353141 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f50ef6ac3ed openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.007461357 +0000 UTC m=+1.751131101,LastTimestamp:2026-03-20 13:21:21.007461357 +0000 UTC m=+1.751131101,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.357438 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f50ef6be801 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.007536129 +0000 UTC m=+1.751205873,LastTimestamp:2026-03-20 13:21:21.007536129 +0000 UTC m=+1.751205873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.362750 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f50ef748593 openshift-kube-scheduler 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.008100755 +0000 UTC m=+1.751770499,LastTimestamp:2026-03-20 13:21:21.008100755 +0000 UTC m=+1.751770499,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.367954 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f50efb3415a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.012212058 +0000 UTC m=+1.755881802,LastTimestamp:2026-03-20 13:21:21.012212058 +0000 UTC m=+1.755881802,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.372387 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f50f0147265 openshift-kube-controller-manager 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.018581605 +0000 UTC m=+1.762251349,LastTimestamp:2026-03-20 13:21:21.018581605 +0000 UTC m=+1.762251349,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.376735 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f50f026a15e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.019773278 +0000 UTC m=+1.763443022,LastTimestamp:2026-03-20 13:21:21.019773278 +0000 UTC m=+1.763443022,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.383992 4973 event.go:359] "Server rejected event (will 
not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f50f050247f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.022493823 +0000 UTC m=+1.766163567,LastTimestamp:2026-03-20 13:21:21.022493823 +0000 UTC m=+1.766163567,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.389461 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f50f056d8f3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.022933235 +0000 UTC m=+1.766602979,LastTimestamp:2026-03-20 13:21:21.022933235 +0000 UTC m=+1.766602979,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.395102 
4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f50f08ee939 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.026607417 +0000 UTC m=+1.770277161,LastTimestamp:2026-03-20 13:21:21.026607417 +0000 UTC m=+1.770277161,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.399881 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f50f128f99b openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.036704155 +0000 UTC m=+1.780373899,LastTimestamp:2026-03-20 13:21:21.036704155 +0000 UTC m=+1.780373899,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.404449 4973 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f5101bdd9bf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.314896319 +0000 UTC m=+2.058566083,LastTimestamp:2026-03-20 13:21:21.314896319 +0000 UTC m=+2.058566083,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.412518 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f5102a144b2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.32980037 +0000 UTC m=+2.073470114,LastTimestamp:2026-03-20 13:21:21.32980037 +0000 UTC m=+2.073470114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.420441 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f5102b47a5f openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.331059295 +0000 UTC m=+2.074729079,LastTimestamp:2026-03-20 13:21:21.331059295 +0000 UTC m=+2.074729079,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.427392 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f510f6e87d9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.544579033 +0000 UTC m=+2.288248777,LastTimestamp:2026-03-20 13:21:21.544579033 +0000 UTC m=+2.288248777,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.431725 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f51104f0251 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.559290449 +0000 UTC m=+2.302960193,LastTimestamp:2026-03-20 13:21:21.559290449 +0000 UTC m=+2.302960193,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.438073 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f511070d2bb openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.561506491 +0000 UTC m=+2.305176245,LastTimestamp:2026-03-20 13:21:21.561506491 +0000 UTC m=+2.305176245,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.443190 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f5119edfc6b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.720704107 +0000 UTC m=+2.464373851,LastTimestamp:2026-03-20 13:21:21.720704107 +0000 UTC 
m=+2.464373851,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.447842 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f511bb982fc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.75081958 +0000 UTC m=+2.494489324,LastTimestamp:2026-03-20 13:21:21.75081958 +0000 UTC m=+2.494489324,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.453454 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f51286d87ba openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.963943866 +0000 UTC m=+2.707613610,LastTimestamp:2026-03-20 13:21:21.963943866 +0000 UTC m=+2.707613610,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.460546 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f512881fb75 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.965284213 +0000 UTC m=+2.708953957,LastTimestamp:2026-03-20 13:21:21.965284213 +0000 UTC m=+2.708953957,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.465885 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f5128931e57 openshift-machine-config-operator 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.966407255 +0000 UTC m=+2.710076999,LastTimestamp:2026-03-20 13:21:21.966407255 +0000 UTC m=+2.710076999,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.473068 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f5128a9c720 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.967892256 +0000 UTC m=+2.711562000,LastTimestamp:2026-03-20 13:21:21.967892256 +0000 UTC m=+2.711562000,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 
13:21:54.478599 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f5136b4f641 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.203506241 +0000 UTC m=+2.947175985,LastTimestamp:2026-03-20 13:21:22.203506241 +0000 UTC m=+2.947175985,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.484074 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f5136bfa4f8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.204206328 +0000 UTC m=+2.947876072,LastTimestamp:2026-03-20 13:21:22.204206328 +0000 UTC m=+2.947876072,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.489016 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f5136c57b48 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.204588872 +0000 UTC m=+2.948258616,LastTimestamp:2026-03-20 13:21:22.204588872 +0000 UTC m=+2.948258616,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.494196 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f5136c64a70 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.204641904 +0000 UTC m=+2.948311648,LastTimestamp:2026-03-20 13:21:22.204641904 +0000 UTC m=+2.948311648,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.500654 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f5137d6cd54 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.222501204 +0000 UTC m=+2.966170948,LastTimestamp:2026-03-20 13:21:22.222501204 +0000 UTC m=+2.966170948,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.507371 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f5137e603ce openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.22349819 +0000 UTC m=+2.967167934,LastTimestamp:2026-03-20 13:21:22.22349819 +0000 UTC 
m=+2.967167934,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.513113 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f5137eae1fd openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.223817213 +0000 UTC m=+2.967486957,LastTimestamp:2026-03-20 13:21:22.223817213 +0000 UTC m=+2.967486957,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.518318 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f5137f79d2c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.224651564 +0000 UTC m=+2.968321308,LastTimestamp:2026-03-20 13:21:22.224651564 +0000 UTC m=+2.968321308,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.523313 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8f5138026e4d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.225360461 +0000 UTC m=+2.969030225,LastTimestamp:2026-03-20 13:21:22.225360461 +0000 UTC m=+2.969030225,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.528225 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f5142b75c7c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.404990076 +0000 UTC m=+3.148659820,LastTimestamp:2026-03-20 13:21:22.404990076 +0000 UTC m=+3.148659820,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.534815 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f5142bf5115 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.405511445 +0000 UTC m=+3.149181189,LastTimestamp:2026-03-20 13:21:22.405511445 +0000 UTC m=+3.149181189,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 
13:21:54.541675 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f5143c190ab openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.422436011 +0000 UTC m=+3.166105755,LastTimestamp:2026-03-20 13:21:22.422436011 +0000 UTC m=+3.166105755,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.546722 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f5143c25b0e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.422487822 +0000 UTC m=+3.166157586,LastTimestamp:2026-03-20 13:21:22.422487822 +0000 UTC m=+3.166157586,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.551773 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f5143ce57a5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.423273381 +0000 UTC m=+3.166943125,LastTimestamp:2026-03-20 13:21:22.423273381 +0000 UTC m=+3.166943125,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.556084 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f5143dbf107 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.424164615 +0000 UTC m=+3.167834369,LastTimestamp:2026-03-20 13:21:22.424164615 +0000 UTC m=+3.167834369,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.560120 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f514f339e2e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.61445995 +0000 UTC m=+3.358129694,LastTimestamp:2026-03-20 13:21:22.61445995 +0000 UTC m=+3.358129694,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: I0320 13:21:54.561514 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 13:21:54 crc kubenswrapper[4973]: I0320 13:21:54.562834 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:21:54 crc kubenswrapper[4973]: I0320 13:21:54.562863 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:21:54 crc kubenswrapper[4973]: I0320 13:21:54.562873 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:21:54 crc kubenswrapper[4973]: I0320 13:21:54.562897 4973 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.565002 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f514f431906 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.615474438 +0000 UTC m=+3.359144192,LastTimestamp:2026-03-20 13:21:22.615474438 +0000 UTC m=+3.359144192,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.565270 4973 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.568152 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f51501abbcc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.629606348 +0000 UTC m=+3.373276092,LastTimestamp:2026-03-20 13:21:22.629606348 +0000 UTC m=+3.373276092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.572664 4973 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.572745 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8f51502003e1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.629952481 +0000 UTC m=+3.373622225,LastTimestamp:2026-03-20 13:21:22.629952481 +0000 UTC m=+3.373622225,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.576871 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f51502fe986 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.63099431 +0000 UTC m=+3.374664054,LastTimestamp:2026-03-20 13:21:22.63099431 +0000 UTC m=+3.374664054,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.580536 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f5155fb3677 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.728203895 +0000 UTC m=+3.471873639,LastTimestamp:2026-03-20 13:21:22.728203895 +0000 UTC m=+3.471873639,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.585307 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f515b96cb84 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.822286212 +0000 UTC m=+3.565955956,LastTimestamp:2026-03-20 13:21:22.822286212 +0000 UTC m=+3.565955956,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.590362 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f515c774e23 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.836999715 +0000 UTC m=+3.580669459,LastTimestamp:2026-03-20 13:21:22.836999715 +0000 UTC m=+3.580669459,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.594865 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f515c919c4b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.838723659 +0000 UTC m=+3.582393403,LastTimestamp:2026-03-20 13:21:22.838723659 +0000 UTC m=+3.582393403,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.599927 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f5165680edb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.986995419 +0000 UTC m=+3.730665163,LastTimestamp:2026-03-20 13:21:22.986995419 +0000 UTC m=+3.730665163,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.605429 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f51667c1103 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:23.005083907 +0000 UTC m=+3.748753651,LastTimestamp:2026-03-20 13:21:23.005083907 +0000 UTC m=+3.748753651,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.609754 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f5167180ebc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:23.01530694 +0000 UTC m=+3.758976684,LastTimestamp:2026-03-20 13:21:23.01530694 +0000 UTC m=+3.758976684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.613347 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f516e82e2cc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:23.139748556 +0000 UTC m=+3.883418310,LastTimestamp:2026-03-20 13:21:23.139748556 +0000 UTC m=+3.883418310,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.618082 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f516f66908a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:23.154669706 +0000 UTC m=+3.898339450,LastTimestamp:2026-03-20 13:21:23.154669706 +0000 UTC m=+3.898339450,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.623249 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f51a1c64b68 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:23.999804264 +0000 UTC m=+4.743474008,LastTimestamp:2026-03-20 13:21:23.999804264 +0000 UTC m=+4.743474008,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.627475 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f51ac3f0e48 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:24.175490632 +0000 UTC m=+4.919160376,LastTimestamp:2026-03-20 13:21:24.175490632 +0000 UTC m=+4.919160376,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.630699 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f51acf15ce0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:24.18717616 +0000 UTC m=+4.930845904,LastTimestamp:2026-03-20 13:21:24.18717616 +0000 UTC m=+4.930845904,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.634499 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f51ad00717d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:24.188164477 +0000 UTC m=+4.931834221,LastTimestamp:2026-03-20 13:21:24.188164477 +0000 UTC m=+4.931834221,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.637918 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f51ba0ecb6a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:24.40720881 +0000 UTC m=+5.150878574,LastTimestamp:2026-03-20 13:21:24.40720881 +0000 UTC m=+5.150878574,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.641211 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f51babb3be1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:24.418509793 +0000 UTC m=+5.162179537,LastTimestamp:2026-03-20 13:21:24.418509793 +0000 UTC m=+5.162179537,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.645112 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f51bacb834e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:24.419576654 +0000 UTC m=+5.163246418,LastTimestamp:2026-03-20 13:21:24.419576654 +0000 UTC m=+5.163246418,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.649295 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f51c76dcb1d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:24.631538461 +0000 UTC m=+5.375208205,LastTimestamp:2026-03-20 13:21:24.631538461 +0000 UTC m=+5.375208205,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.653723 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f51ca2669e3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:24.677192163 +0000 UTC m=+5.420861897,LastTimestamp:2026-03-20 13:21:24.677192163 +0000 UTC m=+5.420861897,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.657745 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f51ca3c439d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:24.678624157 +0000 UTC m=+5.422293901,LastTimestamp:2026-03-20 13:21:24.678624157 +0000 UTC m=+5.422293901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.662205 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f51d74d342e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:24.897838126 +0000 UTC m=+5.641507870,LastTimestamp:2026-03-20 13:21:24.897838126 +0000 UTC m=+5.641507870,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.666765 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f51d7f0c7ee openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:24.908558318 +0000 UTC m=+5.652228092,LastTimestamp:2026-03-20 13:21:24.908558318 +0000 UTC m=+5.652228092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.670772 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f51d8060ffc openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:24.90995302 +0000 UTC m=+5.653622764,LastTimestamp:2026-03-20 13:21:24.90995302 +0000 UTC m=+5.653622764,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.674942 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f51e250aca0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:25.082614944 +0000 UTC m=+5.826284688,LastTimestamp:2026-03-20 13:21:25.082614944 +0000 UTC m=+5.826284688,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.680348 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8f51e32155c9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:25.096289737 +0000 UTC m=+5.839959481,LastTimestamp:2026-03-20 13:21:25.096289737 +0000 UTC m=+5.839959481,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.686491 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=<
Mar 20 13:21:54 crc kubenswrapper[4973]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8f532b37d6ad openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers)
Mar 20 13:21:54 crc kubenswrapper[4973]: body:
Mar 20 13:21:54 crc kubenswrapper[4973]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:30.600691373 +0000 UTC m=+11.344361157,LastTimestamp:2026-03-20 13:21:30.600691373 +0000 UTC m=+11.344361157,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 20 13:21:54 crc kubenswrapper[4973]: >
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.690557 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f532b397fba openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:30.600800186 +0000 UTC m=+11.344469970,LastTimestamp:2026-03-20 13:21:30.600800186 +0000 UTC m=+11.344469970,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.695168 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 20 13:21:54 crc kubenswrapper[4973]: &Event{ObjectMeta:{kube-apiserver-crc.189e8f53fefa8287 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Mar 20 13:21:54 crc kubenswrapper[4973]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 13:21:54 crc kubenswrapper[4973]:
Mar 20 13:21:54 crc kubenswrapper[4973]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:34.153441927 +0000 UTC m=+14.897111671,LastTimestamp:2026-03-20 13:21:34.153441927 +0000 UTC m=+14.897111671,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 20 13:21:54 crc kubenswrapper[4973]: >
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.699190 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f53fefb22f6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:34.153482998 +0000 UTC m=+14.897152742,LastTimestamp:2026-03-20 13:21:34.153482998 +0000 UTC m=+14.897152742,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.703520 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8f53fefa8287\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=<
Mar 20 13:21:54 crc kubenswrapper[4973]: &Event{ObjectMeta:{kube-apiserver-crc.189e8f53fefa8287 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403
Mar 20 13:21:54 crc kubenswrapper[4973]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Mar 20 13:21:54 crc kubenswrapper[4973]:
Mar 20 13:21:54 crc kubenswrapper[4973]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:34.153441927 +0000 UTC m=+14.897111671,LastTimestamp:2026-03-20 13:21:34.158088555 +0000 UTC m=+14.901758299,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}
Mar 20 13:21:54 crc kubenswrapper[4973]: >
Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.707673 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8f53fefb22f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f53fefb22f6 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:34.153482998 +0000 UTC m=+14.897152742,LastTimestamp:2026-03-20 13:21:34.158127846 +0000 UTC m=+14.901797590,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.711870 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8f515c919c4b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f515c919c4b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:22.838723659 +0000 UTC m=+3.582393403,LastTimestamp:2026-03-20 13:21:35.04797598 +0000 UTC m=+15.791645734,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.715437 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8f51667c1103\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f51667c1103 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:23.005083907 +0000 UTC m=+3.748753651,LastTimestamp:2026-03-20 13:21:35.223575293 +0000 UTC m=+15.967245037,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.718944 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e8f5167180ebc\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8f5167180ebc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:23.01530694 +0000 UTC m=+3.758976684,LastTimestamp:2026-03-20 13:21:35.231737718 +0000 UTC m=+15.975407462,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.723192 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Mar 20 13:21:54 crc kubenswrapper[4973]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8f557f41c41a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:21:54 crc kubenswrapper[4973]: body: Mar 20 13:21:54 crc kubenswrapper[4973]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:40.600562714 +0000 UTC m=+21.344232498,LastTimestamp:2026-03-20 13:21:40.600562714 +0000 UTC m=+21.344232498,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:21:54 crc kubenswrapper[4973]: > Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.726606 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f557f42f44a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded 
while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:40.600640586 +0000 UTC m=+21.344310350,LastTimestamp:2026-03-20 13:21:40.600640586 +0000 UTC m=+21.344310350,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.731792 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f557f41c41a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:21:54 crc kubenswrapper[4973]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8f557f41c41a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:21:54 crc kubenswrapper[4973]: body: Mar 20 13:21:54 crc kubenswrapper[4973]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:40.600562714 +0000 UTC m=+21.344232498,LastTimestamp:2026-03-20 13:21:50.600947557 +0000 UTC m=+31.344617341,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:21:54 crc kubenswrapper[4973]: > Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.736384 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f557f42f44a\" is forbidden: User 
\"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f557f42f44a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:40.600640586 +0000 UTC m=+21.344310350,LastTimestamp:2026-03-20 13:21:50.601025139 +0000 UTC m=+31.344694923,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.742115 4973 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f57d381669c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:50.60395382 +0000 UTC m=+31.347623574,LastTimestamp:2026-03-20 
13:21:50.60395382 +0000 UTC m=+31.347623574,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.745852 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f50f026a15e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f50f026a15e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.019773278 +0000 UTC m=+1.763443022,LastTimestamp:2026-03-20 13:21:50.7264545 +0000 UTC m=+31.470124254,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.749649 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f5101bdd9bf\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f5101bdd9bf openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.314896319 +0000 UTC m=+2.058566083,LastTimestamp:2026-03-20 13:21:50.94041685 +0000 UTC m=+31.684086604,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: E0320 13:21:54.753684 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f5102a144b2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f5102a144b2 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:21.32980037 +0000 UTC m=+2.073470114,LastTimestamp:2026-03-20 13:21:50.949470021 +0000 UTC m=+31.693139765,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:21:54 crc kubenswrapper[4973]: I0320 13:21:54.889821 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot 
get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:21:55 crc kubenswrapper[4973]: I0320 13:21:55.890067 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:21:56 crc kubenswrapper[4973]: I0320 13:21:56.892914 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:21:57 crc kubenswrapper[4973]: I0320 13:21:57.600263 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:21:57 crc kubenswrapper[4973]: I0320 13:21:57.600436 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:57 crc kubenswrapper[4973]: I0320 13:21:57.601671 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:57 crc kubenswrapper[4973]: I0320 13:21:57.601753 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:57 crc kubenswrapper[4973]: I0320 13:21:57.601774 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:57 crc kubenswrapper[4973]: I0320 13:21:57.891829 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:21:57 crc kubenswrapper[4973]: I0320 13:21:57.949731 4973 kubelet_node_status.go:401] "Setting node annotation to 
enable volume controller attach/detach" Mar 20 13:21:57 crc kubenswrapper[4973]: I0320 13:21:57.951157 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:57 crc kubenswrapper[4973]: I0320 13:21:57.951205 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:57 crc kubenswrapper[4973]: I0320 13:21:57.951223 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:57 crc kubenswrapper[4973]: I0320 13:21:57.951939 4973 scope.go:117] "RemoveContainer" containerID="94176631c69d8505fe02473ca8e06875ef2349c06a0bde5552c912558ade6cd0" Mar 20 13:21:58 crc kubenswrapper[4973]: I0320 13:21:58.890065 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:21:59 crc kubenswrapper[4973]: I0320 13:21:59.124846 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 13:21:59 crc kubenswrapper[4973]: I0320 13:21:59.125302 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 13:21:59 crc kubenswrapper[4973]: I0320 13:21:59.126795 4973 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a2f552b111c3da99bfadfc49209ff0066f1d72de80fc8203eb1b9cb96434caa7" exitCode=255 Mar 20 13:21:59 crc kubenswrapper[4973]: I0320 13:21:59.126836 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a2f552b111c3da99bfadfc49209ff0066f1d72de80fc8203eb1b9cb96434caa7"} Mar 20 13:21:59 crc kubenswrapper[4973]: I0320 13:21:59.126874 4973 scope.go:117] "RemoveContainer" containerID="94176631c69d8505fe02473ca8e06875ef2349c06a0bde5552c912558ade6cd0" Mar 20 13:21:59 crc kubenswrapper[4973]: I0320 13:21:59.127254 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:21:59 crc kubenswrapper[4973]: I0320 13:21:59.131371 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:21:59 crc kubenswrapper[4973]: I0320 13:21:59.131415 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:21:59 crc kubenswrapper[4973]: I0320 13:21:59.131430 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:21:59 crc kubenswrapper[4973]: I0320 13:21:59.132091 4973 scope.go:117] "RemoveContainer" containerID="a2f552b111c3da99bfadfc49209ff0066f1d72de80fc8203eb1b9cb96434caa7" Mar 20 13:21:59 crc kubenswrapper[4973]: E0320 13:21:59.132311 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:21:59 crc kubenswrapper[4973]: I0320 13:21:59.835713 4973 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 13:21:59 crc kubenswrapper[4973]: I0320 13:21:59.849429 4973 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from 
k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 13:21:59 crc kubenswrapper[4973]: I0320 13:21:59.889540 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:00 crc kubenswrapper[4973]: I0320 13:22:00.018283 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:22:00 crc kubenswrapper[4973]: I0320 13:22:00.018642 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:00 crc kubenswrapper[4973]: I0320 13:22:00.020730 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:00 crc kubenswrapper[4973]: I0320 13:22:00.020770 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:00 crc kubenswrapper[4973]: I0320 13:22:00.020781 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:00 crc kubenswrapper[4973]: E0320 13:22:00.026348 4973 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:22:00 crc kubenswrapper[4973]: I0320 13:22:00.039183 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:22:00 crc kubenswrapper[4973]: I0320 13:22:00.131119 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 13:22:00 crc kubenswrapper[4973]: I0320 13:22:00.132921 4973 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 20 13:22:00 crc kubenswrapper[4973]: I0320 13:22:00.133769 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:00 crc kubenswrapper[4973]: I0320 13:22:00.133804 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:00 crc kubenswrapper[4973]: I0320 13:22:00.133816 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:00 crc kubenswrapper[4973]: I0320 13:22:00.134433 4973 scope.go:117] "RemoveContainer" containerID="a2f552b111c3da99bfadfc49209ff0066f1d72de80fc8203eb1b9cb96434caa7" Mar 20 13:22:00 crc kubenswrapper[4973]: E0320 13:22:00.134616 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:22:00 crc kubenswrapper[4973]: I0320 13:22:00.600909 4973 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:22:00 crc kubenswrapper[4973]: I0320 13:22:00.600992 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting 
for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:22:00 crc kubenswrapper[4973]: E0320 13:22:00.605926 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f557f41c41a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 13:22:00 crc kubenswrapper[4973]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8f557f41c41a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 13:22:00 crc kubenswrapper[4973]: body: Mar 20 13:22:00 crc kubenswrapper[4973]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:40.600562714 +0000 UTC m=+21.344232498,LastTimestamp:2026-03-20 13:22:00.600967772 +0000 UTC m=+41.344637556,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 13:22:00 crc kubenswrapper[4973]: > Mar 20 13:22:00 crc kubenswrapper[4973]: E0320 13:22:00.609948 4973 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8f557f42f44a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8f557f42f44a openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:21:40.600640586 +0000 UTC m=+21.344310350,LastTimestamp:2026-03-20 13:22:00.601028433 +0000 UTC m=+41.344698217,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:22:00 crc kubenswrapper[4973]: I0320 13:22:00.889049 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:01 crc kubenswrapper[4973]: I0320 13:22:01.565890 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:01 crc kubenswrapper[4973]: I0320 13:22:01.567397 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:01 crc kubenswrapper[4973]: I0320 13:22:01.567455 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:01 crc kubenswrapper[4973]: I0320 13:22:01.567472 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:01 crc kubenswrapper[4973]: I0320 13:22:01.567504 4973 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:22:01 crc kubenswrapper[4973]: E0320 13:22:01.572922 4973 kubelet_node_status.go:99] "Unable 
to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:22:01 crc kubenswrapper[4973]: E0320 13:22:01.578774 4973 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 13:22:01 crc kubenswrapper[4973]: I0320 13:22:01.889792 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:02 crc kubenswrapper[4973]: W0320 13:22:02.410223 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:02 crc kubenswrapper[4973]: E0320 13:22:02.410290 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 13:22:02 crc kubenswrapper[4973]: I0320 13:22:02.891848 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:03 crc kubenswrapper[4973]: W0320 13:22:03.720459 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User 
"system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 13:22:03 crc kubenswrapper[4973]: E0320 13:22:03.720522 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 13:22:03 crc kubenswrapper[4973]: W0320 13:22:03.816767 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 13:22:03 crc kubenswrapper[4973]: E0320 13:22:03.816875 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 13:22:03 crc kubenswrapper[4973]: I0320 13:22:03.892594 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:04 crc kubenswrapper[4973]: I0320 13:22:04.889978 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:04 crc kubenswrapper[4973]: W0320 13:22:04.911381 4973 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API 
group "" at the cluster scope Mar 20 13:22:04 crc kubenswrapper[4973]: E0320 13:22:04.911450 4973 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 13:22:05 crc kubenswrapper[4973]: I0320 13:22:05.302691 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:22:05 crc kubenswrapper[4973]: I0320 13:22:05.302845 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:05 crc kubenswrapper[4973]: I0320 13:22:05.303845 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:05 crc kubenswrapper[4973]: I0320 13:22:05.303900 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:05 crc kubenswrapper[4973]: I0320 13:22:05.303913 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:05 crc kubenswrapper[4973]: I0320 13:22:05.304453 4973 scope.go:117] "RemoveContainer" containerID="a2f552b111c3da99bfadfc49209ff0066f1d72de80fc8203eb1b9cb96434caa7" Mar 20 13:22:05 crc kubenswrapper[4973]: E0320 13:22:05.304614 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:22:05 crc kubenswrapper[4973]: I0320 13:22:05.895569 4973 csi_plugin.go:884] Failed to contact 
API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:06 crc kubenswrapper[4973]: I0320 13:22:06.890960 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:07 crc kubenswrapper[4973]: I0320 13:22:07.605205 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:22:07 crc kubenswrapper[4973]: I0320 13:22:07.605527 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:07 crc kubenswrapper[4973]: I0320 13:22:07.607058 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:07 crc kubenswrapper[4973]: I0320 13:22:07.607132 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:07 crc kubenswrapper[4973]: I0320 13:22:07.607153 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:07 crc kubenswrapper[4973]: I0320 13:22:07.609584 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 13:22:07 crc kubenswrapper[4973]: I0320 13:22:07.889368 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:08 crc kubenswrapper[4973]: I0320 13:22:08.153407 4973 kubelet_node_status.go:401] "Setting 
node annotation to enable volume controller attach/detach" Mar 20 13:22:08 crc kubenswrapper[4973]: I0320 13:22:08.154217 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:08 crc kubenswrapper[4973]: I0320 13:22:08.154239 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:08 crc kubenswrapper[4973]: I0320 13:22:08.154249 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:08 crc kubenswrapper[4973]: I0320 13:22:08.573176 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:08 crc kubenswrapper[4973]: I0320 13:22:08.574583 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:08 crc kubenswrapper[4973]: I0320 13:22:08.574663 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:08 crc kubenswrapper[4973]: I0320 13:22:08.574680 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:08 crc kubenswrapper[4973]: I0320 13:22:08.574746 4973 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:22:08 crc kubenswrapper[4973]: E0320 13:22:08.580244 4973 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:22:08 crc kubenswrapper[4973]: E0320 13:22:08.580513 4973 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 
20 13:22:08 crc kubenswrapper[4973]: I0320 13:22:08.888542 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:09 crc kubenswrapper[4973]: I0320 13:22:09.892852 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:10 crc kubenswrapper[4973]: E0320 13:22:10.026445 4973 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:22:10 crc kubenswrapper[4973]: I0320 13:22:10.891029 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:11 crc kubenswrapper[4973]: I0320 13:22:11.202689 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 13:22:11 crc kubenswrapper[4973]: I0320 13:22:11.202804 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:11 crc kubenswrapper[4973]: I0320 13:22:11.203700 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:11 crc kubenswrapper[4973]: I0320 13:22:11.203733 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:11 crc kubenswrapper[4973]: I0320 13:22:11.203747 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:11 crc 
kubenswrapper[4973]: I0320 13:22:11.889625 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:12 crc kubenswrapper[4973]: I0320 13:22:12.891997 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:13 crc kubenswrapper[4973]: I0320 13:22:13.887632 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:14 crc kubenswrapper[4973]: I0320 13:22:14.889505 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:15 crc kubenswrapper[4973]: I0320 13:22:15.580331 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:15 crc kubenswrapper[4973]: I0320 13:22:15.581475 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:15 crc kubenswrapper[4973]: I0320 13:22:15.581513 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:15 crc kubenswrapper[4973]: I0320 13:22:15.581523 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:15 crc kubenswrapper[4973]: I0320 13:22:15.581549 4973 kubelet_node_status.go:76] "Attempting to register node" 
node="crc" Mar 20 13:22:15 crc kubenswrapper[4973]: E0320 13:22:15.586693 4973 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:22:15 crc kubenswrapper[4973]: E0320 13:22:15.586951 4973 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 13:22:15 crc kubenswrapper[4973]: I0320 13:22:15.889888 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:16 crc kubenswrapper[4973]: I0320 13:22:16.892878 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:16 crc kubenswrapper[4973]: I0320 13:22:16.949991 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:16 crc kubenswrapper[4973]: I0320 13:22:16.951469 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:16 crc kubenswrapper[4973]: I0320 13:22:16.951543 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:16 crc kubenswrapper[4973]: I0320 13:22:16.951562 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:16 crc kubenswrapper[4973]: I0320 13:22:16.952451 4973 scope.go:117] 
"RemoveContainer" containerID="a2f552b111c3da99bfadfc49209ff0066f1d72de80fc8203eb1b9cb96434caa7" Mar 20 13:22:16 crc kubenswrapper[4973]: E0320 13:22:16.952736 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:22:17 crc kubenswrapper[4973]: I0320 13:22:17.892243 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:18 crc kubenswrapper[4973]: I0320 13:22:18.893064 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:19 crc kubenswrapper[4973]: I0320 13:22:19.895110 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:20 crc kubenswrapper[4973]: E0320 13:22:20.026635 4973 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:22:20 crc kubenswrapper[4973]: I0320 13:22:20.894436 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:21 crc 
kubenswrapper[4973]: I0320 13:22:21.898280 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:22 crc kubenswrapper[4973]: I0320 13:22:22.587221 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:22 crc kubenswrapper[4973]: I0320 13:22:22.588536 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:22 crc kubenswrapper[4973]: I0320 13:22:22.588571 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:22 crc kubenswrapper[4973]: I0320 13:22:22.588579 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:22 crc kubenswrapper[4973]: I0320 13:22:22.588600 4973 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:22:22 crc kubenswrapper[4973]: E0320 13:22:22.591508 4973 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 13:22:22 crc kubenswrapper[4973]: E0320 13:22:22.591845 4973 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 13:22:22 crc kubenswrapper[4973]: I0320 13:22:22.889906 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster 
scope Mar 20 13:22:23 crc kubenswrapper[4973]: I0320 13:22:23.889135 4973 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 13:22:24 crc kubenswrapper[4973]: I0320 13:22:24.370757 4973 csr.go:261] certificate signing request csr-bjswr is approved, waiting to be issued Mar 20 13:22:24 crc kubenswrapper[4973]: I0320 13:22:24.379722 4973 csr.go:257] certificate signing request csr-bjswr is issued Mar 20 13:22:24 crc kubenswrapper[4973]: I0320 13:22:24.434359 4973 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 13:22:24 crc kubenswrapper[4973]: I0320 13:22:24.736774 4973 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 20 13:22:25 crc kubenswrapper[4973]: I0320 13:22:25.381696 4973 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-15 15:44:05.12680154 +0000 UTC Mar 20 13:22:25 crc kubenswrapper[4973]: I0320 13:22:25.381756 4973 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7226h21m39.745048299s for next certificate rotation Mar 20 13:22:27 crc kubenswrapper[4973]: I0320 13:22:27.949609 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:27 crc kubenswrapper[4973]: I0320 13:22:27.950873 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:27 crc kubenswrapper[4973]: I0320 13:22:27.950898 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:27 crc kubenswrapper[4973]: I0320 13:22:27.950907 4973 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:27 crc kubenswrapper[4973]: I0320 13:22:27.951555 4973 scope.go:117] "RemoveContainer" containerID="a2f552b111c3da99bfadfc49209ff0066f1d72de80fc8203eb1b9cb96434caa7" Mar 20 13:22:28 crc kubenswrapper[4973]: I0320 13:22:28.244235 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 13:22:28 crc kubenswrapper[4973]: I0320 13:22:28.246058 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365"} Mar 20 13:22:28 crc kubenswrapper[4973]: I0320 13:22:28.246194 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:28 crc kubenswrapper[4973]: I0320 13:22:28.247188 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:28 crc kubenswrapper[4973]: I0320 13:22:28.247225 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:28 crc kubenswrapper[4973]: I0320 13:22:28.247235 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.250133 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.250660 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 
13:22:29.252626 4973 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365" exitCode=255 Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.252699 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365"} Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.252765 4973 scope.go:117] "RemoveContainer" containerID="a2f552b111c3da99bfadfc49209ff0066f1d72de80fc8203eb1b9cb96434caa7" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.253022 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.255818 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.255844 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.255852 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.256409 4973 scope.go:117] "RemoveContainer" containerID="923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365" Mar 20 13:22:29 crc kubenswrapper[4973]: E0320 13:22:29.256559 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.591795 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.593191 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.593255 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.593275 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.593472 4973 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.605285 4973 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.605807 4973 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 20 13:22:29 crc kubenswrapper[4973]: E0320 13:22:29.606412 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.610676 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.610739 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.610767 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.610799 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.610824 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:29Z","lastTransitionTime":"2026-03-20T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:29 crc kubenswrapper[4973]: E0320 13:22:29.634879 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.643236 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.643276 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.643286 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.643302 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.643312 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:29Z","lastTransitionTime":"2026-03-20T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:29 crc kubenswrapper[4973]: E0320 13:22:29.655475 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.663623 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.663667 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.663679 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.663694 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.663704 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:29Z","lastTransitionTime":"2026-03-20T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:29 crc kubenswrapper[4973]: E0320 13:22:29.673846 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.683730 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.683783 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.683800 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.683826 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:29 crc kubenswrapper[4973]: I0320 13:22:29.683842 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:29Z","lastTransitionTime":"2026-03-20T13:22:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:29 crc kubenswrapper[4973]: E0320 13:22:29.700168 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:29 crc kubenswrapper[4973]: E0320 13:22:29.700422 4973 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:22:29 crc kubenswrapper[4973]: E0320 13:22:29.700462 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:29 crc kubenswrapper[4973]: E0320 13:22:29.800884 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:29 crc kubenswrapper[4973]: E0320 13:22:29.901591 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:30 crc kubenswrapper[4973]: E0320 13:22:30.002296 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:30 crc kubenswrapper[4973]: E0320 13:22:30.027381 4973 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:22:30 crc kubenswrapper[4973]: I0320 13:22:30.039534 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:22:30 crc kubenswrapper[4973]: E0320 13:22:30.103093 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:30 crc kubenswrapper[4973]: E0320 13:22:30.204139 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:30 crc kubenswrapper[4973]: I0320 13:22:30.256330 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:22:30 crc kubenswrapper[4973]: I0320 13:22:30.257945 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:30 crc kubenswrapper[4973]: I0320 13:22:30.258631 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:30 crc kubenswrapper[4973]: I0320 13:22:30.258671 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:30 crc kubenswrapper[4973]: I0320 13:22:30.258680 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:30 crc kubenswrapper[4973]: I0320 13:22:30.259268 4973 scope.go:117] "RemoveContainer" containerID="923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365" Mar 20 13:22:30 crc kubenswrapper[4973]: E0320 13:22:30.259463 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:22:30 crc kubenswrapper[4973]: E0320 13:22:30.304537 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:30 crc kubenswrapper[4973]: E0320 13:22:30.405452 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:30 crc kubenswrapper[4973]: E0320 13:22:30.506365 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:30 crc 
kubenswrapper[4973]: E0320 13:22:30.607481 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:30 crc kubenswrapper[4973]: E0320 13:22:30.708491 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:30 crc kubenswrapper[4973]: E0320 13:22:30.809401 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:30 crc kubenswrapper[4973]: E0320 13:22:30.909872 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:31 crc kubenswrapper[4973]: E0320 13:22:31.010443 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:31 crc kubenswrapper[4973]: E0320 13:22:31.110977 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:31 crc kubenswrapper[4973]: E0320 13:22:31.211983 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:31 crc kubenswrapper[4973]: E0320 13:22:31.312401 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:31 crc kubenswrapper[4973]: E0320 13:22:31.413352 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:31 crc kubenswrapper[4973]: E0320 13:22:31.513454 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:31 crc kubenswrapper[4973]: E0320 13:22:31.614048 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:31 crc kubenswrapper[4973]: E0320 13:22:31.715081 4973 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 20 13:22:31 crc kubenswrapper[4973]: E0320 13:22:31.816025 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:31 crc kubenswrapper[4973]: E0320 13:22:31.916308 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:32 crc kubenswrapper[4973]: E0320 13:22:32.016683 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:32 crc kubenswrapper[4973]: E0320 13:22:32.117679 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:32 crc kubenswrapper[4973]: E0320 13:22:32.218848 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:32 crc kubenswrapper[4973]: E0320 13:22:32.319576 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:32 crc kubenswrapper[4973]: E0320 13:22:32.420177 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:32 crc kubenswrapper[4973]: E0320 13:22:32.521008 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:32 crc kubenswrapper[4973]: E0320 13:22:32.621588 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:32 crc kubenswrapper[4973]: E0320 13:22:32.722362 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:32 crc kubenswrapper[4973]: E0320 13:22:32.822892 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:32 crc kubenswrapper[4973]: E0320 13:22:32.923197 4973 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:33 crc kubenswrapper[4973]: E0320 13:22:33.024176 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:33 crc kubenswrapper[4973]: E0320 13:22:33.124318 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:33 crc kubenswrapper[4973]: E0320 13:22:33.225290 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:33 crc kubenswrapper[4973]: E0320 13:22:33.326428 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:33 crc kubenswrapper[4973]: E0320 13:22:33.427468 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:33 crc kubenswrapper[4973]: E0320 13:22:33.528280 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:33 crc kubenswrapper[4973]: E0320 13:22:33.629163 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:33 crc kubenswrapper[4973]: E0320 13:22:33.730255 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:33 crc kubenswrapper[4973]: E0320 13:22:33.831462 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:33 crc kubenswrapper[4973]: E0320 13:22:33.932098 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:34 crc kubenswrapper[4973]: E0320 13:22:34.032837 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:34 crc kubenswrapper[4973]: E0320 
13:22:34.133802 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:34 crc kubenswrapper[4973]: E0320 13:22:34.234892 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:34 crc kubenswrapper[4973]: E0320 13:22:34.335446 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:34 crc kubenswrapper[4973]: E0320 13:22:34.436512 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:34 crc kubenswrapper[4973]: E0320 13:22:34.537378 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:34 crc kubenswrapper[4973]: E0320 13:22:34.638564 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:34 crc kubenswrapper[4973]: E0320 13:22:34.739194 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:34 crc kubenswrapper[4973]: E0320 13:22:34.839290 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:34 crc kubenswrapper[4973]: E0320 13:22:34.940107 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:35 crc kubenswrapper[4973]: E0320 13:22:35.040472 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:35 crc kubenswrapper[4973]: E0320 13:22:35.140727 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:35 crc kubenswrapper[4973]: E0320 13:22:35.241883 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 
13:22:35 crc kubenswrapper[4973]: I0320 13:22:35.302947 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:22:35 crc kubenswrapper[4973]: I0320 13:22:35.303159 4973 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 13:22:35 crc kubenswrapper[4973]: I0320 13:22:35.304473 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:35 crc kubenswrapper[4973]: I0320 13:22:35.304562 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:35 crc kubenswrapper[4973]: I0320 13:22:35.304588 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:35 crc kubenswrapper[4973]: I0320 13:22:35.305639 4973 scope.go:117] "RemoveContainer" containerID="923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365" Mar 20 13:22:35 crc kubenswrapper[4973]: E0320 13:22:35.305919 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:22:35 crc kubenswrapper[4973]: E0320 13:22:35.342553 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:35 crc kubenswrapper[4973]: E0320 13:22:35.443536 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:35 crc kubenswrapper[4973]: E0320 13:22:35.544313 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 20 13:22:35 crc kubenswrapper[4973]: E0320 13:22:35.645103 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:35 crc kubenswrapper[4973]: E0320 13:22:35.745648 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:35 crc kubenswrapper[4973]: E0320 13:22:35.845757 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:35 crc kubenswrapper[4973]: E0320 13:22:35.946586 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:36 crc kubenswrapper[4973]: E0320 13:22:36.046718 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:36 crc kubenswrapper[4973]: E0320 13:22:36.147674 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:36 crc kubenswrapper[4973]: E0320 13:22:36.248966 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:36 crc kubenswrapper[4973]: E0320 13:22:36.350378 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:36 crc kubenswrapper[4973]: E0320 13:22:36.451798 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:36 crc kubenswrapper[4973]: E0320 13:22:36.552226 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:36 crc kubenswrapper[4973]: E0320 13:22:36.652936 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:36 crc kubenswrapper[4973]: E0320 13:22:36.753822 4973 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 20 13:22:36 crc kubenswrapper[4973]: E0320 13:22:36.854118 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:36 crc kubenswrapper[4973]: E0320 13:22:36.954660 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:37 crc kubenswrapper[4973]: E0320 13:22:37.055503 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:37 crc kubenswrapper[4973]: E0320 13:22:37.156571 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:37 crc kubenswrapper[4973]: E0320 13:22:37.257459 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:37 crc kubenswrapper[4973]: E0320 13:22:37.357784 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:37 crc kubenswrapper[4973]: E0320 13:22:37.458201 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:37 crc kubenswrapper[4973]: E0320 13:22:37.558767 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:37 crc kubenswrapper[4973]: E0320 13:22:37.659954 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:37 crc kubenswrapper[4973]: E0320 13:22:37.760128 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:37 crc kubenswrapper[4973]: E0320 13:22:37.860487 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:37 crc kubenswrapper[4973]: E0320 13:22:37.960799 4973 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:38 crc kubenswrapper[4973]: E0320 13:22:38.061871 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:38 crc kubenswrapper[4973]: E0320 13:22:38.163100 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:38 crc kubenswrapper[4973]: E0320 13:22:38.264395 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:38 crc kubenswrapper[4973]: E0320 13:22:38.366072 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:38 crc kubenswrapper[4973]: E0320 13:22:38.466747 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:38 crc kubenswrapper[4973]: E0320 13:22:38.567566 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:38 crc kubenswrapper[4973]: E0320 13:22:38.667977 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:38 crc kubenswrapper[4973]: E0320 13:22:38.768551 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:38 crc kubenswrapper[4973]: E0320 13:22:38.869622 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:38 crc kubenswrapper[4973]: E0320 13:22:38.970679 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:39 crc kubenswrapper[4973]: E0320 13:22:39.071774 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:39 crc 
kubenswrapper[4973]: E0320 13:22:39.172497 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:39 crc kubenswrapper[4973]: E0320 13:22:39.272597 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:39 crc kubenswrapper[4973]: E0320 13:22:39.373043 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:39 crc kubenswrapper[4973]: E0320 13:22:39.474026 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:39 crc kubenswrapper[4973]: E0320 13:22:39.575107 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:39 crc kubenswrapper[4973]: E0320 13:22:39.675521 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:39 crc kubenswrapper[4973]: E0320 13:22:39.775963 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:39 crc kubenswrapper[4973]: E0320 13:22:39.876734 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:39 crc kubenswrapper[4973]: E0320 13:22:39.977150 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:39 crc kubenswrapper[4973]: E0320 13:22:39.994673 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 13:22:39 crc kubenswrapper[4973]: I0320 13:22:39.999833 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:40 crc kubenswrapper[4973]: I0320 13:22:39.999886 4973 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:40 crc kubenswrapper[4973]: I0320 13:22:39.999898 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:40 crc kubenswrapper[4973]: I0320 13:22:39.999916 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:40 crc kubenswrapper[4973]: I0320 13:22:39.999928 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:39Z","lastTransitionTime":"2026-03-20T13:22:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:40 crc kubenswrapper[4973]: E0320 13:22:40.008737 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:39Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:40 crc kubenswrapper[4973]: I0320 13:22:40.015439 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:40 crc kubenswrapper[4973]: I0320 13:22:40.015472 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:40 crc kubenswrapper[4973]: I0320 13:22:40.015482 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:40 crc kubenswrapper[4973]: I0320 13:22:40.015499 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:40 crc kubenswrapper[4973]: I0320 13:22:40.015511 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:40Z","lastTransitionTime":"2026-03-20T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:40 crc kubenswrapper[4973]: E0320 13:22:40.024747 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:40 crc kubenswrapper[4973]: E0320 13:22:40.027630 4973 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 13:22:40 crc kubenswrapper[4973]: I0320 13:22:40.030557 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:40 crc kubenswrapper[4973]: I0320 13:22:40.030662 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:40 crc kubenswrapper[4973]: I0320 13:22:40.030721 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:40 crc kubenswrapper[4973]: I0320 13:22:40.030781 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:40 crc kubenswrapper[4973]: I0320 13:22:40.030838 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:40Z","lastTransitionTime":"2026-03-20T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:40 crc kubenswrapper[4973]: E0320 13:22:40.043749 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:40 crc kubenswrapper[4973]: I0320 13:22:40.050986 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:40 crc kubenswrapper[4973]: I0320 13:22:40.051024 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:40 crc kubenswrapper[4973]: I0320 13:22:40.051035 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:40 crc kubenswrapper[4973]: I0320 13:22:40.051050 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:40 crc kubenswrapper[4973]: I0320 13:22:40.051090 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:40Z","lastTransitionTime":"2026-03-20T13:22:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:40 crc kubenswrapper[4973]: E0320 13:22:40.060308 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:40 crc kubenswrapper[4973]: E0320 13:22:40.060427 4973 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:22:40 crc kubenswrapper[4973]: E0320 13:22:40.077416 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:40 crc kubenswrapper[4973]: E0320 13:22:40.178199 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:40 crc kubenswrapper[4973]: E0320 13:22:40.278456 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:40 crc kubenswrapper[4973]: E0320 13:22:40.379078 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:40 crc kubenswrapper[4973]: E0320 13:22:40.479424 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:40 crc kubenswrapper[4973]: E0320 13:22:40.580568 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:40 crc kubenswrapper[4973]: E0320 13:22:40.681593 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:40 crc kubenswrapper[4973]: E0320 13:22:40.782635 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:40 crc kubenswrapper[4973]: E0320 13:22:40.883177 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:40 crc kubenswrapper[4973]: E0320 13:22:40.983929 4973 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:41 crc kubenswrapper[4973]: E0320 13:22:41.084222 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:41 crc kubenswrapper[4973]: E0320 13:22:41.185147 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:41 crc kubenswrapper[4973]: E0320 13:22:41.285551 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:41 crc kubenswrapper[4973]: E0320 13:22:41.386422 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:41 crc kubenswrapper[4973]: E0320 13:22:41.487228 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:41 crc kubenswrapper[4973]: E0320 13:22:41.588100 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:41 crc kubenswrapper[4973]: E0320 13:22:41.689296 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:41 crc kubenswrapper[4973]: E0320 13:22:41.789996 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:41 crc kubenswrapper[4973]: E0320 13:22:41.891380 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:41 crc kubenswrapper[4973]: E0320 13:22:41.991695 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:42 crc kubenswrapper[4973]: E0320 13:22:42.091757 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:42 crc 
kubenswrapper[4973]: E0320 13:22:42.192527 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:42 crc kubenswrapper[4973]: E0320 13:22:42.293830 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:42 crc kubenswrapper[4973]: E0320 13:22:42.394967 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:42 crc kubenswrapper[4973]: E0320 13:22:42.496797 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:42 crc kubenswrapper[4973]: E0320 13:22:42.597482 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:42 crc kubenswrapper[4973]: E0320 13:22:42.698326 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:42 crc kubenswrapper[4973]: E0320 13:22:42.799581 4973 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.821982 4973 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.903589 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.904168 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.904463 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.904724 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.904946 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:42Z","lastTransitionTime":"2026-03-20T13:22:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.920049 4973 apiserver.go:52] "Watching apiserver" Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.928706 4973 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.930915 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb","openshift-ovn-kubernetes/ovnkube-node-jllfx","openshift-dns/node-resolver-7qcsb","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-multus/network-metrics-daemon-7kszd","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-multus/multus-57hnn","openshift-image-registry/node-ca-qqncz","openshift-machine-config-operator/machine-config-daemon-qlztx","openshift-network-node-identity/network-node-identity-vrzqb","openshift-multus/multus-additional-cni-plugins-tmj8d","openshift-network-diagnostics/network-check-target-xd92c"] Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.932812 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.933157 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.933258 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-qqncz" Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.933306 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.933320 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.933404 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.933408 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.933527 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.933619 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-7qcsb" Mar 20 13:22:42 crc kubenswrapper[4973]: E0320 13:22:42.933828 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 13:22:42 crc kubenswrapper[4973]: E0320 13:22:42.934686 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 13:22:42 crc kubenswrapper[4973]: E0320 13:22:42.935210 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 13:22:42 crc kubenswrapper[4973]: E0320 13:22:42.935307 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.935318 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-57hnn"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.935217 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.935434 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.935938 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tmj8d"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.938504 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.944871 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.944870 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.945190 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.945396 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.945968 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.946461 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.946840 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.947285 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.947538 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.947870 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.948070 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.948322 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.948660 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.948824 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.948940 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.949282 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.949627 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.949786 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.948877 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.950282 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.950081 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.950467 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.950601 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.950618 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.950677 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.951057 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.951150 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.951230 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.951238 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.951269 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.951303 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.951370 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.951379 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.951585 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.951813 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.951857 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.952022 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.973430 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.983395 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.992448 4973 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 13:22:42 crc kubenswrapper[4973]: I0320 13:22:42.995069 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.003326 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.007928 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.008199 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.008393 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.008547 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.009177 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:43Z","lastTransitionTime":"2026-03-20T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.011918 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.020781 4973 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.030423 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.042361 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052035 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052103 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052140 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052174 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052207 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052237 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052266 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052297 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052321 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052372 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052400 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052429 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052458 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052486 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052516 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052548 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052578 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052611 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052636 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052649 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052665 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052712 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052732 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052742 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052805 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052842 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052877 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052905 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.052994 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053012 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053030 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053051 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053059 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053112 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053146 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053175 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053204 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053305 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053359 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053436 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053468 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053496 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053580 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 13:22:43 crc 
kubenswrapper[4973]: I0320 13:22:43.053615 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053882 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053918 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053951 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053985 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054017 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054050 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054078 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054110 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054142 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054172 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054201 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054233 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054263 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054298 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054364 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: 
\"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054397 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054424 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054459 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054492 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054519 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054544 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054571 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054610 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054639 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054665 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054783 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054819 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054853 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054883 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054912 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054939 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054988 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055023 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055050 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055078 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055103 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055131 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055166 4973 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055197 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055230 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055256 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055285 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055314 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") 
pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055370 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055403 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055434 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055464 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055492 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055520 4973 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055547 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055574 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055602 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055633 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055666 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod 
\"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055698 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055730 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.062703 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053068 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.063762 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053289 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.053378 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054106 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054264 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054746 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.054968 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055098 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055107 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055128 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055129 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055555 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055555 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055588 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055734 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.063883 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.055758 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:22:43.555731355 +0000 UTC m=+84.299401109 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055846 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.064002 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.064084 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.064120 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:22:43 
crc kubenswrapper[4973]: I0320 13:22:43.064147 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.064171 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.064196 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.064218 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.064189 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.064247 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.064271 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.064295 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.064358 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.064382 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.064403 4973 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.064469 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.064495 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.064768 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.064845 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.064926 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.065234 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.065192 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.065287 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.065308 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.065328 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.065374 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.065392 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.065965 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.065417 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.066380 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.066434 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.066540 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.066653 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.066795 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.066831 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.055890 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.056031 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.056391 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.056706 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.057044 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.057052 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.057123 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.057171 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.057179 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.057560 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.057911 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.058122 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.058493 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.058543 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.058992 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.059014 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.059080 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.059500 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.059593 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.059799 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.059796 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.059666 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.060258 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.060287 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.060655 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.060699 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.060849 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.060900 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.061315 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.061835 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.061986 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.062064 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.062305 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.062587 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.062622 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.062945 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.062951 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.062983 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.063138 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.063609 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.063637 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.063678 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.067519 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.067793 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.068117 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.068168 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.068771 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069321 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069561 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069615 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069649 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069674 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069696 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069712 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069735 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069778 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069800 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069821 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069843 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069864 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069881 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069900 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069923 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069944 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069961 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069982 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070001 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.069978 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070023 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070054 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070076 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070092 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070111 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070137 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070157 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070174 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070195 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070214 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070260 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070281 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070300 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070317 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070348 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070367 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070388 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070406 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070425 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070444 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070465 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070486 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070506 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070524 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070554 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070573 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume
started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070596 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070615 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070829 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070886 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070906 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 13:22:43 
crc kubenswrapper[4973]: I0320 13:22:43.070929 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070952 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070971 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.070990 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071009 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071044 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071074 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071103 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071135 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071160 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071184 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 13:22:43 crc 
kubenswrapper[4973]: I0320 13:22:43.071205 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071227 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071244 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071264 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071285 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071303 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071323 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071412 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071600 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071623 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071646 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071667 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071689 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071742 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071766 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071856 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071876 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071897 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.071918 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.072043 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-cnibin\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.072093 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.072135 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.072160 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-run-systemd\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.072178 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-multus-socket-dir-parent\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.072164 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.072203 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2747a19a-a33a-458e-bc5d-bda5c13a2bf1-hosts-file\") pod \"node-resolver-7qcsb\" (UID: \"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\") " pod="openshift-dns/node-resolver-7qcsb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.072233 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48dqg\" (UniqueName: \"kubernetes.io/projected/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-kube-api-access-48dqg\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.072254 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-etc-openvswitch\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.072271 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-host-run-multus-certs\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.072293 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wxk4\" (UniqueName: \"kubernetes.io/projected/70745a45-4eff-4e56-b9ab-efa4a7c83306-kube-api-access-9wxk4\") pod 
\"machine-config-daemon-qlztx\" (UID: \"70745a45-4eff-4e56-b9ab-efa4a7c83306\") " pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.072314 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.072333 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-var-lib-openvswitch\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073176 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-cni-netd\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073195 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-hostroot\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073215 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/35802646-2926-42b8-913a-986001818f97-multus-daemon-config\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073239 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-os-release\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073259 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-slash\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073276 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-run-openvswitch\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073332 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-system-cni-dir\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073469 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-cnibin\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073489 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/35802646-2926-42b8-913a-986001818f97-cni-binary-copy\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073513 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-multus-conf-dir\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073538 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073561 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-cni-bin\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073586 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073607 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x94sv\" (UniqueName: \"kubernetes.io/projected/93c5ad90-87bf-4668-9d87-34e676b15783-kube-api-access-x94sv\") pod \"network-metrics-daemon-7kszd\" (UID: \"93c5ad90-87bf-4668-9d87-34e676b15783\") " pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073625 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-etc-kubernetes\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073659 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9hbz\" (UniqueName: \"kubernetes.io/projected/35802646-2926-42b8-913a-986001818f97-kube-api-access-h9hbz\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073682 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073701 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073725 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6e2d3006-c203-45e9-875b-8b8210a85409-serviceca\") pod \"node-ca-qqncz\" (UID: \"6e2d3006-c203-45e9-875b-8b8210a85409\") " pod="openshift-image-registry/node-ca-qqncz" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073783 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-run-ovn-kubernetes\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073802 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-multus-cni-dir\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073822 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b606d037-c146-4c4b-985a-8cea73f83da5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sp2rb\" (UID: \"b606d037-c146-4c4b-985a-8cea73f83da5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073843 4973 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-host-var-lib-cni-multus\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073863 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-cni-binary-copy\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073883 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073904 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-log-socket\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073923 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 
13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073944 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073964 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-run-ovn\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073984 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074052 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074078 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/b606d037-c146-4c4b-985a-8cea73f83da5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sp2rb\" (UID: \"b606d037-c146-4c4b-985a-8cea73f83da5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074102 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074143 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e2d3006-c203-45e9-875b-8b8210a85409-host\") pod \"node-ca-qqncz\" (UID: \"6e2d3006-c203-45e9-875b-8b8210a85409\") " pod="openshift-image-registry/node-ca-qqncz" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074173 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-kubelet\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074191 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-run-netns\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074215 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnk94\" (UniqueName: \"kubernetes.io/projected/6e2d3006-c203-45e9-875b-8b8210a85409-kube-api-access-gnk94\") pod \"node-ca-qqncz\" (UID: \"6e2d3006-c203-45e9-875b-8b8210a85409\") " pod="openshift-image-registry/node-ca-qqncz" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074234 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sgs6\" (UniqueName: \"kubernetes.io/projected/2747a19a-a33a-458e-bc5d-bda5c13a2bf1-kube-api-access-7sgs6\") pod \"node-resolver-7qcsb\" (UID: \"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\") " pod="openshift-dns/node-resolver-7qcsb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074255 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/774edfed-7d45-4b69-b9d7-a3a914cbca04-ovn-node-metrics-cert\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074274 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs\") pod \"network-metrics-daemon-7kszd\" (UID: \"93c5ad90-87bf-4668-9d87-34e676b15783\") " pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074293 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-host-run-k8s-cni-cncf-io\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 
13:22:43.074313 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074332 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws7x5\" (UniqueName: \"kubernetes.io/projected/b606d037-c146-4c4b-985a-8cea73f83da5-kube-api-access-ws7x5\") pod \"ovnkube-control-plane-749d76644c-sp2rb\" (UID: \"b606d037-c146-4c4b-985a-8cea73f83da5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074367 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-os-release\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074382 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/774edfed-7d45-4b69-b9d7-a3a914cbca04-env-overrides\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074400 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/774edfed-7d45-4b69-b9d7-a3a914cbca04-ovnkube-script-lib\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074425 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074444 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-host-run-netns\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074463 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/70745a45-4eff-4e56-b9ab-efa4a7c83306-mcd-auth-proxy-config\") pod \"machine-config-daemon-qlztx\" (UID: \"70745a45-4eff-4e56-b9ab-efa4a7c83306\") " pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074482 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-host-var-lib-kubelet\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074500 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/70745a45-4eff-4e56-b9ab-efa4a7c83306-rootfs\") pod 
\"machine-config-daemon-qlztx\" (UID: \"70745a45-4eff-4e56-b9ab-efa4a7c83306\") " pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074518 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-node-log\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074540 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/774edfed-7d45-4b69-b9d7-a3a914cbca04-ovnkube-config\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074556 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lx76c\" (UniqueName: \"kubernetes.io/projected/774edfed-7d45-4b69-b9d7-a3a914cbca04-kube-api-access-lx76c\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074576 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b606d037-c146-4c4b-985a-8cea73f83da5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sp2rb\" (UID: \"b606d037-c146-4c4b-985a-8cea73f83da5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074595 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-host-var-lib-cni-bin\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074615 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074632 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-system-cni-dir\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074656 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.076312 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70745a45-4eff-4e56-b9ab-efa4a7c83306-proxy-tls\") pod \"machine-config-daemon-qlztx\" (UID: \"70745a45-4eff-4e56-b9ab-efa4a7c83306\") " pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.076408 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-systemd-units\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.079402 4973 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.081459 4973 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.081493 4973 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.081516 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.081529 4973 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.081541 4973 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.081553 4973 
reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.081570 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.081590 4973 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086632 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086657 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086680 4973 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086693 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086705 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086715 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086730 4973 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086741 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086751 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086764 4973 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086774 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086783 4973 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node 
\"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086793 4973 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086806 4973 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086815 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086825 4973 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086835 4973 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086848 4973 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086858 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086869 4973 
reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086879 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086892 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086903 4973 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086922 4973 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086937 4973 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086947 4973 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086957 4973 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086971 4973 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086983 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086995 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087004 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087014 4973 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087026 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087035 4973 reconciler_common.go:293] "Volume detached 
for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087045 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087013 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087057 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087264 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087292 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" 
(UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087314 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087401 4973 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087423 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087441 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087459 4973 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087484 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087501 4973 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087521 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087541 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087565 4973 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087586 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087605 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087623 4973 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087648 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087667 4973 
reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087683 4973 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087707 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087727 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087746 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.085088 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087768 4973 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node 
\"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087793 4973 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.072181 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.072553 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.072600 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.072784 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.073316 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074085 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074151 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074596 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.074865 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.075588 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.075822 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.075886 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.076017 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.076058 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.076176 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.083018 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087813 4973 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.088038 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.088053 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.076814 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.088039 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.088070 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.088106 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.082511 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.088135 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.088152 4973 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc 
kubenswrapper[4973]: I0320 13:22:43.088169 4973 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.077163 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.077190 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.078132 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.078359 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.078684 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.078700 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.078706 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.078757 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.088251 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.088305 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.079178 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.088376 4973 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.088414 4973 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.088458 4973 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.079193 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.078955 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.088491 4973 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.088529 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.088568 4973 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.088615 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.088649 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.088680 4973 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.088718 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc 
kubenswrapper[4973]: I0320 13:22:43.079366 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.079600 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.079644 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.079734 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.079794 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.079961 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.078935 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.080100 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.080242 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.080301 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.080801 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.080988 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.080990 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.080696 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.081022 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.081147 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.081290 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.081322 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.081225 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.081489 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.081495 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.081601 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.080027 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.081672 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.081873 4973 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.081182 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.089310 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:43.589273571 +0000 UTC m=+84.332943375 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.082118 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.083432 4973 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.089401 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:43.589387244 +0000 UTC m=+84.333056998 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086399 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.086688 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087246 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087390 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087536 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087705 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087773 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087810 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.087988 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.090788 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.092162 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.094331 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.094873 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.097983 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.098284 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.102184 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.102238 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.102266 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.102748 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.102777 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.103033 4973 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.103133 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:43.603111285 +0000 UTC m=+84.346781109 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.103217 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.103602 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.105465 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.109493 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.109530 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.109546 4973 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.109610 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl 
podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:43.60959136 +0000 UTC m=+84.353261114 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.109812 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.110170 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.111730 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.112080 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.112114 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.112127 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.112143 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.112155 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:43Z","lastTransitionTime":"2026-03-20T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.113429 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.116435 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.116768 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.116875 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.117110 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.117260 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.117420 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.117554 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.117693 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.117857 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.117946 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.118384 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.118929 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.119008 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.119541 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.120695 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.124525 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.125533 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.125698 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.125743 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.125811 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.125827 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.125880 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.125986 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.126133 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.126288 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.126870 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.126912 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.126978 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.127129 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.127492 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.127711 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.127985 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.135532 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.136191 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.144794 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.144862 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.149070 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.154820 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189050 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/774edfed-7d45-4b69-b9d7-a3a914cbca04-ovnkube-script-lib\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189112 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-host-run-netns\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189140 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/70745a45-4eff-4e56-b9ab-efa4a7c83306-mcd-auth-proxy-config\") pod \"machine-config-daemon-qlztx\" (UID: \"70745a45-4eff-4e56-b9ab-efa4a7c83306\") " pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189161 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/774edfed-7d45-4b69-b9d7-a3a914cbca04-env-overrides\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189181 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/774edfed-7d45-4b69-b9d7-a3a914cbca04-ovnkube-config\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189199 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lx76c\" (UniqueName: \"kubernetes.io/projected/774edfed-7d45-4b69-b9d7-a3a914cbca04-kube-api-access-lx76c\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189218 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b606d037-c146-4c4b-985a-8cea73f83da5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sp2rb\" (UID: \"b606d037-c146-4c4b-985a-8cea73f83da5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189237 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-host-var-lib-cni-bin\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189257 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-host-var-lib-kubelet\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189279 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/70745a45-4eff-4e56-b9ab-efa4a7c83306-rootfs\") pod \"machine-config-daemon-qlztx\" (UID: \"70745a45-4eff-4e56-b9ab-efa4a7c83306\") " pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189273 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-host-run-netns\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189371 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-node-log\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189312 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-node-log\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189444 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-host-var-lib-cni-bin\") pod \"multus-57hnn\" (UID: 
\"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189499 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-host-var-lib-kubelet\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189540 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189571 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-system-cni-dir\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189590 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70745a45-4eff-4e56-b9ab-efa4a7c83306-proxy-tls\") pod \"machine-config-daemon-qlztx\" (UID: \"70745a45-4eff-4e56-b9ab-efa4a7c83306\") " pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189613 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-systemd-units\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189642 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-run-systemd\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189657 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-multus-socket-dir-parent\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189672 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-cnibin\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189689 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48dqg\" (UniqueName: \"kubernetes.io/projected/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-kube-api-access-48dqg\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189705 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-etc-openvswitch\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189720 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2747a19a-a33a-458e-bc5d-bda5c13a2bf1-hosts-file\") pod \"node-resolver-7qcsb\" (UID: \"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\") " pod="openshift-dns/node-resolver-7qcsb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189735 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189750 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-var-lib-openvswitch\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189766 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-cni-netd\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189777 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/70745a45-4eff-4e56-b9ab-efa4a7c83306-rootfs\") pod \"machine-config-daemon-qlztx\" (UID: \"70745a45-4eff-4e56-b9ab-efa4a7c83306\") " pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 
20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189781 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-hostroot\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189799 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-hostroot\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189815 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189821 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-host-run-multus-certs\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189847 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wxk4\" (UniqueName: \"kubernetes.io/projected/70745a45-4eff-4e56-b9ab-efa4a7c83306-kube-api-access-9wxk4\") pod \"machine-config-daemon-qlztx\" (UID: \"70745a45-4eff-4e56-b9ab-efa4a7c83306\") " pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189892 4973 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2747a19a-a33a-458e-bc5d-bda5c13a2bf1-hosts-file\") pod \"node-resolver-7qcsb\" (UID: \"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\") " pod="openshift-dns/node-resolver-7qcsb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189902 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-system-cni-dir\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189935 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-host-run-multus-certs\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.189996 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190017 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-var-lib-openvswitch\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190015 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-os-release\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190043 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-slash\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190058 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-run-openvswitch\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190074 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-system-cni-dir\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190072 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-os-release\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190103 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-run-openvswitch\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190112 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-system-cni-dir\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190130 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-systemd-units\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190134 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-cni-netd\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190143 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-run-systemd\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190167 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-slash\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") 
" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190175 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-multus-socket-dir-parent\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190196 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/35802646-2926-42b8-913a-986001818f97-multus-daemon-config\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190249 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b606d037-c146-4c4b-985a-8cea73f83da5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-sp2rb\" (UID: \"b606d037-c146-4c4b-985a-8cea73f83da5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190297 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-cnibin\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190613 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-etc-openvswitch\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 
crc kubenswrapper[4973]: I0320 13:22:43.190686 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/70745a45-4eff-4e56-b9ab-efa4a7c83306-mcd-auth-proxy-config\") pod \"machine-config-daemon-qlztx\" (UID: \"70745a45-4eff-4e56-b9ab-efa4a7c83306\") " pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190735 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-cni-bin\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190773 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x94sv\" (UniqueName: \"kubernetes.io/projected/93c5ad90-87bf-4668-9d87-34e676b15783-kube-api-access-x94sv\") pod \"network-metrics-daemon-7kszd\" (UID: \"93c5ad90-87bf-4668-9d87-34e676b15783\") " pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190803 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-cni-bin\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.190996 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-cnibin\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.191167 4973 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/35802646-2926-42b8-913a-986001818f97-cni-binary-copy\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.191194 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-multus-conf-dir\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.191220 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9hbz\" (UniqueName: \"kubernetes.io/projected/35802646-2926-42b8-913a-986001818f97-kube-api-access-h9hbz\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.191314 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-etc-kubernetes\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.191424 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6e2d3006-c203-45e9-875b-8b8210a85409-serviceca\") pod \"node-ca-qqncz\" (UID: \"6e2d3006-c203-45e9-875b-8b8210a85409\") " pod="openshift-image-registry/node-ca-qqncz" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.191493 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-multus-conf-dir\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.191532 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-run-ovn-kubernetes\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.191080 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-cnibin\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.191829 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-etc-kubernetes\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.191939 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/774edfed-7d45-4b69-b9d7-a3a914cbca04-ovnkube-config\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.191283 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/35802646-2926-42b8-913a-986001818f97-multus-daemon-config\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " 
pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.191495 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-run-ovn-kubernetes\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.192024 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-multus-cni-dir\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.192046 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-cni-binary-copy\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.192083 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.192098 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-log-socket\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.192125 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b606d037-c146-4c4b-985a-8cea73f83da5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sp2rb\" (UID: \"b606d037-c146-4c4b-985a-8cea73f83da5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.192168 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-host-var-lib-cni-multus\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.192190 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-run-ovn\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.192236 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.192248 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/35802646-2926-42b8-913a-986001818f97-cni-binary-copy\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " 
pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.192276 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-kubelet\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.192296 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-run-netns\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.192497 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/774edfed-7d45-4b69-b9d7-a3a914cbca04-env-overrides\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.192557 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-multus-cni-dir\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.192384 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b606d037-c146-4c4b-985a-8cea73f83da5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sp2rb\" (UID: \"b606d037-c146-4c4b-985a-8cea73f83da5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" Mar 20 13:22:43 
crc kubenswrapper[4973]: I0320 13:22:43.192981 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e2d3006-c203-45e9-875b-8b8210a85409-host\") pod \"node-ca-qqncz\" (UID: \"6e2d3006-c203-45e9-875b-8b8210a85409\") " pod="openshift-image-registry/node-ca-qqncz" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193036 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7sgs6\" (UniqueName: \"kubernetes.io/projected/2747a19a-a33a-458e-bc5d-bda5c13a2bf1-kube-api-access-7sgs6\") pod \"node-resolver-7qcsb\" (UID: \"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\") " pod="openshift-dns/node-resolver-7qcsb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193146 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnk94\" (UniqueName: \"kubernetes.io/projected/6e2d3006-c203-45e9-875b-8b8210a85409-kube-api-access-gnk94\") pod \"node-ca-qqncz\" (UID: \"6e2d3006-c203-45e9-875b-8b8210a85409\") " pod="openshift-image-registry/node-ca-qqncz" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193159 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-cni-binary-copy\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193236 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/774edfed-7d45-4b69-b9d7-a3a914cbca04-ovn-node-metrics-cert\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193262 4973 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs\") pod \"network-metrics-daemon-7kszd\" (UID: \"93c5ad90-87bf-4668-9d87-34e676b15783\") " pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193304 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-host-run-k8s-cni-cncf-io\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193367 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws7x5\" (UniqueName: \"kubernetes.io/projected/b606d037-c146-4c4b-985a-8cea73f83da5-kube-api-access-ws7x5\") pod \"ovnkube-control-plane-749d76644c-sp2rb\" (UID: \"b606d037-c146-4c4b-985a-8cea73f83da5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193417 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-os-release\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193464 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193605 4973 
reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193621 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193647 4973 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193659 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193727 4973 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193738 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193748 4973 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193756 4973 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193765 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193809 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193818 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193923 4973 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193937 4973 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193948 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193956 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193965 4973 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.193975 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194007 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194017 4973 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194026 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194035 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194044 4973 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc 
kubenswrapper[4973]: I0320 13:22:43.194053 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194082 4973 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194091 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194100 4973 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194110 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194119 4973 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194128 4973 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194157 4973 reconciler_common.go:293] "Volume detached for volume 
\"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194166 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194175 4973 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194183 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194192 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194187 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/774edfed-7d45-4b69-b9d7-a3a914cbca04-ovnkube-script-lib\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194202 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194251 4973 
reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194265 4973 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194277 4973 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194290 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194302 4973 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194317 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194330 4973 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194361 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194360 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/6e2d3006-c203-45e9-875b-8b8210a85409-serviceca\") pod \"node-ca-qqncz\" (UID: \"6e2d3006-c203-45e9-875b-8b8210a85409\") " pod="openshift-image-registry/node-ca-qqncz" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194415 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-log-socket\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194420 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/6e2d3006-c203-45e9-875b-8b8210a85409-host\") pod \"node-ca-qqncz\" (UID: \"6e2d3006-c203-45e9-875b-8b8210a85409\") " pod="openshift-image-registry/node-ca-qqncz" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194375 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194459 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194473 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194487 4973 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.194494 4973 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.194544 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs podName:93c5ad90-87bf-4668-9d87-34e676b15783 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:43.694525604 +0000 UTC m=+84.438195358 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs") pod "network-metrics-daemon-7kszd" (UID: "93c5ad90-87bf-4668-9d87-34e676b15783") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194702 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194742 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-host-run-k8s-cni-cncf-io\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 
13:22:43.194884 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b606d037-c146-4c4b-985a-8cea73f83da5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-sp2rb\" (UID: \"b606d037-c146-4c4b-985a-8cea73f83da5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194922 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-os-release\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194928 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/35802646-2926-42b8-913a-986001818f97-host-var-lib-cni-multus\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194952 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-kubelet\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194963 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-run-netns\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194977 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-run-ovn\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.194499 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.195551 4973 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.195566 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.195577 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.195639 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.195654 4973 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.195665 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.195675 4973 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.196196 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198006 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b606d037-c146-4c4b-985a-8cea73f83da5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-sp2rb\" (UID: \"b606d037-c146-4c4b-985a-8cea73f83da5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.197272 4973 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198088 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198104 4973 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198118 4973 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198131 4973 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198144 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198157 4973 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198168 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198180 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198192 4973 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc 
kubenswrapper[4973]: I0320 13:22:43.198203 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198215 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198228 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198239 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198251 4973 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198262 4973 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198274 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198286 4973 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198298 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198311 4973 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198323 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198353 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198366 4973 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198378 4973 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198398 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198410 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198422 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198433 4973 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198447 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198459 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198470 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198483 4973 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 
13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198495 4973 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198508 4973 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198519 4973 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198531 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198543 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198556 4973 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198569 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198581 4973 
reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198592 4973 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198603 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198616 4973 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198628 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198638 4973 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198650 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198663 4973 reconciler_common.go:293] "Volume detached for volume 
\"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198674 4973 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198685 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198697 4973 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198710 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.198721 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.199575 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70745a45-4eff-4e56-b9ab-efa4a7c83306-proxy-tls\") pod \"machine-config-daemon-qlztx\" (UID: \"70745a45-4eff-4e56-b9ab-efa4a7c83306\") " pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 
13:22:43.206223 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.208198 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnk94\" (UniqueName: \"kubernetes.io/projected/6e2d3006-c203-45e9-875b-8b8210a85409-kube-api-access-gnk94\") pod \"node-ca-qqncz\" (UID: \"6e2d3006-c203-45e9-875b-8b8210a85409\") " pod="openshift-image-registry/node-ca-qqncz" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.208609 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/774edfed-7d45-4b69-b9d7-a3a914cbca04-ovn-node-metrics-cert\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.209260 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x94sv\" (UniqueName: \"kubernetes.io/projected/93c5ad90-87bf-4668-9d87-34e676b15783-kube-api-access-x94sv\") pod \"network-metrics-daemon-7kszd\" (UID: \"93c5ad90-87bf-4668-9d87-34e676b15783\") " pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.210849 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9hbz\" (UniqueName: \"kubernetes.io/projected/35802646-2926-42b8-913a-986001818f97-kube-api-access-h9hbz\") pod \"multus-57hnn\" (UID: \"35802646-2926-42b8-913a-986001818f97\") " pod="openshift-multus/multus-57hnn" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.210995 4973 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-lx76c\" (UniqueName: \"kubernetes.io/projected/774edfed-7d45-4b69-b9d7-a3a914cbca04-kube-api-access-lx76c\") pod \"ovnkube-node-jllfx\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.213615 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wxk4\" (UniqueName: \"kubernetes.io/projected/70745a45-4eff-4e56-b9ab-efa4a7c83306-kube-api-access-9wxk4\") pod \"machine-config-daemon-qlztx\" (UID: \"70745a45-4eff-4e56-b9ab-efa4a7c83306\") " pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.214293 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.214331 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.214363 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.214385 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.214400 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:43Z","lastTransitionTime":"2026-03-20T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.214415 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sgs6\" (UniqueName: \"kubernetes.io/projected/2747a19a-a33a-458e-bc5d-bda5c13a2bf1-kube-api-access-7sgs6\") pod \"node-resolver-7qcsb\" (UID: \"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\") " pod="openshift-dns/node-resolver-7qcsb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.214922 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws7x5\" (UniqueName: \"kubernetes.io/projected/b606d037-c146-4c4b-985a-8cea73f83da5-kube-api-access-ws7x5\") pod \"ovnkube-control-plane-749d76644c-sp2rb\" (UID: \"b606d037-c146-4c4b-985a-8cea73f83da5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.215354 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48dqg\" (UniqueName: \"kubernetes.io/projected/b85f66bb-77a7-4c4c-8d36-6a94a52c90dd-kube-api-access-48dqg\") pod \"multus-additional-cni-plugins-tmj8d\" (UID: \"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\") " pod="openshift-multus/multus-additional-cni-plugins-tmj8d" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.267609 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.281173 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-qqncz" Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.284292 4973 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:22:43 crc kubenswrapper[4973]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 20 13:22:43 crc kubenswrapper[4973]: set -o allexport Mar 20 13:22:43 crc kubenswrapper[4973]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 20 13:22:43 crc kubenswrapper[4973]: source /etc/kubernetes/apiserver-url.env Mar 20 13:22:43 crc kubenswrapper[4973]: else Mar 20 13:22:43 crc kubenswrapper[4973]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 20 13:22:43 crc kubenswrapper[4973]: exit 1 Mar 20 13:22:43 crc kubenswrapper[4973]: fi Mar 20 13:22:43 crc kubenswrapper[4973]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 20 13:22:43 crc kubenswrapper[4973]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:22:43 crc kubenswrapper[4973]: > logger="UnhandledError" Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.285756 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.286973 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e12cf27a3b6dc28c4df85050b8f255e2f2c4d9a795806d78ecb34ae3d165c375"} Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.294171 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 13:22:43 crc kubenswrapper[4973]: W0320 13:22:43.297316 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e2d3006_c203_45e9_875b_8b8210a85409.slice/crio-65fa6859e2548fa0ac7c5fbce2d5d2db7a972a8b265306e2ed0bc76d75abc035 WatchSource:0}: Error finding container 65fa6859e2548fa0ac7c5fbce2d5d2db7a972a8b265306e2ed0bc76d75abc035: Status 404 returned error can't find the container with id 65fa6859e2548fa0ac7c5fbce2d5d2db7a972a8b265306e2ed0bc76d75abc035 Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.302534 4973 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:22:43 crc kubenswrapper[4973]: container &Container{Name:node-ca,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f,Command:[/bin/sh -c trap 'jobs -p | xargs -r kill; echo shutting down node-ca; exit 0' TERM Mar 20 13:22:43 crc kubenswrapper[4973]: while [ true ]; Mar 20 13:22:43 crc kubenswrapper[4973]: do Mar 20 13:22:43 crc kubenswrapper[4973]: for f in $(ls /tmp/serviceca); do Mar 20 13:22:43 crc kubenswrapper[4973]: echo $f Mar 20 13:22:43 crc kubenswrapper[4973]: ca_file_path="/tmp/serviceca/${f}" Mar 20 13:22:43 crc kubenswrapper[4973]: f=$(echo $f | sed -r 's/(.*)\.\./\1:/') Mar 20 13:22:43 crc kubenswrapper[4973]: reg_dir_path="/etc/docker/certs.d/${f}" Mar 20 13:22:43 crc kubenswrapper[4973]: if [ -e "${reg_dir_path}" ]; then Mar 20 13:22:43 crc kubenswrapper[4973]: cp -u $ca_file_path $reg_dir_path/ca.crt Mar 20 13:22:43 crc kubenswrapper[4973]: else Mar 20 13:22:43 crc 
kubenswrapper[4973]: mkdir $reg_dir_path Mar 20 13:22:43 crc kubenswrapper[4973]: cp $ca_file_path $reg_dir_path/ca.crt Mar 20 13:22:43 crc kubenswrapper[4973]: fi Mar 20 13:22:43 crc kubenswrapper[4973]: done Mar 20 13:22:43 crc kubenswrapper[4973]: for d in $(ls /etc/docker/certs.d); do Mar 20 13:22:43 crc kubenswrapper[4973]: echo $d Mar 20 13:22:43 crc kubenswrapper[4973]: dp=$(echo $d | sed -r 's/(.*):/\1\.\./') Mar 20 13:22:43 crc kubenswrapper[4973]: reg_conf_path="/tmp/serviceca/${dp}" Mar 20 13:22:43 crc kubenswrapper[4973]: if [ ! -e "${reg_conf_path}" ]; then Mar 20 13:22:43 crc kubenswrapper[4973]: rm -rf /etc/docker/certs.d/$d Mar 20 13:22:43 crc kubenswrapper[4973]: fi Mar 20 13:22:43 crc kubenswrapper[4973]: done Mar 20 13:22:43 crc kubenswrapper[4973]: sleep 60 & wait ${!} Mar 20 13:22:43 crc kubenswrapper[4973]: done Mar 20 13:22:43 crc kubenswrapper[4973]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{10485760 0} {} 10Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:serviceca,ReadOnly:false,MountPath:/tmp/serviceca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host,ReadOnly:false,MountPath:/etc/docker/certs.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gnk94,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-ca-qqncz_openshift-image-registry(6e2d3006-c203-45e9-875b-8b8210a85409): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:22:43 crc kubenswrapper[4973]: > logger="UnhandledError" Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.303788 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"node-ca\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-image-registry/node-ca-qqncz" podUID="6e2d3006-c203-45e9-875b-8b8210a85409" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.310156 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-7qcsb" Mar 20 13:22:43 crc kubenswrapper[4973]: W0320 13:22:43.310894 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-d2504ad06047ea5836edc87390ebc22a2719780a79530b2240bf2b16a59f1882 WatchSource:0}: Error finding container d2504ad06047ea5836edc87390ebc22a2719780a79530b2240bf2b16a59f1882: Status 404 returned error can't find the container with id d2504ad06047ea5836edc87390ebc22a2719780a79530b2240bf2b16a59f1882 Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.314053 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.315669 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.317468 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.317543 
4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.317585 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.317608 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.317623 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:43Z","lastTransitionTime":"2026-03-20T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.319802 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.328108 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 13:22:43 crc kubenswrapper[4973]: W0320 13:22:43.328926 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2747a19a_a33a_458e_bc5d_bda5c13a2bf1.slice/crio-b3b1ebeff176166b224edc9c621865fc27302cc12b8f768b9aa7a2104d53ea81 WatchSource:0}: Error finding container b3b1ebeff176166b224edc9c621865fc27302cc12b8f768b9aa7a2104d53ea81: Status 404 returned error can't find the container with id b3b1ebeff176166b224edc9c621865fc27302cc12b8f768b9aa7a2104d53ea81 Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.335365 4973 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:22:43 crc kubenswrapper[4973]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 20 13:22:43 crc kubenswrapper[4973]: set -uo pipefail Mar 20 13:22:43 crc kubenswrapper[4973]: Mar 20 13:22:43 crc kubenswrapper[4973]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 20 13:22:43 crc kubenswrapper[4973]: Mar 20 13:22:43 crc kubenswrapper[4973]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 20 13:22:43 crc kubenswrapper[4973]: HOSTS_FILE="/etc/hosts" Mar 20 13:22:43 crc kubenswrapper[4973]: TEMP_FILE="/etc/hosts.tmp" Mar 20 13:22:43 crc kubenswrapper[4973]: Mar 20 13:22:43 crc kubenswrapper[4973]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 20 13:22:43 crc kubenswrapper[4973]: Mar 20 13:22:43 crc kubenswrapper[4973]: # Make a temporary file with the old hosts file's attributes. Mar 20 13:22:43 crc kubenswrapper[4973]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 20 13:22:43 crc kubenswrapper[4973]: echo "Failed to preserve hosts file. Exiting." 
Mar 20 13:22:43 crc kubenswrapper[4973]: exit 1 Mar 20 13:22:43 crc kubenswrapper[4973]: fi Mar 20 13:22:43 crc kubenswrapper[4973]: Mar 20 13:22:43 crc kubenswrapper[4973]: while true; do Mar 20 13:22:43 crc kubenswrapper[4973]: declare -A svc_ips Mar 20 13:22:43 crc kubenswrapper[4973]: for svc in "${services[@]}"; do Mar 20 13:22:43 crc kubenswrapper[4973]: # Fetch service IP from cluster dns if present. We make several tries Mar 20 13:22:43 crc kubenswrapper[4973]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 20 13:22:43 crc kubenswrapper[4973]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 20 13:22:43 crc kubenswrapper[4973]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 20 13:22:43 crc kubenswrapper[4973]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 13:22:43 crc kubenswrapper[4973]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 13:22:43 crc kubenswrapper[4973]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 20 13:22:43 crc kubenswrapper[4973]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 20 13:22:43 crc kubenswrapper[4973]: for i in ${!cmds[*]} Mar 20 13:22:43 crc kubenswrapper[4973]: do Mar 20 13:22:43 crc kubenswrapper[4973]: ips=($(eval "${cmds[i]}")) Mar 20 13:22:43 crc kubenswrapper[4973]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 20 13:22:43 crc kubenswrapper[4973]: svc_ips["${svc}"]="${ips[@]}" Mar 20 13:22:43 crc kubenswrapper[4973]: break Mar 20 13:22:43 crc kubenswrapper[4973]: fi Mar 20 13:22:43 crc kubenswrapper[4973]: done Mar 20 13:22:43 crc kubenswrapper[4973]: done Mar 20 13:22:43 crc kubenswrapper[4973]: Mar 20 13:22:43 crc kubenswrapper[4973]: # Update /etc/hosts only if we get valid service IPs Mar 20 13:22:43 crc kubenswrapper[4973]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 20 13:22:43 crc kubenswrapper[4973]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 20 13:22:43 crc kubenswrapper[4973]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 20 13:22:43 crc kubenswrapper[4973]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 20 13:22:43 crc kubenswrapper[4973]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 20 13:22:43 crc kubenswrapper[4973]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 20 13:22:43 crc kubenswrapper[4973]: sleep 60 & wait Mar 20 13:22:43 crc kubenswrapper[4973]: continue Mar 20 13:22:43 crc kubenswrapper[4973]: fi Mar 20 13:22:43 crc kubenswrapper[4973]: Mar 20 13:22:43 crc kubenswrapper[4973]: # Append resolver entries for services Mar 20 13:22:43 crc kubenswrapper[4973]: rc=0 Mar 20 13:22:43 crc kubenswrapper[4973]: for svc in "${!svc_ips[@]}"; do Mar 20 13:22:43 crc kubenswrapper[4973]: for ip in ${svc_ips[${svc}]}; do Mar 20 13:22:43 crc kubenswrapper[4973]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 20 13:22:43 crc kubenswrapper[4973]: done Mar 20 13:22:43 crc kubenswrapper[4973]: done Mar 20 13:22:43 crc kubenswrapper[4973]: if [[ $rc -ne 0 ]]; then Mar 20 13:22:43 crc kubenswrapper[4973]: sleep 60 & wait Mar 20 13:22:43 crc kubenswrapper[4973]: continue Mar 20 13:22:43 crc kubenswrapper[4973]: fi Mar 20 13:22:43 crc kubenswrapper[4973]: Mar 20 13:22:43 crc kubenswrapper[4973]: Mar 20 13:22:43 crc kubenswrapper[4973]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 20 13:22:43 crc kubenswrapper[4973]: # Replace /etc/hosts with our modified version if needed Mar 20 13:22:43 crc kubenswrapper[4973]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 20 13:22:43 crc kubenswrapper[4973]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 20 13:22:43 crc kubenswrapper[4973]: fi Mar 20 13:22:43 crc kubenswrapper[4973]: sleep 60 & wait Mar 20 13:22:43 crc kubenswrapper[4973]: unset svc_ips Mar 20 13:22:43 crc kubenswrapper[4973]: done Mar 20 13:22:43 crc kubenswrapper[4973]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7sgs6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-7qcsb_openshift-dns(2747a19a-a33a-458e-bc5d-bda5c13a2bf1): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 20 13:22:43 crc kubenswrapper[4973]: > logger="UnhandledError" Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.336525 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-7qcsb" podUID="2747a19a-a33a-458e-bc5d-bda5c13a2bf1" Mar 20 13:22:43 crc kubenswrapper[4973]: W0320 13:22:43.337264 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70745a45_4eff_4e56_b9ab_efa4a7c83306.slice/crio-1da2348a30abea03395d3d1abb18d63b3eed67f2f1b496c1ac39a89f96982991 WatchSource:0}: Error finding container 
1da2348a30abea03395d3d1abb18d63b3eed67f2f1b496c1ac39a89f96982991: Status 404 returned error can't find the container with id 1da2348a30abea03395d3d1abb18d63b3eed67f2f1b496c1ac39a89f96982991 Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.339263 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.339800 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9wxk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 
},Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.344864 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt 
--tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9wxk4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 20 13:22:43 crc kubenswrapper[4973]: W0320 13:22:43.345932 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-6fbef1291afc738b3f06b8bebf6e22225ee7616a58907ab48132e10e46b0c9c9 WatchSource:0}: Error finding container 6fbef1291afc738b3f06b8bebf6e22225ee7616a58907ab48132e10e46b0c9c9: Status 404 returned error can't find the container with id 6fbef1291afc738b3f06b8bebf6e22225ee7616a58907ab48132e10e46b0c9c9 
Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.346062 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.347644 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-57hnn"
Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.349532 4973 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 13:22:43 crc kubenswrapper[4973]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 20 13:22:43 crc kubenswrapper[4973]: if [[ -f "/env/_master" ]]; then
Mar 20 13:22:43 crc kubenswrapper[4973]: set -o allexport
Mar 20 13:22:43 crc kubenswrapper[4973]: source "/env/_master"
Mar 20 13:22:43 crc kubenswrapper[4973]: set +o allexport
Mar 20 13:22:43 crc kubenswrapper[4973]: fi
Mar 20 13:22:43 crc kubenswrapper[4973]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled.
Mar 20 13:22:43 crc kubenswrapper[4973]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791
Mar 20 13:22:43 crc kubenswrapper[4973]: ho_enable="--enable-hybrid-overlay"
Mar 20 13:22:43 crc kubenswrapper[4973]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook"
Mar 20 13:22:43 crc kubenswrapper[4973]: # extra-allowed-user: service account `ovn-kubernetes-control-plane`
Mar 20 13:22:43 crc kubenswrapper[4973]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager)
Mar 20 13:22:43 crc kubenswrapper[4973]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \
Mar 20 13:22:43 crc kubenswrapper[4973]: --webhook-cert-dir="/etc/webhook-cert" \
Mar 20 13:22:43 crc kubenswrapper[4973]: --webhook-host=127.0.0.1 \
Mar 20 13:22:43 crc kubenswrapper[4973]: --webhook-port=9743 \
Mar 20 13:22:43 crc kubenswrapper[4973]: ${ho_enable} \
Mar 20 13:22:43 crc kubenswrapper[4973]: --enable-interconnect \
Mar 20 13:22:43 crc kubenswrapper[4973]: --disable-approver \
Mar 20 13:22:43 crc kubenswrapper[4973]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \
Mar 20 13:22:43 crc kubenswrapper[4973]: --wait-for-kubernetes-api=200s \
Mar 20 13:22:43 crc kubenswrapper[4973]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \
Mar 20 13:22:43 crc kubenswrapper[4973]: --loglevel="${LOGLEVEL}"
Mar 20 13:22:43 crc kubenswrapper[4973]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 13:22:43 crc kubenswrapper[4973]: > logger="UnhandledError"
Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.353616 4973 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 13:22:43 crc kubenswrapper[4973]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 20 13:22:43 crc kubenswrapper[4973]: if [[ -f "/env/_master" ]]; then
Mar 20 13:22:43 crc kubenswrapper[4973]: set -o allexport
Mar 20 13:22:43 crc kubenswrapper[4973]: source "/env/_master"
Mar 20 13:22:43 crc kubenswrapper[4973]: set +o allexport
Mar 20 13:22:43 crc kubenswrapper[4973]: fi
Mar 20 13:22:43 crc kubenswrapper[4973]: 
Mar 20 13:22:43 crc kubenswrapper[4973]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver"
Mar 20 13:22:43 crc kubenswrapper[4973]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \
Mar 20 13:22:43 crc kubenswrapper[4973]: --disable-webhook \
Mar 20 13:22:43 crc kubenswrapper[4973]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \
Mar 20 13:22:43 crc kubenswrapper[4973]: --loglevel="${LOGLEVEL}"
Mar 20 13:22:43 crc kubenswrapper[4973]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 13:22:43 crc kubenswrapper[4973]: > logger="UnhandledError"
Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.354803 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d"
Mar 20 13:22:43 crc kubenswrapper[4973]: W0320 13:22:43.359563 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb606d037_c146_4c4b_985a_8cea73f83da5.slice/crio-cdae54dc574d022c6a997749d6d34a506aa6d3097790ca718d20b3bd8fa93b8b WatchSource:0}: Error finding container cdae54dc574d022c6a997749d6d34a506aa6d3097790ca718d20b3bd8fa93b8b: Status 404 returned error can't find the container with id cdae54dc574d022c6a997749d6d34a506aa6d3097790ca718d20b3bd8fa93b8b
Mar 20 13:22:43 crc kubenswrapper[4973]: W0320 13:22:43.361073 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35802646_2926_42b8_913a_986001818f97.slice/crio-15f363cf2d907125f878529f3fb122d68d23e9e4d72980b8ee0fefeeaccce9b1 WatchSource:0}: Error finding container 15f363cf2d907125f878529f3fb122d68d23e9e4d72980b8ee0fefeeaccce9b1: Status 404 returned error can't find the container with id 15f363cf2d907125f878529f3fb122d68d23e9e4d72980b8ee0fefeeaccce9b1
Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.362199 4973 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 13:22:43 crc kubenswrapper[4973]: container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[/bin/bash -c #!/bin/bash
Mar 20 13:22:43 crc kubenswrapper[4973]: set -euo pipefail
Mar 20 13:22:43 crc kubenswrapper[4973]: TLS_PK=/etc/pki/tls/metrics-cert/tls.key
Mar 20 13:22:43 crc kubenswrapper[4973]: TLS_CERT=/etc/pki/tls/metrics-cert/tls.crt
Mar 20 13:22:43 crc kubenswrapper[4973]: # As the secret mount is optional we must wait for the files to be present.
Mar 20 13:22:43 crc kubenswrapper[4973]: # The service is created in monitor.yaml and this is created in sdn.yaml.
Mar 20 13:22:43 crc kubenswrapper[4973]: TS=$(date +%s)
Mar 20 13:22:43 crc kubenswrapper[4973]: WARN_TS=$(( ${TS} + $(( 20 * 60)) ))
Mar 20 13:22:43 crc kubenswrapper[4973]: HAS_LOGGED_INFO=0
Mar 20 13:22:43 crc kubenswrapper[4973]: 
Mar 20 13:22:43 crc kubenswrapper[4973]: log_missing_certs(){
Mar 20 13:22:43 crc kubenswrapper[4973]: CUR_TS=$(date +%s)
Mar 20 13:22:43 crc kubenswrapper[4973]: if [[ "${CUR_TS}" -gt "WARN_TS" ]]; then
Mar 20 13:22:43 crc kubenswrapper[4973]: echo $(date -Iseconds) WARN: ovn-control-plane-metrics-cert not mounted after 20 minutes.
Mar 20 13:22:43 crc kubenswrapper[4973]: elif [[ "${HAS_LOGGED_INFO}" -eq 0 ]] ; then
Mar 20 13:22:43 crc kubenswrapper[4973]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-cert not mounted. Waiting 20 minutes.
Mar 20 13:22:43 crc kubenswrapper[4973]: HAS_LOGGED_INFO=1
Mar 20 13:22:43 crc kubenswrapper[4973]: fi
Mar 20 13:22:43 crc kubenswrapper[4973]: }
Mar 20 13:22:43 crc kubenswrapper[4973]: while [[ ! -f "${TLS_PK}" || ! -f "${TLS_CERT}" ]] ; do
Mar 20 13:22:43 crc kubenswrapper[4973]: log_missing_certs
Mar 20 13:22:43 crc kubenswrapper[4973]: sleep 5
Mar 20 13:22:43 crc kubenswrapper[4973]: done
Mar 20 13:22:43 crc kubenswrapper[4973]: 
Mar 20 13:22:43 crc kubenswrapper[4973]: echo $(date -Iseconds) INFO: ovn-control-plane-metrics-certs mounted, starting kube-rbac-proxy
Mar 20 13:22:43 crc kubenswrapper[4973]: exec /usr/bin/kube-rbac-proxy \
Mar 20 13:22:43 crc kubenswrapper[4973]: --logtostderr \
Mar 20 13:22:43 crc kubenswrapper[4973]: --secure-listen-address=:9108 \
Mar 20 13:22:43 crc kubenswrapper[4973]: --tls-cipher-suites=TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 \
Mar 20 13:22:43 crc kubenswrapper[4973]: --upstream=http://127.0.0.1:29108/ \
Mar 20 13:22:43 crc kubenswrapper[4973]: --tls-private-key-file=${TLS_PK} \
Mar 20 13:22:43 crc kubenswrapper[4973]: --tls-cert-file=${TLS_CERT}
Mar 20 13:22:43 crc kubenswrapper[4973]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:9108,ContainerPort:9108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{20971520 0} {} 20Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovn-control-plane-metrics-cert,ReadOnly:true,MountPath:/etc/pki/tls/metrics-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ws7x5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-sp2rb_openshift-ovn-kubernetes(b606d037-c146-4c4b-985a-8cea73f83da5): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 13:22:43 crc kubenswrapper[4973]: > logger="UnhandledError"
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.364068 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tmj8d"
Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.368147 4973 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 13:22:43 crc kubenswrapper[4973]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT=""
Mar 20 13:22:43 crc kubenswrapper[4973]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT
Mar 20 13:22:43 crc kubenswrapper[4973]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h9hbz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-57hnn_openshift-multus(35802646-2926-42b8-913a-986001818f97): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 13:22:43 crc kubenswrapper[4973]: > logger="UnhandledError"
Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.368960 4973 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 13:22:43 crc kubenswrapper[4973]: container &Container{Name:ovnkube-cluster-manager,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe
Mar 20 13:22:43 crc kubenswrapper[4973]: if [[ -f "/env/_master" ]]; then
Mar 20 13:22:43 crc kubenswrapper[4973]: set -o allexport
Mar 20 13:22:43 crc kubenswrapper[4973]: source "/env/_master"
Mar 20 13:22:43 crc kubenswrapper[4973]: set +o allexport
Mar 20 13:22:43 crc kubenswrapper[4973]: fi
Mar 20 13:22:43 crc kubenswrapper[4973]: 
Mar 20 13:22:43 crc kubenswrapper[4973]: ovn_v4_join_subnet_opt=
Mar 20 13:22:43 crc kubenswrapper[4973]: if [[ "" != "" ]]; then
Mar 20 13:22:43 crc kubenswrapper[4973]: ovn_v4_join_subnet_opt="--gateway-v4-join-subnet "
Mar 20 13:22:43 crc kubenswrapper[4973]: fi
Mar 20 13:22:43 crc kubenswrapper[4973]: ovn_v6_join_subnet_opt=
Mar 20 13:22:43 crc kubenswrapper[4973]: if [[ "" != "" ]]; then
Mar 20 13:22:43 crc kubenswrapper[4973]: ovn_v6_join_subnet_opt="--gateway-v6-join-subnet "
Mar 20 13:22:43 crc kubenswrapper[4973]: fi
Mar 20 13:22:43 crc kubenswrapper[4973]: 
Mar 20 13:22:43 crc kubenswrapper[4973]: ovn_v4_transit_switch_subnet_opt=
Mar 20 13:22:43 crc kubenswrapper[4973]: if [[ "" != "" ]]; then
Mar 20 13:22:43 crc kubenswrapper[4973]: ovn_v4_transit_switch_subnet_opt="--cluster-manager-v4-transit-switch-subnet "
Mar 20 13:22:43 crc kubenswrapper[4973]: fi
Mar 20 13:22:43 crc kubenswrapper[4973]: ovn_v6_transit_switch_subnet_opt=
Mar 20 13:22:43 crc kubenswrapper[4973]: if [[ "" != "" ]]; then
Mar 20 13:22:43 crc kubenswrapper[4973]: ovn_v6_transit_switch_subnet_opt="--cluster-manager-v6-transit-switch-subnet "
Mar 20 13:22:43 crc kubenswrapper[4973]: fi
Mar 20 13:22:43 crc kubenswrapper[4973]: 
Mar 20 13:22:43 crc kubenswrapper[4973]: dns_name_resolver_enabled_flag=
Mar 20 13:22:43 crc kubenswrapper[4973]: if [[ "false" == "true" ]]; then
Mar 20 13:22:43 crc kubenswrapper[4973]: dns_name_resolver_enabled_flag="--enable-dns-name-resolver"
Mar 20 13:22:43 crc kubenswrapper[4973]: fi
Mar 20 13:22:43 crc kubenswrapper[4973]: 
Mar 20 13:22:43 crc kubenswrapper[4973]: persistent_ips_enabled_flag=
Mar 20 13:22:43 crc kubenswrapper[4973]: if [[ "true" == "true" ]]; then
Mar 20 13:22:43 crc kubenswrapper[4973]: persistent_ips_enabled_flag="--enable-persistent-ips"
Mar 20 13:22:43 crc kubenswrapper[4973]: fi
Mar 20 13:22:43 crc kubenswrapper[4973]: 
Mar 20 13:22:43 crc kubenswrapper[4973]: # This is needed so that converting clusters from GA to TP
Mar 20 13:22:43 crc kubenswrapper[4973]: # will rollout control plane pods as well
Mar 20 13:22:43 crc kubenswrapper[4973]: network_segmentation_enabled_flag=
Mar 20 13:22:43 crc kubenswrapper[4973]: multi_network_enabled_flag=
Mar 20 13:22:43 crc kubenswrapper[4973]: if [[ "true" == "true" ]]; then
Mar 20 13:22:43 crc kubenswrapper[4973]: multi_network_enabled_flag="--enable-multi-network"
Mar 20 13:22:43 crc kubenswrapper[4973]: network_segmentation_enabled_flag="--enable-network-segmentation"
Mar 20 13:22:43 crc kubenswrapper[4973]: fi
Mar 20 13:22:43 crc kubenswrapper[4973]: 
Mar 20 13:22:43 crc kubenswrapper[4973]: echo "I$(date "+%m%d %H:%M:%S.%N") - ovnkube-control-plane - start ovnkube --init-cluster-manager ${K8S_NODE}"
Mar 20 13:22:43 crc kubenswrapper[4973]: exec /usr/bin/ovnkube \
Mar 20 13:22:43 crc kubenswrapper[4973]: --enable-interconnect \
Mar 20 13:22:43 crc kubenswrapper[4973]: --init-cluster-manager "${K8S_NODE}" \
Mar 20 13:22:43 crc kubenswrapper[4973]: --config-file=/run/ovnkube-config/ovnkube.conf \
Mar 20 13:22:43 crc kubenswrapper[4973]: --loglevel "${OVN_KUBE_LOG_LEVEL}" \
Mar 20 13:22:43 crc kubenswrapper[4973]: --metrics-bind-address "127.0.0.1:29108" \
Mar 20 13:22:43 crc kubenswrapper[4973]: --metrics-enable-pprof \
Mar 20 13:22:43 crc kubenswrapper[4973]: --metrics-enable-config-duration \
Mar 20 13:22:43 crc kubenswrapper[4973]: ${ovn_v4_join_subnet_opt} \
Mar 20 13:22:43 crc kubenswrapper[4973]: ${ovn_v6_join_subnet_opt} \
Mar 20 13:22:43 crc kubenswrapper[4973]: ${ovn_v4_transit_switch_subnet_opt} \
Mar 20 13:22:43 crc kubenswrapper[4973]: ${ovn_v6_transit_switch_subnet_opt} \
Mar 20 13:22:43 crc kubenswrapper[4973]: ${dns_name_resolver_enabled_flag} \
Mar 20 13:22:43 crc kubenswrapper[4973]: ${persistent_ips_enabled_flag} \
Mar 20 13:22:43 crc kubenswrapper[4973]: ${multi_network_enabled_flag} \
Mar 20 13:22:43 crc kubenswrapper[4973]: ${network_segmentation_enabled_flag}
Mar 20 13:22:43 crc kubenswrapper[4973]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics-port,HostPort:29108,ContainerPort:29108,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OVN_KUBE_LOG_LEVEL,Value:4,ValueFrom:nil,},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{314572800 0} {} 300Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:ovnkube-config,ReadOnly:false,MountPath:/run/ovnkube-config/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ws7x5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-control-plane-749d76644c-sp2rb_openshift-ovn-kubernetes(b606d037-c146-4c4b-985a-8cea73f83da5): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 13:22:43 crc kubenswrapper[4973]: > logger="UnhandledError"
Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.369763 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-57hnn" podUID="35802646-2926-42b8-913a-986001818f97"
Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.370489 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"ovnkube-cluster-manager\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" podUID="b606d037-c146-4c4b-985a-8cea73f83da5"
Mar 20 13:22:43 crc kubenswrapper[4973]: W0320 13:22:43.375885 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb85f66bb_77a7_4c4c_8d36_6a94a52c90dd.slice/crio-c48bd1e6dc6c70952c55abbcc3243ed8fbe15264d2ba280e50c5d3ea898c0ea8 WatchSource:0}: Error finding container c48bd1e6dc6c70952c55abbcc3243ed8fbe15264d2ba280e50c5d3ea898c0ea8: Status 404 returned error can't find the container with id c48bd1e6dc6c70952c55abbcc3243ed8fbe15264d2ba280e50c5d3ea898c0ea8
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.377689 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx"
Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.378398 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-48dqg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-tmj8d_openshift-multus(b85f66bb-77a7-4c4c-8d36-6a94a52c90dd): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError"
Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.379549 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" podUID="b85f66bb-77a7-4c4c-8d36-6a94a52c90dd"
Mar 20 13:22:43 crc kubenswrapper[4973]: W0320 13:22:43.396197 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod774edfed_7d45_4b69_b9d7_a3a914cbca04.slice/crio-0aa02c1453484ce87bcebc6e8ee798cfdbd3c90b867f6022264249094dc0a316 WatchSource:0}: Error finding container 0aa02c1453484ce87bcebc6e8ee798cfdbd3c90b867f6022264249094dc0a316: Status 404 returned error can't find the container with id 0aa02c1453484ce87bcebc6e8ee798cfdbd3c90b867f6022264249094dc0a316
Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.399281 4973 kuberuntime_manager.go:1274] "Unhandled Error" err=<
Mar 20 13:22:43 crc kubenswrapper[4973]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig
Mar 20 13:22:43 crc kubenswrapper[4973]: apiVersion: v1
Mar 20 13:22:43 crc kubenswrapper[4973]: clusters:
Mar 20 13:22:43 crc kubenswrapper[4973]: - cluster:
Mar 20 13:22:43 crc kubenswrapper[4973]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt
Mar 20 13:22:43 crc kubenswrapper[4973]: server: https://api-int.crc.testing:6443
Mar 20 13:22:43 crc kubenswrapper[4973]: name: default-cluster
Mar 20 13:22:43 crc kubenswrapper[4973]: contexts:
Mar 20 13:22:43 crc kubenswrapper[4973]: - context:
Mar 20 13:22:43 crc kubenswrapper[4973]: cluster: default-cluster
Mar 20 13:22:43 crc kubenswrapper[4973]: namespace: default
Mar 20 13:22:43 crc kubenswrapper[4973]: user: default-auth
Mar 20 13:22:43 crc kubenswrapper[4973]: name: default-context
Mar 20 13:22:43 crc kubenswrapper[4973]: current-context: default-context
Mar 20 13:22:43 crc kubenswrapper[4973]: kind: Config
Mar 20 13:22:43 crc kubenswrapper[4973]: preferences: {}
Mar 20 13:22:43 crc kubenswrapper[4973]: users:
Mar 20 13:22:43 crc kubenswrapper[4973]: - name: default-auth
Mar 20 13:22:43 crc kubenswrapper[4973]: user:
Mar 20 13:22:43 crc kubenswrapper[4973]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem
Mar 20 13:22:43 crc kubenswrapper[4973]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem
Mar 20 13:22:43 crc kubenswrapper[4973]: EOF
Mar 20 13:22:43 crc kubenswrapper[4973]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lx76c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars
Mar 20 13:22:43 crc kubenswrapper[4973]: > logger="UnhandledError"
Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.400392 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04"
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.420440 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.420470 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.420484 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.420498 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.420508 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:43Z","lastTransitionTime":"2026-03-20T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.524023 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.524061 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.524070 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.524086 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.524096 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:43Z","lastTransitionTime":"2026-03-20T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.604679 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.604847 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.604929 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:22:44.604886747 +0000 UTC m=+85.348556531 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.605009 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.605070 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.605156 4973 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.605165 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.605218 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:44.605203305 +0000 UTC m=+85.348873089 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.605075 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.605233 4973 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.605327 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:44.605297488 +0000 UTC m=+85.348967272 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.605763 4973 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.605889 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:44.605863154 +0000 UTC m=+85.349532928 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.629690 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.629762 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.629782 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.629809 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.629828 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:43Z","lastTransitionTime":"2026-03-20T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.706962 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs\") pod \"network-metrics-daemon-7kszd\" (UID: \"93c5ad90-87bf-4668-9d87-34e676b15783\") " pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.707028 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.707168 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.707194 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.707209 4973 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.707271 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" 
failed. No retries permitted until 2026-03-20 13:22:44.707255012 +0000 UTC m=+85.450924756 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.707256 4973 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:22:43 crc kubenswrapper[4973]: E0320 13:22:43.707681 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs podName:93c5ad90-87bf-4668-9d87-34e676b15783 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:44.707539609 +0000 UTC m=+85.451209393 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs") pod "network-metrics-daemon-7kszd" (UID: "93c5ad90-87bf-4668-9d87-34e676b15783") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.734328 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.734513 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.734545 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.734741 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.734832 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:43Z","lastTransitionTime":"2026-03-20T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.838813 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.838893 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.838918 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.838949 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.838974 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:43Z","lastTransitionTime":"2026-03-20T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.941821 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.941875 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.941886 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.941903 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.941926 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:43Z","lastTransitionTime":"2026-03-20T13:22:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.954109 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.955276 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.958060 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.959643 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.961708 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.962690 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.964069 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.965390 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.966098 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.967249 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.967893 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.969449 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.970286 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.970893 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.971981 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.972631 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.973767 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.974240 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.974987 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.976191 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.976769 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.977905 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.978691 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.980134 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.980907 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.981882 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.983417 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.984082 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.985119 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.985708 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.986651 4973 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.986842 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.988524 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.989460 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.989934 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.991867 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.992750 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.993823 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.994659 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.995823 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.996408 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.997445 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.998207 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.999332 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 13:22:43 crc kubenswrapper[4973]: I0320 13:22:43.999913 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.000987 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.001721 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.004169 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.005571 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.007448 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.008690 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.010732 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.011590 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.012211 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.044351 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.044382 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:44 crc 
kubenswrapper[4973]: I0320 13:22:44.044391 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.044406 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.044418 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:44Z","lastTransitionTime":"2026-03-20T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.147918 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.148190 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.148199 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.148213 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.148224 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:44Z","lastTransitionTime":"2026-03-20T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.179440 4973 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.251393 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.251480 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.251491 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.251506 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.251517 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:44Z","lastTransitionTime":"2026-03-20T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.292942 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" event={"ID":"b606d037-c146-4c4b-985a-8cea73f83da5","Type":"ContainerStarted","Data":"cdae54dc574d022c6a997749d6d34a506aa6d3097790ca718d20b3bd8fa93b8b"} Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.293918 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-57hnn" event={"ID":"35802646-2926-42b8-913a-986001818f97","Type":"ContainerStarted","Data":"15f363cf2d907125f878529f3fb122d68d23e9e4d72980b8ee0fefeeaccce9b1"} Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.294996 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qqncz" event={"ID":"6e2d3006-c203-45e9-875b-8b8210a85409","Type":"ContainerStarted","Data":"65fa6859e2548fa0ac7c5fbce2d5d2db7a972a8b265306e2ed0bc76d75abc035"} Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.298702 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"1da2348a30abea03395d3d1abb18d63b3eed67f2f1b496c1ac39a89f96982991"} Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.300198 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7qcsb" event={"ID":"2747a19a-a33a-458e-bc5d-bda5c13a2bf1","Type":"ContainerStarted","Data":"b3b1ebeff176166b224edc9c621865fc27302cc12b8f768b9aa7a2104d53ea81"} Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.301317 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" event={"ID":"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd","Type":"ContainerStarted","Data":"c48bd1e6dc6c70952c55abbcc3243ed8fbe15264d2ba280e50c5d3ea898c0ea8"} Mar 20 13:22:44 crc 
kubenswrapper[4973]: I0320 13:22:44.302498 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d2504ad06047ea5836edc87390ebc22a2719780a79530b2240bf2b16a59f1882"} Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.303969 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerStarted","Data":"0aa02c1453484ce87bcebc6e8ee798cfdbd3c90b867f6022264249094dc0a316"} Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.305315 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"6fbef1291afc738b3f06b8bebf6e22225ee7616a58907ab48132e10e46b0c9c9"} Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.307255 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.327811 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.348890 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.355252 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.355393 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.355418 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.355477 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.355500 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:44Z","lastTransitionTime":"2026-03-20T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.371301 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.391987 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.408480 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.425106 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.448078 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.460590 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.460639 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.460653 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.460670 4973 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.460682 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:44Z","lastTransitionTime":"2026-03-20T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.461225 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.472173 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.484697 4973 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.502704 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.510623 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.519241 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.531886 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.541418 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.551406 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.560363 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.563989 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.564028 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.564037 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.564052 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.564061 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:44Z","lastTransitionTime":"2026-03-20T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.570540 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.578541 4973 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.595775 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.607362 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.617271 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.617787 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.617888 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:22:44 crc kubenswrapper[4973]: 
I0320 13:22:44.617935 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.617962 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:22:44 crc kubenswrapper[4973]: E0320 13:22:44.618061 4973 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:22:44 crc kubenswrapper[4973]: E0320 13:22:44.618163 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:22:44 crc kubenswrapper[4973]: E0320 13:22:44.618177 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:22:44 crc kubenswrapper[4973]: E0320 13:22:44.618188 4973 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:22:44 crc kubenswrapper[4973]: E0320 13:22:44.618275 
4973 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:22:44 crc kubenswrapper[4973]: E0320 13:22:44.618483 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:46.618094013 +0000 UTC m=+87.361763767 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:22:44 crc kubenswrapper[4973]: E0320 13:22:44.618506 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:46.618497403 +0000 UTC m=+87.362167147 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:22:44 crc kubenswrapper[4973]: E0320 13:22:44.618520 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:22:46.618513984 +0000 UTC m=+87.362183728 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:22:44 crc kubenswrapper[4973]: E0320 13:22:44.618533 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:46.618527314 +0000 UTC m=+87.362197058 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.626406 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.635901 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.645112 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.657972 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.666716 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.666750 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.666761 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.666778 4973 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.666790 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:44Z","lastTransitionTime":"2026-03-20T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.667810 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the 
pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.719175 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs\") pod \"network-metrics-daemon-7kszd\" (UID: \"93c5ad90-87bf-4668-9d87-34e676b15783\") " pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.719721 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:22:44 crc kubenswrapper[4973]: E0320 13:22:44.719350 4973 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:22:44 crc kubenswrapper[4973]: E0320 13:22:44.719850 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:22:44 crc kubenswrapper[4973]: E0320 13:22:44.719867 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:22:44 crc kubenswrapper[4973]: E0320 13:22:44.719898 4973 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:22:44 crc kubenswrapper[4973]: E0320 13:22:44.719944 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs podName:93c5ad90-87bf-4668-9d87-34e676b15783 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:46.719914442 +0000 UTC m=+87.463584186 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs") pod "network-metrics-daemon-7kszd" (UID: "93c5ad90-87bf-4668-9d87-34e676b15783") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:22:44 crc kubenswrapper[4973]: E0320 13:22:44.719987 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:46.719977844 +0000 UTC m=+87.463647588 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.769078 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.769141 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.769154 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.769175 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.769191 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:44Z","lastTransitionTime":"2026-03-20T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.871896 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.871944 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.871955 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.871970 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.871983 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:44Z","lastTransitionTime":"2026-03-20T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.950169 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.950232 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.950238 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:22:44 crc kubenswrapper[4973]: E0320 13:22:44.950297 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:22:44 crc kubenswrapper[4973]: E0320 13:22:44.950624 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.950689 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:22:44 crc kubenswrapper[4973]: E0320 13:22:44.950751 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:22:44 crc kubenswrapper[4973]: E0320 13:22:44.950826 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.974037 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.974100 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.974113 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.974131 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:44 crc kubenswrapper[4973]: I0320 13:22:44.974162 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:44Z","lastTransitionTime":"2026-03-20T13:22:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.076871 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.076905 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.076912 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.076926 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.076936 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:45Z","lastTransitionTime":"2026-03-20T13:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.179012 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.179062 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.179076 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.179095 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.179109 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:45Z","lastTransitionTime":"2026-03-20T13:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.281524 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.281563 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.281573 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.281591 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.281603 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:45Z","lastTransitionTime":"2026-03-20T13:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.311484 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.311546 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.312966 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-7qcsb" event={"ID":"2747a19a-a33a-458e-bc5d-bda5c13a2bf1","Type":"ContainerStarted","Data":"f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.314876 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.317409 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" event={"ID":"b606d037-c146-4c4b-985a-8cea73f83da5","Type":"ContainerStarted","Data":"aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.317450 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" 
event={"ID":"b606d037-c146-4c4b-985a-8cea73f83da5","Type":"ContainerStarted","Data":"60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.320797 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-57hnn" event={"ID":"35802646-2926-42b8-913a-986001818f97","Type":"ContainerStarted","Data":"410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.322823 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.322874 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"abcd771297c72ebf201123700b92083f01a58f420bf04337fc21af73f83a8fbc"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.328984 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-qqncz" event={"ID":"6e2d3006-c203-45e9-875b-8b8210a85409","Type":"ContainerStarted","Data":"0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.329664 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.333147 4973 generic.go:334] "Generic (PLEG): container finished" podID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerID="e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13" exitCode=0 Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.333208 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerDied","Data":"e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.337296 4973 generic.go:334] "Generic (PLEG): container finished" podID="b85f66bb-77a7-4c4c-8d36-6a94a52c90dd" containerID="fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455" exitCode=0 Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.337386 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-tmj8d" event={"ID":"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd","Type":"ContainerDied","Data":"fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.343779 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.366800 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.384296 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.384377 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.384390 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.384411 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.384423 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:45Z","lastTransitionTime":"2026-03-20T13:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.384980 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.403762 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.420865 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.436321 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc 
kubenswrapper[4973]: I0320 13:22:45.453567 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.475000 4973 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.497621 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.498856 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.498881 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:45 crc 
kubenswrapper[4973]: I0320 13:22:45.498890 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.498903 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.498911 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:45Z","lastTransitionTime":"2026-03-20T13:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.511246 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.529064 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.539699 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.551101 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.560232 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.570516 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.583978 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.599775 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.601449 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.601475 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.601485 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.601499 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.601508 4973 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:45Z","lastTransitionTime":"2026-03-20T13:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.612862 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.626274 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.642167 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.656235 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.667947 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.680676 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc 
kubenswrapper[4973]: I0320 13:22:45.692599 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.703554 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.703586 4973 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.703598 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.703614 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.703627 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:45Z","lastTransitionTime":"2026-03-20T13:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.710402 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.719428 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.729365 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:45Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.806112 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.806153 4973 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.806162 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.806176 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.806187 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:45Z","lastTransitionTime":"2026-03-20T13:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.908937 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.909367 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.909378 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.909396 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:45 crc kubenswrapper[4973]: I0320 13:22:45.909406 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:45Z","lastTransitionTime":"2026-03-20T13:22:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.014014 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.023096 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.023145 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.023155 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.023180 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.023203 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:46Z","lastTransitionTime":"2026-03-20T13:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.125823 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.125865 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.125898 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.125917 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.125928 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:46Z","lastTransitionTime":"2026-03-20T13:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.231665 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.232010 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.232019 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.232034 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.232044 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:46Z","lastTransitionTime":"2026-03-20T13:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.336211 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.336257 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.336269 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.336288 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.336301 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:46Z","lastTransitionTime":"2026-03-20T13:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.351121 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" event={"ID":"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd","Type":"ContainerStarted","Data":"2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4"} Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.357171 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerStarted","Data":"871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812"} Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.357458 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerStarted","Data":"07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf"} Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.357540 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerStarted","Data":"769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71"} Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.357609 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerStarted","Data":"84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031"} Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.364298 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.393877 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.412185 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.427308 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.440088 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.440114 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.440124 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.440138 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 
13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.440150 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:46Z","lastTransitionTime":"2026-03-20T13:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.446408 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.462577 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.481580 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:46 crc 
kubenswrapper[4973]: I0320 13:22:46.494315 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.510705 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.523441 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.535995 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.547177 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.547714 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.547728 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:46 crc 
kubenswrapper[4973]: I0320 13:22:46.547749 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.547762 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:46Z","lastTransitionTime":"2026-03-20T13:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.554142 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919
d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCou
nt\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-con
troller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/va
r/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.566828 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.578769 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.592355 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:46Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.640284 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.640418 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.640483 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:22:50.640457234 +0000 UTC m=+91.384126978 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.640518 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.640535 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.640545 4973 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.640585 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.640593 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:50.640579217 +0000 UTC m=+91.384248951 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.640625 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.640699 4973 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.640742 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:50.640735801 +0000 UTC m=+91.384405545 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.640808 4973 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.640900 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:50.640877925 +0000 UTC m=+91.384547689 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.650225 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.650288 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.650304 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.650356 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 
20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.650381 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:46Z","lastTransitionTime":"2026-03-20T13:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.741499 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs\") pod \"network-metrics-daemon-7kszd\" (UID: \"93c5ad90-87bf-4668-9d87-34e676b15783\") " pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.741750 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.741664 4973 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.741966 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs podName:93c5ad90-87bf-4668-9d87-34e676b15783 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:50.741951504 +0000 UTC m=+91.485621248 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs") pod "network-metrics-daemon-7kszd" (UID: "93c5ad90-87bf-4668-9d87-34e676b15783") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.741835 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.742109 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.742171 4973 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.742241 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:50.742233663 +0000 UTC m=+91.485903407 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.752411 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.752666 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.752729 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.752819 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.752896 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:46Z","lastTransitionTime":"2026-03-20T13:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.856099 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.856177 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.856202 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.856233 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.856259 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:46Z","lastTransitionTime":"2026-03-20T13:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.949835 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.949845 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.949901 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.949901 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.950688 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.950773 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.950881 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.950979 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.958985 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.959046 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.959068 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.959102 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.959126 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:46Z","lastTransitionTime":"2026-03-20T13:22:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.963750 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:22:46 crc kubenswrapper[4973]: I0320 13:22:46.963895 4973 scope.go:117] "RemoveContainer" containerID="923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365" Mar 20 13:22:46 crc kubenswrapper[4973]: E0320 13:22:46.964189 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.061680 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.061723 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.061734 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.061750 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.061764 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:47Z","lastTransitionTime":"2026-03-20T13:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.164079 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.164117 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.164126 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.164142 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.164152 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:47Z","lastTransitionTime":"2026-03-20T13:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.265993 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.266032 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.266043 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.266058 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.266069 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:47Z","lastTransitionTime":"2026-03-20T13:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.362786 4973 generic.go:334] "Generic (PLEG): container finished" podID="b85f66bb-77a7-4c4c-8d36-6a94a52c90dd" containerID="2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4" exitCode=0 Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.362869 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" event={"ID":"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd","Type":"ContainerDied","Data":"2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4"} Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.366239 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7"} Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.367606 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.367652 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.367661 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.367673 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.367682 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:47Z","lastTransitionTime":"2026-03-20T13:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.376020 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerStarted","Data":"5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1"} Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.376102 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerStarted","Data":"b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79"} Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.376660 4973 scope.go:117] "RemoveContainer" containerID="923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365" Mar 20 13:22:47 crc kubenswrapper[4973]: E0320 13:22:47.376843 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.384788 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.411756 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.430162 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.445995 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.457814 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.469421 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc 
kubenswrapper[4973]: I0320 13:22:47.471457 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.471494 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.471503 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.471518 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.471529 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:47Z","lastTransitionTime":"2026-03-20T13:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.487982 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.509785 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.520739 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.535290 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.546810 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.560527 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.574883 4973 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"m
ountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.576193 4973 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.576215 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.576228 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.576244 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.576254 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:47Z","lastTransitionTime":"2026-03-20T13:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.597052 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.615475 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.629599 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.642537 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc 
kubenswrapper[4973]: I0320 13:22:47.655852 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.669775 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.682279 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.682356 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.682371 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.682396 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.682410 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:47Z","lastTransitionTime":"2026-03-20T13:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.684503 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.697661 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.708522 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.712738 4973 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.724560 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.733989 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bde
af3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.743540 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\
\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.757437 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.771375 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe3
03d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.783361 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.785220 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.785264 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.785299 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.785320 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.785333 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:47Z","lastTransitionTime":"2026-03-20T13:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.798076 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.811940 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"
}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.825493 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.837633 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:47Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.887799 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.887840 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.887854 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.887872 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.887883 4973 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:47Z","lastTransitionTime":"2026-03-20T13:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.989512 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.989569 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.989585 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.989611 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:47 crc kubenswrapper[4973]: I0320 13:22:47.989626 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:47Z","lastTransitionTime":"2026-03-20T13:22:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.092419 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.092458 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.092470 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.092488 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.092500 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:48Z","lastTransitionTime":"2026-03-20T13:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.194510 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.194560 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.194571 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.194589 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.194601 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:48Z","lastTransitionTime":"2026-03-20T13:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.296285 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.296324 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.296356 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.296378 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.296387 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:48Z","lastTransitionTime":"2026-03-20T13:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.380382 4973 generic.go:334] "Generic (PLEG): container finished" podID="b85f66bb-77a7-4c4c-8d36-6a94a52c90dd" containerID="0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6" exitCode=0 Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.380423 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" event={"ID":"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd","Type":"ContainerDied","Data":"0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6"} Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.398236 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.398601 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.398625 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.398633 4973 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.398646 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.398655 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:48Z","lastTransitionTime":"2026-03-20T13:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.413382 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.430273 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.450893 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.468314 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:48 crc 
kubenswrapper[4973]: I0320 13:22:48.480020 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.496788 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.500870 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.500904 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.500912 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.500928 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.500937 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:48Z","lastTransitionTime":"2026-03-20T13:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.515443 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.530211 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 20 
13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.553581 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"c
ontainerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.565228 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.580565 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.592813 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 
13:22:48.603892 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"reso
urce-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e2775
3fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\
"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.607969 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.607995 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.608003 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:48 
crc kubenswrapper[4973]: I0320 13:22:48.608016 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.608025 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:48Z","lastTransitionTime":"2026-03-20T13:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.611731 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85
a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.623575 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:22:48Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.710818 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.711124 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.711132 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.711146 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.711155 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:48Z","lastTransitionTime":"2026-03-20T13:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.813567 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.813604 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.813612 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.813629 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.813640 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:48Z","lastTransitionTime":"2026-03-20T13:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.915813 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.915857 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.915866 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.915882 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.915892 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:48Z","lastTransitionTime":"2026-03-20T13:22:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.950255 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.950326 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.950389 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:22:48 crc kubenswrapper[4973]: I0320 13:22:48.950259 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:22:48 crc kubenswrapper[4973]: E0320 13:22:48.950429 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:22:48 crc kubenswrapper[4973]: E0320 13:22:48.950469 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:22:48 crc kubenswrapper[4973]: E0320 13:22:48.950541 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:22:48 crc kubenswrapper[4973]: E0320 13:22:48.950614 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.018674 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.018715 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.018732 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.018751 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.018760 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:49Z","lastTransitionTime":"2026-03-20T13:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.120834 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.120881 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.120893 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.120909 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.120922 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:49Z","lastTransitionTime":"2026-03-20T13:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.223386 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.223422 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.223430 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.223447 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.223456 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:49Z","lastTransitionTime":"2026-03-20T13:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.326433 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.326482 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.326494 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.326513 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.326526 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:49Z","lastTransitionTime":"2026-03-20T13:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.388220 4973 generic.go:334] "Generic (PLEG): container finished" podID="b85f66bb-77a7-4c4c-8d36-6a94a52c90dd" containerID="5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5" exitCode=0 Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.388307 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" event={"ID":"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd","Type":"ContainerDied","Data":"5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5"} Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.394475 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerStarted","Data":"049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062"} Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.406961 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.422486 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.429867 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.429921 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.429939 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.429963 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.429981 4973 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:49Z","lastTransitionTime":"2026-03-20T13:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.436668 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-
node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.454223 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.492615 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.508556 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.517992 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.532804 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.532900 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.532918 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.532940 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 
13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.532954 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:49Z","lastTransitionTime":"2026-03-20T13:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.533407 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.548571 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.565933 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.586489 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.603949 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.615398 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:49 crc 
kubenswrapper[4973]: I0320 13:22:49.626609 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.635435 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.635509 4973 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.635518 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.635535 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.635547 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:49Z","lastTransitionTime":"2026-03-20T13:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.644027 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.667199 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.738023 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.738088 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.738100 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.738119 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.738132 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:49Z","lastTransitionTime":"2026-03-20T13:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.841260 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.841313 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.841323 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.841357 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.841372 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:49Z","lastTransitionTime":"2026-03-20T13:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.945140 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.945200 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.945213 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.945234 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.945246 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:49Z","lastTransitionTime":"2026-03-20T13:22:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.965650 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.979682 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:49 crc kubenswrapper[4973]: I0320 13:22:49.995639 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:49Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.015946 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.030546 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.042588 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
3:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.046911 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.046941 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.046951 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.046965 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.046974 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:50Z","lastTransitionTime":"2026-03-20T13:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.058687 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z 
is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.077640 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.089975 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.102397 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc 
kubenswrapper[4973]: I0320 13:22:50.115715 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.127742 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.139321 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.150096 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.150460 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.150483 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.150492 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:50 crc 
kubenswrapper[4973]: I0320 13:22:50.150506 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.150516 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:50Z","lastTransitionTime":"2026-03-20T13:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.160287 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f420bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.176253 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.252829 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.252875 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.252886 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.252902 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.252914 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:50Z","lastTransitionTime":"2026-03-20T13:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.305034 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.305075 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.305085 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.305101 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.305112 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:50Z","lastTransitionTime":"2026-03-20T13:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.318924 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.322709 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.322753 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.322767 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.322783 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.322794 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:50Z","lastTransitionTime":"2026-03-20T13:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.334581 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.338076 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.338116 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.338126 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.338144 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.338155 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:50Z","lastTransitionTime":"2026-03-20T13:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.349382 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.352436 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.352462 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.352473 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.352490 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.352501 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:50Z","lastTransitionTime":"2026-03-20T13:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.363589 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.366233 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.366272 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.366280 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.366293 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.366304 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:50Z","lastTransitionTime":"2026-03-20T13:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.376160 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.376261 4973 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.377278 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.377303 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.377311 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.377324 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.377332 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:50Z","lastTransitionTime":"2026-03-20T13:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.401263 4973 generic.go:334] "Generic (PLEG): container finished" podID="b85f66bb-77a7-4c4c-8d36-6a94a52c90dd" containerID="4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17" exitCode=0 Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.401312 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" event={"ID":"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd","Type":"ContainerDied","Data":"4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17"} Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.414889 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.427924 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.438639 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc 
kubenswrapper[4973]: I0320 13:22:50.449292 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.461633 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.472987 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.479059 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.479087 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.479097 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.479112 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.479120 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:50Z","lastTransitionTime":"2026-03-20T13:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.484828 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.500624 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node 
kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\
\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\
"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"starte
d\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.509717 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.518365 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.529538 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.544554 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.554110 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.564746 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
3:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.578280 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.580830 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:50 crc 
kubenswrapper[4973]: I0320 13:22:50.580871 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.580886 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.580904 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.580915 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:50Z","lastTransitionTime":"2026-03-20T13:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.590375 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:50Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.627263 4973 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.683512 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.683547 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.683558 4973 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.683574 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.683584 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:50Z","lastTransitionTime":"2026-03-20T13:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.685881 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.686020 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.686054 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:22:58.686036209 +0000 UTC m=+99.429705953 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.686084 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.686119 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.686151 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.686170 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.686184 4973 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.686210 4973 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.686227 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:58.686211145 +0000 UTC m=+99.429880889 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.686246 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:58.686238176 +0000 UTC m=+99.429908030 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.686280 4973 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.686301 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:58.686294707 +0000 UTC m=+99.429964451 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.786287 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.786528 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.786541 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.786558 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.786570 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:50Z","lastTransitionTime":"2026-03-20T13:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.786581 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs\") pod \"network-metrics-daemon-7kszd\" (UID: \"93c5ad90-87bf-4668-9d87-34e676b15783\") " pod="openshift-multus/network-metrics-daemon-7kszd"
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.786619 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.786736 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.786755 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.786760 4973 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.786769 4973 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.786831 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs podName:93c5ad90-87bf-4668-9d87-34e676b15783 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:58.786811161 +0000 UTC m=+99.530480975 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs") pod "network-metrics-daemon-7kszd" (UID: "93c5ad90-87bf-4668-9d87-34e676b15783") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.786857 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:22:58.786846832 +0000 UTC m=+99.530516696 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.888366 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.888403 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.888414 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.888432 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.888442 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:50Z","lastTransitionTime":"2026-03-20T13:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.949579 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.949670 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd"
Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.949739 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.949768 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.949761 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.949862 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783"
Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.950103 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 13:22:50 crc kubenswrapper[4973]: E0320 13:22:50.950183 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.991640 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.991690 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.991702 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.991721 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:50 crc kubenswrapper[4973]: I0320 13:22:50.991731 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:50Z","lastTransitionTime":"2026-03-20T13:22:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.093861 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.093900 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.093910 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.093926 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.093935 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:51Z","lastTransitionTime":"2026-03-20T13:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.196710 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.196741 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.196750 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.196767 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.196775 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:51Z","lastTransitionTime":"2026-03-20T13:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.299207 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.299247 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.299256 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.299272 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.299284 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:51Z","lastTransitionTime":"2026-03-20T13:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.402170 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.402294 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.402325 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.402408 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.402439 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:51Z","lastTransitionTime":"2026-03-20T13:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.406965 4973 generic.go:334] "Generic (PLEG): container finished" podID="b85f66bb-77a7-4c4c-8d36-6a94a52c90dd" containerID="945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242" exitCode=0
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.407027 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" event={"ID":"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd","Type":"ContainerDied","Data":"945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242"}
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.412063 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerStarted","Data":"7b2bf1073dbe1bcafe13e2888ed61c9031e1e22e86fa74d690411041b2918b0f"}
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.412758 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.412824 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.412850 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.421644 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.435020 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.445159 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.451641 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.452430 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f420bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.462586 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.473559 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.485678 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z"
Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.502639 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.512697 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.512727 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.512735 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.512749 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.512759 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:51Z","lastTransitionTime":"2026-03-20T13:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.515518 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.526319 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.539125 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.549893 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.562890 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.577416 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe3
03d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.586995 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.599164 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
3:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.610792 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.614725 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.614786 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.614799 4973 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.614819 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.614833 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:51Z","lastTransitionTime":"2026-03-20T13:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.621486 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.633158 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.643171 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.652046 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.661079 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.670806 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc 
kubenswrapper[4973]: I0320 13:22:51.679362 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.696070 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b2bf1073dbe1bcafe13e2888ed61c9031e1e22e86fa74d690411041b2918b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.704810 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.714155 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.716752 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.716793 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.716806 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.716825 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.716837 4973 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:51Z","lastTransitionTime":"2026-03-20T13:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.722258 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.731360 4973 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.741185 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"na
me\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.752193 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.762841 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe3
03d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.771872 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:51Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.818912 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.818956 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.818967 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.818985 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.818996 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:51Z","lastTransitionTime":"2026-03-20T13:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.921582 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.921615 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.921624 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.921638 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:51 crc kubenswrapper[4973]: I0320 13:22:51.921647 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:51Z","lastTransitionTime":"2026-03-20T13:22:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.023645 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.023675 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.023683 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.023695 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.023703 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:52Z","lastTransitionTime":"2026-03-20T13:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.125736 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.125779 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.125792 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.125809 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.125820 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:52Z","lastTransitionTime":"2026-03-20T13:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.228741 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.228794 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.228809 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.228829 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.228839 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:52Z","lastTransitionTime":"2026-03-20T13:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.331030 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.331063 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.331071 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.331085 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.331094 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:52Z","lastTransitionTime":"2026-03-20T13:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.419886 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" event={"ID":"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd","Type":"ContainerStarted","Data":"a72006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c"} Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.431547 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.432846 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.432874 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.432885 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.432903 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.432915 4973 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:52Z","lastTransitionTime":"2026-03-20T13:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.445966 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.457564 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.473848 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.485506 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.500048 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:52 crc 
kubenswrapper[4973]: I0320 13:22:52.511585 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.532701 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b2bf1073dbe1bcafe13e2888ed61c9031e1e22e86fa74d690411041b2918b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.537723 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.537778 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.537789 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.537807 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.537819 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:52Z","lastTransitionTime":"2026-03-20T13:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.545303 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.556098 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.579565 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.596056 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.604804 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.620039 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
3:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.631968 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.641987 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:52 crc 
kubenswrapper[4973]: I0320 13:22:52.642020 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.642028 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.642042 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.642051 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:52Z","lastTransitionTime":"2026-03-20T13:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.645641 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b
93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:52Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.744570 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.744605 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.744615 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.744631 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.744643 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:52Z","lastTransitionTime":"2026-03-20T13:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.846726 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.846757 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.846767 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.846782 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.846792 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:52Z","lastTransitionTime":"2026-03-20T13:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.949128 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.949162 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.949171 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.949187 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.949199 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:52Z","lastTransitionTime":"2026-03-20T13:22:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.949801 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.949810 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.949867 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:22:52 crc kubenswrapper[4973]: I0320 13:22:52.949887 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:22:52 crc kubenswrapper[4973]: E0320 13:22:52.949990 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:22:52 crc kubenswrapper[4973]: E0320 13:22:52.950040 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:22:52 crc kubenswrapper[4973]: E0320 13:22:52.950142 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:22:52 crc kubenswrapper[4973]: E0320 13:22:52.950177 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.050830 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.050882 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.050891 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.050913 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.050926 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:53Z","lastTransitionTime":"2026-03-20T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.153976 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.154020 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.154029 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.154046 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.154085 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:53Z","lastTransitionTime":"2026-03-20T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.256782 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.256830 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.256842 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.256863 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.256879 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:53Z","lastTransitionTime":"2026-03-20T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.358573 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.358653 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.358669 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.358687 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.358699 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:53Z","lastTransitionTime":"2026-03-20T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.460426 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.460724 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.460734 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.460748 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.460757 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:53Z","lastTransitionTime":"2026-03-20T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.562269 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.562319 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.562372 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.562389 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.562401 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:53Z","lastTransitionTime":"2026-03-20T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.664283 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.664317 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.664326 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.664359 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.664368 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:53Z","lastTransitionTime":"2026-03-20T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.767067 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.767099 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.767108 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.767122 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.767131 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:53Z","lastTransitionTime":"2026-03-20T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.869825 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.869865 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.869876 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.869893 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.869905 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:53Z","lastTransitionTime":"2026-03-20T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.972322 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.972367 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.972378 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.972391 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:53 crc kubenswrapper[4973]: I0320 13:22:53.972400 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:53Z","lastTransitionTime":"2026-03-20T13:22:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.074543 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.074581 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.074592 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.074608 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.074620 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:54Z","lastTransitionTime":"2026-03-20T13:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.176848 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.177092 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.177200 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.177277 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.177361 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:54Z","lastTransitionTime":"2026-03-20T13:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.280375 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.280437 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.280460 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.280489 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.280516 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:54Z","lastTransitionTime":"2026-03-20T13:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.382289 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.382326 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.382359 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.382383 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.382397 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:54Z","lastTransitionTime":"2026-03-20T13:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.427946 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovnkube-controller/0.log" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.430864 4973 generic.go:334] "Generic (PLEG): container finished" podID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerID="7b2bf1073dbe1bcafe13e2888ed61c9031e1e22e86fa74d690411041b2918b0f" exitCode=1 Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.430895 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerDied","Data":"7b2bf1073dbe1bcafe13e2888ed61c9031e1e22e86fa74d690411041b2918b0f"} Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.432011 4973 scope.go:117] "RemoveContainer" containerID="7b2bf1073dbe1bcafe13e2888ed61c9031e1e22e86fa74d690411041b2918b0f" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.447290 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.464769 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.479539 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:54 crc 
kubenswrapper[4973]: I0320 13:22:54.484323 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.484371 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.484383 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.484400 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.484411 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:54Z","lastTransitionTime":"2026-03-20T13:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.493636 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.506382 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed 
to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.520242 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.534970 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.552364 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7b2bf1073dbe1bcafe13e2888ed61c9031e1e22e86fa74d690411041b2918b0f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b2bf1073dbe1bcafe13e2888ed61c9031e1e22e86fa74d690411041b2918b0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:22:53Z\\\",\\\"message\\\":\\\"flector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:22:53.574330 6804 reflector.go:311] Stopping reflector 
*v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:22:53.574472 6804 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:22:53.574582 6804 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:22:53.574814 6804 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:22:53.574916 6804 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:22:53.574999 6804 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:22:53.575104 6804 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:22:53.575592 6804 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.562433 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.573189 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.587111 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.587213 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.587245 4973 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.587260 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.587278 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.587289 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:54Z","lastTransitionTime":"2026-03-20T13:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.599739 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.615141 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.629179 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.641119 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.655733 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:54Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.689704 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.689743 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.689751 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.689766 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.689776 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:54Z","lastTransitionTime":"2026-03-20T13:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.792459 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.792523 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.792537 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.792560 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.792577 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:54Z","lastTransitionTime":"2026-03-20T13:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.895189 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.895443 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.895462 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.895532 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.895551 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:54Z","lastTransitionTime":"2026-03-20T13:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.949935 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.950017 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:22:54 crc kubenswrapper[4973]: E0320 13:22:54.950055 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:22:54 crc kubenswrapper[4973]: E0320 13:22:54.950158 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.950230 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:22:54 crc kubenswrapper[4973]: E0320 13:22:54.950295 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.950362 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:22:54 crc kubenswrapper[4973]: E0320 13:22:54.950407 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.998105 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.998145 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.998155 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.998171 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:54 crc kubenswrapper[4973]: I0320 13:22:54.998183 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:54Z","lastTransitionTime":"2026-03-20T13:22:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.099879 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.099916 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.099925 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.099939 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.099947 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:55Z","lastTransitionTime":"2026-03-20T13:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.202184 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.202222 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.202234 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.202247 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.202255 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:55Z","lastTransitionTime":"2026-03-20T13:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.303960 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.304001 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.304009 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.304024 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.304033 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:55Z","lastTransitionTime":"2026-03-20T13:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.405696 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.405736 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.405744 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.405759 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.405768 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:55Z","lastTransitionTime":"2026-03-20T13:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.435378 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovnkube-controller/0.log" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.437923 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerStarted","Data":"2f7151ffba559fac0693eaecf8a2fee71159b68227809d4af62b8486ee7af3a6"} Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.438352 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.452076 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.463648 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.475054 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.486048 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.496877 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.506205 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.507616 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.507643 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.507651 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.507664 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.507673 4973 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:55Z","lastTransitionTime":"2026-03-20T13:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.514092 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:55 crc 
kubenswrapper[4973]: I0320 13:22:55.522723 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.533278 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.545134 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.556684 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.567349 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.583447 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f7151ffba559fac0693eaecf8a2fee71159b68227809d4af62b8486ee7af3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b2bf1073dbe1bcafe13e2888ed61c9031e1e22e86fa74d690411041b2918b0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:22:53Z\\\",\\\"message\\\":\\\"flector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:22:53.574330 6804 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:22:53.574472 6804 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:22:53.574582 6804 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:22:53.574814 6804 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:22:53.574916 6804 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:22:53.574999 6804 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:22:53.575104 6804 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:22:53.575592 6804 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.595974 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.605736 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.610736 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.610779 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.610789 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.610805 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.610816 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:55Z","lastTransitionTime":"2026-03-20T13:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.616674 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:55Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.713136 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.713174 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.713184 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.713199 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.713208 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:55Z","lastTransitionTime":"2026-03-20T13:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.815433 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.815483 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.815502 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.815525 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.815542 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:55Z","lastTransitionTime":"2026-03-20T13:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.917457 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.917523 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.917534 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.917551 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:55 crc kubenswrapper[4973]: I0320 13:22:55.917560 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:55Z","lastTransitionTime":"2026-03-20T13:22:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.019775 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.019809 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.019835 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.019850 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.019862 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:56Z","lastTransitionTime":"2026-03-20T13:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.122854 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.122896 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.122906 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.122924 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.122945 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:56Z","lastTransitionTime":"2026-03-20T13:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.226295 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.226348 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.226398 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.226417 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.226429 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:56Z","lastTransitionTime":"2026-03-20T13:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.328450 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.328517 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.328528 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.328544 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.328553 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:56Z","lastTransitionTime":"2026-03-20T13:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.430885 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.430946 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.430963 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.430987 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.431004 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:56Z","lastTransitionTime":"2026-03-20T13:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.441449 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovnkube-controller/1.log" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.442445 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovnkube-controller/0.log" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.444794 4973 generic.go:334] "Generic (PLEG): container finished" podID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerID="2f7151ffba559fac0693eaecf8a2fee71159b68227809d4af62b8486ee7af3a6" exitCode=1 Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.444878 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerDied","Data":"2f7151ffba559fac0693eaecf8a2fee71159b68227809d4af62b8486ee7af3a6"} Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.444952 4973 scope.go:117] "RemoveContainer" containerID="7b2bf1073dbe1bcafe13e2888ed61c9031e1e22e86fa74d690411041b2918b0f" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.445721 4973 scope.go:117] "RemoveContainer" containerID="2f7151ffba559fac0693eaecf8a2fee71159b68227809d4af62b8486ee7af3a6" Mar 20 13:22:56 crc kubenswrapper[4973]: E0320 13:22:56.446009 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.466561 4973 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\
\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03
-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253e
f1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for 
pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.488937 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.500803 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.512736 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.531240 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.533111 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.533206 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.533225 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.533247 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.533264 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:56Z","lastTransitionTime":"2026-03-20T13:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.545152 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.559644 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.570316 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.581933 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:56 crc 
kubenswrapper[4973]: I0320 13:22:56.596035 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.612981 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.630018 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.635733 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.635772 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.635781 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.635797 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.635807 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:56Z","lastTransitionTime":"2026-03-20T13:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.654111 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f7151ffba559fac0693eaecf8a2fee71159b68227809d4af62b8486ee7af3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7b2bf1073dbe1bcafe13e2888ed61c9031e1e22e86fa74d690411041b2918b0f\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:22:53Z\\\",\\\"message\\\":\\\"flector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:22:53.574330 6804 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:22:53.574472 6804 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:22:53.574582 6804 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:22:53.574814 6804 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:22:53.574916 6804 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 13:22:53.574999 6804 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 13:22:53.575104 6804 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 13:22:53.575592 6804 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f7151ffba559fac0693eaecf8a2fee71159b68227809d4af62b8486ee7af3a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:22:55Z\\\",\\\"message\\\":\\\"ntroller.go:444] Built service openshift-console-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0320 13:22:55.343731 6944 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0320 13:22:55.343735 6944 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0320 13:22:55.343739 6944 
default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0320 13:22:55.343731 6944 services_controller.go:445] Built service openshift-console-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 13:22:55.343754 6944 services_controller.go:451] Built service openshift-console-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.88\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:54Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\
\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf
880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.670946 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.686851 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.716026 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:56Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.738043 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.738093 4973 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.738102 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.738117 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.738126 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:56Z","lastTransitionTime":"2026-03-20T13:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.840247 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.840281 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.840289 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.840303 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.840311 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:56Z","lastTransitionTime":"2026-03-20T13:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.942411 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.942442 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.942449 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.942462 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.942470 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:56Z","lastTransitionTime":"2026-03-20T13:22:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.949887 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.949923 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:22:56 crc kubenswrapper[4973]: E0320 13:22:56.949992 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.950020 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:22:56 crc kubenswrapper[4973]: E0320 13:22:56.950117 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:22:56 crc kubenswrapper[4973]: I0320 13:22:56.950132 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:22:56 crc kubenswrapper[4973]: E0320 13:22:56.950406 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:22:56 crc kubenswrapper[4973]: E0320 13:22:56.950496 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.046540 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.046605 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.046616 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.046635 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.046652 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:57Z","lastTransitionTime":"2026-03-20T13:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.149077 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.149110 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.149121 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.149134 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.149143 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:57Z","lastTransitionTime":"2026-03-20T13:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.251401 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.251467 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.251477 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.251491 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.251501 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:57Z","lastTransitionTime":"2026-03-20T13:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.353961 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.354021 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.354033 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.354057 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.354069 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:57Z","lastTransitionTime":"2026-03-20T13:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.450117 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovnkube-controller/1.log" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.455250 4973 scope.go:117] "RemoveContainer" containerID="2f7151ffba559fac0693eaecf8a2fee71159b68227809d4af62b8486ee7af3a6" Mar 20 13:22:57 crc kubenswrapper[4973]: E0320 13:22:57.455554 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.455888 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.455936 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.455958 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.455981 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.455998 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:57Z","lastTransitionTime":"2026-03-20T13:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.469975 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:57Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.485150 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:57Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.498069 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:57Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.511386 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:57Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.523161 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:57Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.533331 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:57Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:57 crc 
kubenswrapper[4973]: I0320 13:22:57.546875 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:57Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.557865 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.557905 4973 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.557917 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.557933 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.557944 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:57Z","lastTransitionTime":"2026-03-20T13:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.567701 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f7151ffba559fac0693eaecf8a2fee71159b68227809d4af62b8486ee7af3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f7151ffba559fac0693eaecf8a2fee71159b68227809d4af62b8486ee7af3a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:22:55Z\\\",\\\"message\\\":\\\"ntroller.go:444] Built service openshift-console-operator/metrics LB per-node configs for 
network=default: []services.lbConfig(nil)\\\\nI0320 13:22:55.343731 6944 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0320 13:22:55.343735 6944 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0320 13:22:55.343739 6944 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0320 13:22:55.343731 6944 services_controller.go:445] Built service openshift-console-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 13:22:55.343754 6944 services_controller.go:451] Built service openshift-console-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.88\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5
fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:57Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.581527 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:57Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.591856 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:57Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.604051 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:57Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.615454 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:57Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.627953 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
3:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:57Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.639616 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:57Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.657619 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d
1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:
22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:57Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.659829 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.659866 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.659877 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.659898 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.659909 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:57Z","lastTransitionTime":"2026-03-20T13:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.672714 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:57Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.767185 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.767487 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.767550 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.767628 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.767686 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:57Z","lastTransitionTime":"2026-03-20T13:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.869520 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.869737 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.869801 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.869888 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.869994 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:57Z","lastTransitionTime":"2026-03-20T13:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.972567 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.972607 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.972621 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.972636 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:57 crc kubenswrapper[4973]: I0320 13:22:57.972649 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:57Z","lastTransitionTime":"2026-03-20T13:22:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.075124 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.075439 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.075510 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.075580 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.075637 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:58Z","lastTransitionTime":"2026-03-20T13:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.178041 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.178114 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.178137 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.178167 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.178188 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:58Z","lastTransitionTime":"2026-03-20T13:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.280776 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.280809 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.280818 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.280831 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.280840 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:58Z","lastTransitionTime":"2026-03-20T13:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.383380 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.383563 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.383573 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.383586 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.383596 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:58Z","lastTransitionTime":"2026-03-20T13:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.486047 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.486310 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.486418 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.486506 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.486660 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:58Z","lastTransitionTime":"2026-03-20T13:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.593796 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.593837 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.593847 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.593862 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.593873 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:58Z","lastTransitionTime":"2026-03-20T13:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.697277 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.697320 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.697330 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.697368 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.697381 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:58Z","lastTransitionTime":"2026-03-20T13:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.767376 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:22:58 crc kubenswrapper[4973]: E0320 13:22:58.767461 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:14.767443449 +0000 UTC m=+115.511113193 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.767503 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.767539 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.767558 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:22:58 crc kubenswrapper[4973]: E0320 13:22:58.767631 4973 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 13:22:58 crc kubenswrapper[4973]: E0320 13:22:58.767671 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:14.767661014 +0000 UTC m=+115.511330758 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 13:22:58 crc kubenswrapper[4973]: E0320 13:22:58.767696 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 13:22:58 crc kubenswrapper[4973]: E0320 13:22:58.767715 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 13:22:58 crc kubenswrapper[4973]: E0320 13:22:58.767725 4973 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 13:22:58 crc kubenswrapper[4973]: E0320 13:22:58.767759 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:14.767750937 +0000 UTC m=+115.511420671 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 13:22:58 crc kubenswrapper[4973]: E0320 13:22:58.767777 4973 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 13:22:58 crc kubenswrapper[4973]: E0320 13:22:58.767840 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:14.767828 +0000 UTC m=+115.511497744 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.798927 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.798999 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.799016 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.799038 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.799055 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:58Z","lastTransitionTime":"2026-03-20T13:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.868834 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs\") pod \"network-metrics-daemon-7kszd\" (UID: \"93c5ad90-87bf-4668-9d87-34e676b15783\") " pod="openshift-multus/network-metrics-daemon-7kszd"
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.868883 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:22:58 crc kubenswrapper[4973]: E0320 13:22:58.868997 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 13:22:58 crc kubenswrapper[4973]: E0320 13:22:58.869013 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 13:22:58 crc kubenswrapper[4973]: E0320 13:22:58.869023 4973 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 13:22:58 crc kubenswrapper[4973]: E0320 13:22:58.869067 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:14.869055484 +0000 UTC m=+115.612725218 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 13:22:58 crc kubenswrapper[4973]: E0320 13:22:58.870439 4973 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 13:22:58 crc kubenswrapper[4973]: E0320 13:22:58.870489 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs podName:93c5ad90-87bf-4668-9d87-34e676b15783 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:14.870477452 +0000 UTC m=+115.614147196 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs") pod "network-metrics-daemon-7kszd" (UID: "93c5ad90-87bf-4668-9d87-34e676b15783") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.900522 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.900560 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.900570 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.900584 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.900592 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:58Z","lastTransitionTime":"2026-03-20T13:22:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.949779 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.949779 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:22:58 crc kubenswrapper[4973]: E0320 13:22:58.949894 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.949797 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd"
Mar 20 13:22:58 crc kubenswrapper[4973]: I0320 13:22:58.949787 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:22:58 crc kubenswrapper[4973]: E0320 13:22:58.949962 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 13:22:58 crc kubenswrapper[4973]: E0320 13:22:58.950010 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783"
Mar 20 13:22:58 crc kubenswrapper[4973]: E0320 13:22:58.950078 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.002259 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.002294 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.002302 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.002314 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.002322 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:59Z","lastTransitionTime":"2026-03-20T13:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.104375 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.104414 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.104423 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.104439 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.104457 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:59Z","lastTransitionTime":"2026-03-20T13:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.206115 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.206155 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.206169 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.206184 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.206193 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:59Z","lastTransitionTime":"2026-03-20T13:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.308592 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.308664 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.308686 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.308715 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.308737 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:59Z","lastTransitionTime":"2026-03-20T13:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.412459 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.412537 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.412562 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.412593 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.412619 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:59Z","lastTransitionTime":"2026-03-20T13:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.516666 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.517261 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.517299 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.517323 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.517337 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:59Z","lastTransitionTime":"2026-03-20T13:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.619569 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.619613 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.619623 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.619638 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.619648 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:59Z","lastTransitionTime":"2026-03-20T13:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.721845 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.721896 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.721919 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.721941 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.721957 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:59Z","lastTransitionTime":"2026-03-20T13:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.823776 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.823823 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.823839 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.823868 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.823882 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:59Z","lastTransitionTime":"2026-03-20T13:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.925919 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.925972 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.925985 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.926004 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.926018 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:22:59Z","lastTransitionTime":"2026-03-20T13:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.964948 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:59Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.977801 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{}
,\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:59Z is after 2025-08-24T17:21:41Z" Mar 20 
13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.987494 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"c
ontainerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:22:59Z is after 2025-08-24T17:21:41Z" Mar 20 13:22:59 crc kubenswrapper[4973]: I0320 13:22:59.997934 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:22:59Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.012134 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:23:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.027982 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.028028 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.028042 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.028062 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.028082 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:00Z","lastTransitionTime":"2026-03-20T13:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.029545 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.049743 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.059259 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.069725 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.081157 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.095156 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.108377 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.119712 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.128539 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:00 crc 
kubenswrapper[4973]: I0320 13:23:00.130054 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.130163 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.130243 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.130400 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.130503 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:00Z","lastTransitionTime":"2026-03-20T13:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.140222 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.154961 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f7151ffba559fac0693eaecf8a2fee71159b68227809d4af62b8486ee7af3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f7151ffba559fac0693eaecf8a2fee71159b68227809d4af62b8486ee7af3a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:22:55Z\\\",\\\"message\\\":\\\"ntroller.go:444] Built service openshift-console-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0320 13:22:55.343731 6944 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0320 13:22:55.343735 6944 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0320 13:22:55.343739 6944 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0320 13:22:55.343731 6944 services_controller.go:445] Built service openshift-console-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 13:22:55.343754 6944 services_controller.go:451] Built service openshift-console-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.88\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5
fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.232881 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.232929 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.232940 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.232957 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.232971 4973 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:00Z","lastTransitionTime":"2026-03-20T13:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.335592 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.335836 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.335909 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.335988 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.336051 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:00Z","lastTransitionTime":"2026-03-20T13:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.438910 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.438942 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.438950 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.438963 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.438972 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:00Z","lastTransitionTime":"2026-03-20T13:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.496908 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.496978 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.497000 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.497022 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.497041 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:00Z","lastTransitionTime":"2026-03-20T13:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:00 crc kubenswrapper[4973]: E0320 13:23:00.511576 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.515324 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.515384 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.515402 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.515422 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.515436 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:00Z","lastTransitionTime":"2026-03-20T13:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:00 crc kubenswrapper[4973]: E0320 13:23:00.529163 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.533818 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.533871 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.533887 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.533910 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.533927 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:00Z","lastTransitionTime":"2026-03-20T13:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:00 crc kubenswrapper[4973]: E0320 13:23:00.550517 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.554633 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.554687 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.554700 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.554719 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.554731 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:00Z","lastTransitionTime":"2026-03-20T13:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:00 crc kubenswrapper[4973]: E0320 13:23:00.573628 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.577928 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.577978 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.577990 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.578010 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.578024 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:00Z","lastTransitionTime":"2026-03-20T13:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:00 crc kubenswrapper[4973]: E0320 13:23:00.591145 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:00Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:00 crc kubenswrapper[4973]: E0320 13:23:00.591271 4973 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.592923 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.592980 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.592996 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.593018 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.593039 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:00Z","lastTransitionTime":"2026-03-20T13:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.695613 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.695671 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.695689 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.695715 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.695732 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:00Z","lastTransitionTime":"2026-03-20T13:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.797932 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.798176 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.798185 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.798199 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.798208 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:00Z","lastTransitionTime":"2026-03-20T13:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.900298 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.900331 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.900355 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.900368 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.900379 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:00Z","lastTransitionTime":"2026-03-20T13:23:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.949665 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.949716 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.949753 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:00 crc kubenswrapper[4973]: I0320 13:23:00.949779 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:00 crc kubenswrapper[4973]: E0320 13:23:00.949789 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:00 crc kubenswrapper[4973]: E0320 13:23:00.949873 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:00 crc kubenswrapper[4973]: E0320 13:23:00.949938 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:00 crc kubenswrapper[4973]: E0320 13:23:00.950022 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.002576 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.002612 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.002621 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.002633 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.002641 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:01Z","lastTransitionTime":"2026-03-20T13:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.105275 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.105318 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.105329 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.105365 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.105376 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:01Z","lastTransitionTime":"2026-03-20T13:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.207377 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.207452 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.207476 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.207579 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.207610 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:01Z","lastTransitionTime":"2026-03-20T13:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.310379 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.310444 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.310469 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.310498 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.310521 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:01Z","lastTransitionTime":"2026-03-20T13:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.412898 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.412959 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.412968 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.412984 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.412994 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:01Z","lastTransitionTime":"2026-03-20T13:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.515723 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.515761 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.515771 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.515788 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.515799 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:01Z","lastTransitionTime":"2026-03-20T13:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.618283 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.618327 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.618353 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.618371 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.618382 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:01Z","lastTransitionTime":"2026-03-20T13:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.720395 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.720435 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.720445 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.720460 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.720472 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:01Z","lastTransitionTime":"2026-03-20T13:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.823230 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.823539 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.823628 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.823713 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.823804 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:01Z","lastTransitionTime":"2026-03-20T13:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.925935 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.925982 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.925993 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.926010 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.926022 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:01Z","lastTransitionTime":"2026-03-20T13:23:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:01 crc kubenswrapper[4973]: I0320 13:23:01.951296 4973 scope.go:117] "RemoveContainer" containerID="923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365" Mar 20 13:23:01 crc kubenswrapper[4973]: E0320 13:23:01.951515 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.027834 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.027883 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.027895 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.027914 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.027926 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:02Z","lastTransitionTime":"2026-03-20T13:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.129698 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.129731 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.129740 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.129755 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.129763 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:02Z","lastTransitionTime":"2026-03-20T13:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.231974 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.232013 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.232026 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.232043 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.232055 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:02Z","lastTransitionTime":"2026-03-20T13:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.334384 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.334611 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.334749 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.334849 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.334944 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:02Z","lastTransitionTime":"2026-03-20T13:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.437489 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.437819 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.437892 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.438003 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.438105 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:02Z","lastTransitionTime":"2026-03-20T13:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.540109 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.540412 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.540565 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.540679 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.540755 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:02Z","lastTransitionTime":"2026-03-20T13:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.643267 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.643315 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.643331 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.643375 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.643389 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:02Z","lastTransitionTime":"2026-03-20T13:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.745223 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.745832 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.745906 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.745967 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.746028 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:02Z","lastTransitionTime":"2026-03-20T13:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.848468 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.848509 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.848518 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.848531 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.848540 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:02Z","lastTransitionTime":"2026-03-20T13:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.949594 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.949695 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.950101 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:02 crc kubenswrapper[4973]: E0320 13:23:02.950170 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:02 crc kubenswrapper[4973]: E0320 13:23:02.950222 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.950367 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:02 crc kubenswrapper[4973]: E0320 13:23:02.950442 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:02 crc kubenswrapper[4973]: E0320 13:23:02.950616 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.952151 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.952206 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.952236 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.952261 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:02 crc kubenswrapper[4973]: I0320 13:23:02.952278 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:02Z","lastTransitionTime":"2026-03-20T13:23:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.054536 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.054575 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.054588 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.054604 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.054616 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:03Z","lastTransitionTime":"2026-03-20T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.156695 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.156734 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.156742 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.156758 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.156768 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:03Z","lastTransitionTime":"2026-03-20T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.260890 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.261394 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.261608 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.261811 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.261993 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:03Z","lastTransitionTime":"2026-03-20T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.364502 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.364805 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.364925 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.365054 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.365172 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:03Z","lastTransitionTime":"2026-03-20T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.467540 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.467838 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.468030 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.468122 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.468203 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:03Z","lastTransitionTime":"2026-03-20T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.570734 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.570770 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.570780 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.570798 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.570810 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:03Z","lastTransitionTime":"2026-03-20T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.673384 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.673667 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.673765 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.673868 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.673958 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:03Z","lastTransitionTime":"2026-03-20T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.775869 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.775897 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.775905 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.775920 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.775929 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:03Z","lastTransitionTime":"2026-03-20T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.877891 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.877926 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.877938 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.877953 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.877966 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:03Z","lastTransitionTime":"2026-03-20T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.964640 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.979770 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.979855 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.979880 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.979910 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:03 crc kubenswrapper[4973]: I0320 13:23:03.979933 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:03Z","lastTransitionTime":"2026-03-20T13:23:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.082957 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.083004 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.083018 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.083038 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.083056 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:04Z","lastTransitionTime":"2026-03-20T13:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.185713 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.185756 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.185768 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.185783 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.185793 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:04Z","lastTransitionTime":"2026-03-20T13:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.288592 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.288621 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.288630 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.288643 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.288653 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:04Z","lastTransitionTime":"2026-03-20T13:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.391293 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.391326 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.391347 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.391361 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.391373 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:04Z","lastTransitionTime":"2026-03-20T13:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.494054 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.494090 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.494098 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.494115 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.494125 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:04Z","lastTransitionTime":"2026-03-20T13:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.596153 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.596234 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.596248 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.596266 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.596277 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:04Z","lastTransitionTime":"2026-03-20T13:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.699048 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.699100 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.699116 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.699137 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.699151 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:04Z","lastTransitionTime":"2026-03-20T13:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.800665 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.800711 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.800730 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.800752 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.800769 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:04Z","lastTransitionTime":"2026-03-20T13:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.902774 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.902818 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.902830 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.902850 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.902863 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:04Z","lastTransitionTime":"2026-03-20T13:23:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.950390 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.950500 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.950418 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:04 crc kubenswrapper[4973]: E0320 13:23:04.950616 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:04 crc kubenswrapper[4973]: E0320 13:23:04.950773 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:04 crc kubenswrapper[4973]: E0320 13:23:04.950880 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:04 crc kubenswrapper[4973]: I0320 13:23:04.951131 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:04 crc kubenswrapper[4973]: E0320 13:23:04.951424 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.005381 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.005449 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.005471 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.005501 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.005522 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:05Z","lastTransitionTime":"2026-03-20T13:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.107632 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.107670 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.107678 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.107693 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.107703 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:05Z","lastTransitionTime":"2026-03-20T13:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.211820 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.211870 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.211884 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.211904 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.211922 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:05Z","lastTransitionTime":"2026-03-20T13:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.314497 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.314750 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.314818 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.314902 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.314991 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:05Z","lastTransitionTime":"2026-03-20T13:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.417007 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.417236 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.417384 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.417454 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.417520 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:05Z","lastTransitionTime":"2026-03-20T13:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.519774 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.519833 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.519850 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.519891 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.519905 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:05Z","lastTransitionTime":"2026-03-20T13:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.622196 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.622233 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.622243 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.622257 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.622265 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:05Z","lastTransitionTime":"2026-03-20T13:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.724568 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.724588 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.724596 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.724606 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.724614 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:05Z","lastTransitionTime":"2026-03-20T13:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.826994 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.827038 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.827049 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.827063 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.827073 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:05Z","lastTransitionTime":"2026-03-20T13:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.928999 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.929237 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.929357 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.929442 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:05 crc kubenswrapper[4973]: I0320 13:23:05.929538 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:05Z","lastTransitionTime":"2026-03-20T13:23:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.031664 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.031694 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.031705 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.031720 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.031732 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:06Z","lastTransitionTime":"2026-03-20T13:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.133637 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.133679 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.133691 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.133706 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.133716 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:06Z","lastTransitionTime":"2026-03-20T13:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.235979 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.236016 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.236025 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.236040 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.236050 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:06Z","lastTransitionTime":"2026-03-20T13:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.338266 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.338308 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.338319 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.338351 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.338361 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:06Z","lastTransitionTime":"2026-03-20T13:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.441295 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.441357 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.441366 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.441381 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.441390 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:06Z","lastTransitionTime":"2026-03-20T13:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.543580 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.543647 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.543665 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.543689 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.543706 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:06Z","lastTransitionTime":"2026-03-20T13:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.646388 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.646432 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.646443 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.646458 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.646467 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:06Z","lastTransitionTime":"2026-03-20T13:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.749129 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.749164 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.749172 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.749185 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.749196 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:06Z","lastTransitionTime":"2026-03-20T13:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.851678 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.851775 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.851791 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.851816 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.852224 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:06Z","lastTransitionTime":"2026-03-20T13:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.949837 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.949913 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.949914 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.949851 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:06 crc kubenswrapper[4973]: E0320 13:23:06.950042 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:06 crc kubenswrapper[4973]: E0320 13:23:06.950107 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:06 crc kubenswrapper[4973]: E0320 13:23:06.950268 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:06 crc kubenswrapper[4973]: E0320 13:23:06.950415 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.955053 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.955096 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.955110 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.955130 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:06 crc kubenswrapper[4973]: I0320 13:23:06.955143 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:06Z","lastTransitionTime":"2026-03-20T13:23:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.058010 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.058052 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.058064 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.058080 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.058094 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:07Z","lastTransitionTime":"2026-03-20T13:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.160756 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.160799 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.160811 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.160830 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.160845 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:07Z","lastTransitionTime":"2026-03-20T13:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.264160 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.264217 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.264229 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.264248 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.264260 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:07Z","lastTransitionTime":"2026-03-20T13:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.366856 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.366919 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.366931 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.366953 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.366966 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:07Z","lastTransitionTime":"2026-03-20T13:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.469531 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.469596 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.469612 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.469630 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.469642 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:07Z","lastTransitionTime":"2026-03-20T13:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.572467 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.572538 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.572552 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.572585 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.572600 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:07Z","lastTransitionTime":"2026-03-20T13:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.675024 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.675077 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.675101 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.675129 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.675184 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:07Z","lastTransitionTime":"2026-03-20T13:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.778197 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.778246 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.778258 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.778275 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.778287 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:07Z","lastTransitionTime":"2026-03-20T13:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.881659 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.881714 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.881723 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.881739 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.881749 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:07Z","lastTransitionTime":"2026-03-20T13:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.984623 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.984693 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.984717 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.984762 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:07 crc kubenswrapper[4973]: I0320 13:23:07.984791 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:07Z","lastTransitionTime":"2026-03-20T13:23:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.087685 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.087721 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.087731 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.087744 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.087751 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:08Z","lastTransitionTime":"2026-03-20T13:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.190169 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.190219 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.190230 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.190245 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.190254 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:08Z","lastTransitionTime":"2026-03-20T13:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.292625 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.292700 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.292724 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.292743 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.292755 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:08Z","lastTransitionTime":"2026-03-20T13:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.396015 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.396062 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.396073 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.396094 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.396107 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:08Z","lastTransitionTime":"2026-03-20T13:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.498403 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.498447 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.498457 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.498470 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.498480 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:08Z","lastTransitionTime":"2026-03-20T13:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.599866 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.599899 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.599907 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.599919 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.599929 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:08Z","lastTransitionTime":"2026-03-20T13:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.703116 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.703173 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.703189 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.703213 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.703229 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:08Z","lastTransitionTime":"2026-03-20T13:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.805887 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.805923 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.805931 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.805969 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.805980 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:08Z","lastTransitionTime":"2026-03-20T13:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.909008 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.909077 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.909090 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.909110 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.909122 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:08Z","lastTransitionTime":"2026-03-20T13:23:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.950531 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.950582 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.950559 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:23:08 crc kubenswrapper[4973]: I0320 13:23:08.950534 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:23:08 crc kubenswrapper[4973]: E0320 13:23:08.950696 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783"
Mar 20 13:23:08 crc kubenswrapper[4973]: E0320 13:23:08.950773 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 13:23:08 crc kubenswrapper[4973]: E0320 13:23:08.950847 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 13:23:08 crc kubenswrapper[4973]: E0320 13:23:08.950932 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.012465 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.012647 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.012661 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.012679 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.012690 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.115766 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.115822 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.115839 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.115862 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.115879 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.217840 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.217883 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.217897 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.217911 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.217923 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.319825 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.320106 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.320229 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.320580 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.320881 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.423516 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.423546 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.423555 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.423568 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.423576 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.525817 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.526187 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.526324 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.526669 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.526890 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.630470 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.630527 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.630544 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.630568 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.630586 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.733548 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.733617 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.733642 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.733675 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.733696 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.837254 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.837307 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.837323 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.837380 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.837395 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.939732 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.939793 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.939810 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.939835 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.939852 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:09Z","lastTransitionTime":"2026-03-20T13:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.963033 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:09Z is after 2025-08-24T17:21:41Z"
Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.975642 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:09 crc kubenswrapper[4973]: I0320 13:23:09.989266 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:09Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.006270 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.017611 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.037742 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
3:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.049211 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.049304 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.049414 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.049846 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.050014 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.057760 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:10Z 
is after 2025-08-24T17:21:41Z" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.073420 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.087149 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.106364 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ad1dd43-c2a7-4e78-9205-34f9a5f38dc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f147336d21c520b769cc2412032e87f34909104d9249047521ad2e72f489564b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15cc75da411a896570582ba77693c099b590adc36f8a41d5f43e0b94c5d6e97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed276b5c92b41fc0f7761442a51338cd71a913e99cc605598fc379cbc1451651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4c87775e72595ceaeaf9ec3c2f6136b579c1f0b2f7b1542525e14da6b424ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dba5b3c826c4891871a963069a09e7b3dceac4295aa4173204d50da0a23b2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.120925 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.133705 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.149250 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.153266 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.153298 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.153308 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.153323 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.153361 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.162739 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\
\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f420bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.177571 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:10 crc 
kubenswrapper[4973]: I0320 13:23:10.191382 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.213603 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2f7151ffba559fac0693eaecf8a2fee71159b68227809d4af62b8486ee7af3a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f7151ffba559fac0693eaecf8a2fee71159b68227809d4af62b8486ee7af3a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:22:55Z\\\",\\\"message\\\":\\\"ntroller.go:444] Built service openshift-console-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0320 13:22:55.343731 6944 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0320 13:22:55.343735 6944 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0320 13:22:55.343739 6944 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0320 13:22:55.343731 6944 services_controller.go:445] Built service openshift-console-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 13:22:55.343754 6944 services_controller.go:451] Built service openshift-console-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.88\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5
fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.255397 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.255425 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.255433 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.255446 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.255455 4973 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.357985 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.358034 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.358051 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.358066 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.358076 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.460610 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.460666 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.460684 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.460706 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.460722 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.563265 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.563313 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.563325 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.563359 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.563373 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.665466 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.665532 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.665548 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.665565 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.665577 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.768153 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.768203 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.768220 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.768240 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.768254 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.870809 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.870902 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.870920 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.870942 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.870959 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.922890 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.922934 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.922945 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.922961 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.922973 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4973]: E0320 13:23:10.943773 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.948044 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.948083 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.948095 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.948113 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.948128 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.949934 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.949956 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:10 crc kubenswrapper[4973]: E0320 13:23:10.950043 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.950053 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.950087 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:10 crc kubenswrapper[4973]: E0320 13:23:10.950207 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:10 crc kubenswrapper[4973]: E0320 13:23:10.950387 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:10 crc kubenswrapper[4973]: E0320 13:23:10.950516 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:10 crc kubenswrapper[4973]: E0320 13:23:10.963072 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.966436 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.966475 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.966485 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.966503 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.966516 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4973]: E0320 13:23:10.976894 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.979936 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.979964 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.979974 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.979994 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.980007 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:10 crc kubenswrapper[4973]: E0320 13:23:10.991902 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:10Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.996798 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.996867 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.996879 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.996894 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:10 crc kubenswrapper[4973]: I0320 13:23:10.996904 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:10Z","lastTransitionTime":"2026-03-20T13:23:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4973]: E0320 13:23:11.012909 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:11Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:11 crc kubenswrapper[4973]: E0320 13:23:11.013022 4973 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.015177 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.015218 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.015229 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.015244 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.015253 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:11Z","lastTransitionTime":"2026-03-20T13:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.117264 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.117415 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.117493 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.117586 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.118051 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:11Z","lastTransitionTime":"2026-03-20T13:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.219951 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.220023 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.220036 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.220054 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.220088 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:11Z","lastTransitionTime":"2026-03-20T13:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.322992 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.323030 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.323043 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.323060 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.323072 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:11Z","lastTransitionTime":"2026-03-20T13:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.425817 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.425859 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.425870 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.425886 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.425898 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:11Z","lastTransitionTime":"2026-03-20T13:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.528584 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.528658 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.528680 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.528704 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.528721 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:11Z","lastTransitionTime":"2026-03-20T13:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.631783 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.631911 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.631930 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.631953 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.631970 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:11Z","lastTransitionTime":"2026-03-20T13:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.734737 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.734795 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.734815 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.734839 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.734858 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:11Z","lastTransitionTime":"2026-03-20T13:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.837193 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.837240 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.837252 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.837272 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.837287 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:11Z","lastTransitionTime":"2026-03-20T13:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.939698 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.939760 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.939778 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.939803 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.939823 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:11Z","lastTransitionTime":"2026-03-20T13:23:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:11 crc kubenswrapper[4973]: I0320 13:23:11.951261 4973 scope.go:117] "RemoveContainer" containerID="2f7151ffba559fac0693eaecf8a2fee71159b68227809d4af62b8486ee7af3a6" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.042286 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.042799 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.042821 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.042845 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.042862 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:12Z","lastTransitionTime":"2026-03-20T13:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.145641 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.145690 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.145705 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.145726 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.145740 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:12Z","lastTransitionTime":"2026-03-20T13:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.248840 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.248929 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.248949 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.248976 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.248993 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:12Z","lastTransitionTime":"2026-03-20T13:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.351596 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.351633 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.351642 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.351657 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.351668 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:12Z","lastTransitionTime":"2026-03-20T13:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.454411 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.454444 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.454455 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.454469 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.454478 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:12Z","lastTransitionTime":"2026-03-20T13:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.503100 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovnkube-controller/1.log" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.505090 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerStarted","Data":"523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35"} Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.505579 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.518206 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.527707 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.545609 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.562083 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.562169 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.562191 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.562221 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.562246 4973 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:12Z","lastTransitionTime":"2026-03-20T13:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.574087 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.586716 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.601815 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.612688 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.627498 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.639100 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.653700 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.665001 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.665051 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.665067 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.665089 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.665104 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:12Z","lastTransitionTime":"2026-03-20T13:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.667685 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.680947 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:12 crc 
kubenswrapper[4973]: I0320 13:23:12.690948 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.707840 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ad1dd43-c2a7-4e78-9205-34f9a5f38dc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f147336d21c520b769cc2412032e87f34909104d9249047521ad2e72f489564b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15cc75da411a896570582ba77693c099b590adc36f8a41d5f43e0b94c5d6e97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed276b5c92b41fc0f7761442a51338cd71a913e99cc605598fc379cbc1451651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4c87775e72595ceaeaf9ec3c2f6136b579c1f0b2f7b1542525e14da6b424ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dba5b3c826c4891871a963069a09e7b3dceac4295aa4173204d50da0a23b2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.720999 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.731799 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.748795 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f7151ffba559fac0693eaecf8a2fee71159b68227809d4af62b8486ee7af3a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:22:55Z\\\",\\\"message\\\":\\\"ntroller.go:444] Built service openshift-console-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0320 13:22:55.343731 6944 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0320 13:22:55.343735 6944 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0320 13:22:55.343739 6944 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0320 13:22:55.343731 6944 services_controller.go:445] Built service openshift-console-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 13:22:55.343754 6944 services_controller.go:451] Built service openshift-console-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.88\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/
\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.767825 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.767863 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.767872 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.767886 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.767896 4973 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:12Z","lastTransitionTime":"2026-03-20T13:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.869646 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.869677 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.869687 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.869700 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.869709 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:12Z","lastTransitionTime":"2026-03-20T13:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.950562 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.950576 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.950735 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.950818 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:12 crc kubenswrapper[4973]: E0320 13:23:12.950953 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:12 crc kubenswrapper[4973]: E0320 13:23:12.951052 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.951130 4973 scope.go:117] "RemoveContainer" containerID="923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365" Mar 20 13:23:12 crc kubenswrapper[4973]: E0320 13:23:12.951144 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:12 crc kubenswrapper[4973]: E0320 13:23:12.951203 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.977229 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.977270 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.977281 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.977298 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:12 crc kubenswrapper[4973]: I0320 13:23:12.977310 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:12Z","lastTransitionTime":"2026-03-20T13:23:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.080722 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.080763 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.080773 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.080787 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.080798 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:13Z","lastTransitionTime":"2026-03-20T13:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.183016 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.183059 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.183070 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.183090 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.183103 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:13Z","lastTransitionTime":"2026-03-20T13:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.286836 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.286882 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.286890 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.286904 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.286914 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:13Z","lastTransitionTime":"2026-03-20T13:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.388993 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.389029 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.389037 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.389051 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.389059 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:13Z","lastTransitionTime":"2026-03-20T13:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.491248 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.491293 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.491301 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.491316 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.491325 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:13Z","lastTransitionTime":"2026-03-20T13:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.512466 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.514415 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70"} Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.514662 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.517098 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovnkube-controller/2.log" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.517815 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovnkube-controller/1.log" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.522298 4973 generic.go:334] "Generic (PLEG): container finished" podID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerID="523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35" exitCode=1 Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.522360 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerDied","Data":"523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35"} Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.522398 4973 scope.go:117] "RemoveContainer" containerID="2f7151ffba559fac0693eaecf8a2fee71159b68227809d4af62b8486ee7af3a6" 
Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.523057 4973 scope.go:117] "RemoveContainer" containerID="523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35" Mar 20 13:23:13 crc kubenswrapper[4973]: E0320 13:23:13.523219 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.527284 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.539732 4973 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743
e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.549013 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.561155 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.574501 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a
546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.585645 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.593747 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.593772 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.593780 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.593793 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.593802 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:13Z","lastTransitionTime":"2026-03-20T13:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.596393 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.605679 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.618725 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.630807 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.642885 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.652863 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.671268 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc 
kubenswrapper[4973]: I0320 13:23:13.684495 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.696515 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.696538 4973 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.696547 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.696577 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.696587 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:13Z","lastTransitionTime":"2026-03-20T13:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.706965 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ad1dd43-c2a7-4e78-9205-34f9a5f38dc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f147336d21c520b769cc2412032e87f34909104d9249047521ad2e72f489564b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15cc75da411a896570582ba77693c099b590adc36f8a41d5f43e0b94c5d6e97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed276b5c92b41fc0f7761442a51338cd71a913e99cc605598fc379cbc1451651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4c87775e72595ceaeaf9ec3c2f6136b579c1f0b2f7b1542525e14da6b424ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dba5b3c826c4891871a963069a09e7b3dceac4295aa4173204d50da0a23b2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.721427 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.737701 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f7151ffba559fac0693eaecf8a2fee71159b68227809d4af62b8486ee7af3a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:22:55Z\\\",\\\"message\\\":\\\"ntroller.go:444] Built service openshift-console-operator/metrics LB per-node configs for 
network=default: []services.lbConfig(nil)\\\\nI0320 13:22:55.343731 6944 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0320 13:22:55.343735 6944 obj_retry.go:386] Retry successful for *v1.Pod openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0320 13:22:55.343739 6944 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0320 13:22:55.343731 6944 services_controller.go:445] Built service openshift-console-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 13:22:55.343754 6944 services_controller.go:451] Built service openshift-console-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.88\\\\\\\", Port:443, Template:(*services.Template)(nil)}, 
Targets:[]services\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/
\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.747452 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.756386 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.768260 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.781678 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.790970 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.803880 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
3:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.804198 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.804245 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.804305 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.804323 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.804352 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:13Z","lastTransitionTime":"2026-03-20T13:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.815436 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z 
is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.826668 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"contai
nerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"p
odIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.836244 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.845620 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc 
kubenswrapper[4973]: I0320 13:23:13.855185 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.872537 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ad1dd43-c2a7-4e78-9205-34f9a5f38dc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f147336d21c520b769cc2412032e87f34909104d9249047521ad2e72f489564b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15cc75da411a896570582ba77693c099b590adc36f8a41d5f43e0b94c5d6e97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed276b5c92b41fc0f7761442a51338cd71a913e99cc605598fc379cbc1451651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4c87775e72595ceaeaf9ec3c2f6136b579c1f0b2f7b1542525e14da6b424ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dba5b3c826c4891871a963069a09e7b3dceac4295aa4173204d50da0a23b2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.888120 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.901012 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.907484 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.907523 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.907534 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 
13:23:13.907550 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.907562 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:13Z","lastTransitionTime":"2026-03-20T13:23:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.912146 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.922233 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:13 crc kubenswrapper[4973]: I0320 13:23:13.943090 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f7151ffba559fac0693eaecf8a2fee71159b68227809d4af62b8486ee7af3a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:22:55Z\\\",\\\"message\\\":\\\"ntroller.go:444] Built service openshift-console-operator/metrics LB per-node configs for network=default: []services.lbConfig(nil)\\\\nI0320 13:22:55.343731 6944 ovn.go:134] Ensuring zone local for Pod openshift-kube-apiserver/kube-apiserver-crc in node crc\\\\nI0320 13:22:55.343735 6944 obj_retry.go:386] Retry successful for *v1.Pod 
openshift-kube-apiserver/kube-apiserver-crc after 0 failed attempt(s)\\\\nI0320 13:22:55.343739 6944 default_network_controller.go:776] Recording success event on pod openshift-kube-apiserver/kube-apiserver-crc\\\\nI0320 13:22:55.343731 6944 services_controller.go:445] Built service openshift-console-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nI0320 13:22:55.343754 6944 services_controller.go:451] Built service openshift-console-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-console-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-console-operator/metrics\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.4.88\\\\\\\", Port:443, Template:(*services.Template)(nil)}, Targets:[]services\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:54Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:23:12Z\\\",\\\"message\\\":\\\"g{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.140\\\\\\\", Port:50051, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 13:23:12.813392 7144 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:23:12.813373 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:13Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.009657 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.009703 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.009713 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.009730 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.009741 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:14Z","lastTransitionTime":"2026-03-20T13:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.111546 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.111575 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.111583 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.111595 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.111605 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:14Z","lastTransitionTime":"2026-03-20T13:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.213168 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.213193 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.213203 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.213215 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.213223 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:14Z","lastTransitionTime":"2026-03-20T13:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.314756 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.314791 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.314802 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.314817 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.314825 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:14Z","lastTransitionTime":"2026-03-20T13:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.417746 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.417793 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.417806 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.417824 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.417834 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:14Z","lastTransitionTime":"2026-03-20T13:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.519691 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.519720 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.519727 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.519740 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.519748 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:14Z","lastTransitionTime":"2026-03-20T13:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.527610 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovnkube-controller/2.log" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.534155 4973 scope.go:117] "RemoveContainer" containerID="523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35" Mar 20 13:23:14 crc kubenswrapper[4973]: E0320 13:23:14.534546 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.563407 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.580851 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.604309 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ad1dd43-c2a7-4e78-9205-34f9a5f38dc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f147336d21c520b769cc2412032e87f34909104d9249047521ad2e72f489564b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},
{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15cc75da411a896570582ba77693c099b590adc36f8a41d5f43e0b94c5d6e97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed276b5c92b41fc0f7761442a51338cd71a913e99cc605598fc379cbc1451651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4c87775e72595ceaeaf9ec3c2f6136b579c1f0b2f7b1542525e14da6b424ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dba5b3c826c4891871a963069a09e7b3dceac4295aa4173204d50da0a23b2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"star
tedAt\\\":\\\"2026-03-20T13:21:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.619626 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.623109 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.623181 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.623193 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.623210 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.623222 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:14Z","lastTransitionTime":"2026-03-20T13:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.633381 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.645645 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.657800 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.676834 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:23:12Z\\\",\\\"message\\\":\\\"g{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.140\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 13:23:12.813392 7144 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:23:12.813373 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5
fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.686511 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.697438 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.709264 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.723168 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 
13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.725235 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.725293 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.725311 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.725336 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.725379 4973 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:14Z","lastTransitionTime":"2026-03-20T13:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.736706 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.749632 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.766177 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.772516 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:14 crc kubenswrapper[4973]: E0320 13:23:14.772593 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:23:46.772573657 +0000 UTC m=+147.516243401 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.772902 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:14 crc kubenswrapper[4973]: E0320 13:23:14.773007 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:23:14 
crc kubenswrapper[4973]: E0320 13:23:14.773022 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:23:14 crc kubenswrapper[4973]: E0320 13:23:14.773034 4973 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.773093 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.773140 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:14 crc kubenswrapper[4973]: E0320 13:23:14.773161 4973 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:23:14 crc kubenswrapper[4973]: E0320 13:23:14.773192 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 
nodeName:}" failed. No retries permitted until 2026-03-20 13:23:46.773142173 +0000 UTC m=+147.516811917 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:14 crc kubenswrapper[4973]: E0320 13:23:14.773231 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:46.773223315 +0000 UTC m=+147.516893059 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:23:14 crc kubenswrapper[4973]: E0320 13:23:14.773252 4973 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:23:14 crc kubenswrapper[4973]: E0320 13:23:14.773370 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:46.773313538 +0000 UTC m=+147.516983322 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.787212 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0
,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volume
Mounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc
84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\
\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.803361 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:14Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.828214 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.828249 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.828257 4973 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.828274 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.828283 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:14Z","lastTransitionTime":"2026-03-20T13:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.873469 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs\") pod \"network-metrics-daemon-7kszd\" (UID: \"93c5ad90-87bf-4668-9d87-34e676b15783\") " pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.873517 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:14 crc kubenswrapper[4973]: E0320 13:23:14.873629 4973 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:23:14 crc kubenswrapper[4973]: E0320 13:23:14.873691 4973 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs podName:93c5ad90-87bf-4668-9d87-34e676b15783 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:46.873672798 +0000 UTC m=+147.617342542 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs") pod "network-metrics-daemon-7kszd" (UID: "93c5ad90-87bf-4668-9d87-34e676b15783") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 13:23:14 crc kubenswrapper[4973]: E0320 13:23:14.873637 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:23:14 crc kubenswrapper[4973]: E0320 13:23:14.873747 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:23:14 crc kubenswrapper[4973]: E0320 13:23:14.873761 4973 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:14 crc kubenswrapper[4973]: E0320 13:23:14.873820 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:23:46.873809252 +0000 UTC m=+147.617478996 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.930966 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.931003 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.931013 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.931027 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.931073 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:14Z","lastTransitionTime":"2026-03-20T13:23:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.950551 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.950551 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:14 crc kubenswrapper[4973]: E0320 13:23:14.950663 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:14 crc kubenswrapper[4973]: E0320 13:23:14.950723 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.950559 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:14 crc kubenswrapper[4973]: E0320 13:23:14.950794 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:14 crc kubenswrapper[4973]: I0320 13:23:14.951391 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:14 crc kubenswrapper[4973]: E0320 13:23:14.951996 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.033485 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.033525 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.033536 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.033548 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.033557 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:15Z","lastTransitionTime":"2026-03-20T13:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.136065 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.136102 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.136110 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.136125 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.136134 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:15Z","lastTransitionTime":"2026-03-20T13:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.237861 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.237917 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.237937 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.237960 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.237977 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:15Z","lastTransitionTime":"2026-03-20T13:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.340618 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.340671 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.340683 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.340706 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.340732 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:15Z","lastTransitionTime":"2026-03-20T13:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.442997 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.443029 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.443040 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.443054 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.443063 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:15Z","lastTransitionTime":"2026-03-20T13:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.545077 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.545118 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.545129 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.545145 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.545156 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:15Z","lastTransitionTime":"2026-03-20T13:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.649650 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.649716 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.649734 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.649759 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.649776 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:15Z","lastTransitionTime":"2026-03-20T13:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.752325 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.752378 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.752388 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.752402 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.752413 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:15Z","lastTransitionTime":"2026-03-20T13:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.855159 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.855246 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.855270 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.855300 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.855321 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:15Z","lastTransitionTime":"2026-03-20T13:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.958335 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.958397 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.958406 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.958421 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:15 crc kubenswrapper[4973]: I0320 13:23:15.958432 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:15Z","lastTransitionTime":"2026-03-20T13:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.061574 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.062403 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.062554 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.062699 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.062846 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:16Z","lastTransitionTime":"2026-03-20T13:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.164934 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.165236 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.165315 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.165433 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.165509 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:16Z","lastTransitionTime":"2026-03-20T13:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.268017 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.268061 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.268071 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.268085 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.268093 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:16Z","lastTransitionTime":"2026-03-20T13:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.370520 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.370578 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.370595 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.370620 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.370637 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:16Z","lastTransitionTime":"2026-03-20T13:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.473267 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.473688 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.473862 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.474013 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.474175 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:16Z","lastTransitionTime":"2026-03-20T13:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.577278 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.577363 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.577402 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.577455 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.577473 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:16Z","lastTransitionTime":"2026-03-20T13:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.680175 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.680209 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.680219 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.680236 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.680245 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:16Z","lastTransitionTime":"2026-03-20T13:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.782584 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.782636 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.782653 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.782680 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.782696 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:16Z","lastTransitionTime":"2026-03-20T13:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.885710 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.885778 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.885804 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.885834 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.885850 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:16Z","lastTransitionTime":"2026-03-20T13:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.950568 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.950619 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.950657 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.950624 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:16 crc kubenswrapper[4973]: E0320 13:23:16.950741 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:16 crc kubenswrapper[4973]: E0320 13:23:16.950844 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:16 crc kubenswrapper[4973]: E0320 13:23:16.950935 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:16 crc kubenswrapper[4973]: E0320 13:23:16.951050 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.988915 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.988979 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.989003 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.989033 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:16 crc kubenswrapper[4973]: I0320 13:23:16.989054 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:16Z","lastTransitionTime":"2026-03-20T13:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.091889 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.091920 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.091932 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.091955 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.091971 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:17Z","lastTransitionTime":"2026-03-20T13:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.194406 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.194446 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.194455 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.194468 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.194478 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:17Z","lastTransitionTime":"2026-03-20T13:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.296694 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.296739 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.296748 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.296766 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.296777 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:17Z","lastTransitionTime":"2026-03-20T13:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.398838 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.398880 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.398888 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.398906 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.398915 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:17Z","lastTransitionTime":"2026-03-20T13:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.501284 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.501322 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.501331 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.501356 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.501365 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:17Z","lastTransitionTime":"2026-03-20T13:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.603613 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.603673 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.603689 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.603714 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.603731 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:17Z","lastTransitionTime":"2026-03-20T13:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.705666 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.705699 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.705709 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.705722 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.705730 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:17Z","lastTransitionTime":"2026-03-20T13:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.808056 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.808366 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.808432 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.808505 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.808564 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:17Z","lastTransitionTime":"2026-03-20T13:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.911522 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.911864 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.912036 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.912165 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:17 crc kubenswrapper[4973]: I0320 13:23:17.912318 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:17Z","lastTransitionTime":"2026-03-20T13:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.014918 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.015180 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.015274 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.015368 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.015430 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:18Z","lastTransitionTime":"2026-03-20T13:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.118409 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.118462 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.118479 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.118500 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.118517 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:18Z","lastTransitionTime":"2026-03-20T13:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.221657 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.221706 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.221722 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.221748 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.221764 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:18Z","lastTransitionTime":"2026-03-20T13:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.324412 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.324491 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.324504 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.324523 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.324559 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:18Z","lastTransitionTime":"2026-03-20T13:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.427731 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.427819 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.427836 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.427859 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.427902 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:18Z","lastTransitionTime":"2026-03-20T13:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.531295 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.531329 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.531358 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.531372 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.531382 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:18Z","lastTransitionTime":"2026-03-20T13:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.633245 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.633295 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.633304 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.633317 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.633326 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:18Z","lastTransitionTime":"2026-03-20T13:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.736488 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.736535 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.736544 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.736558 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.736567 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:18Z","lastTransitionTime":"2026-03-20T13:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.838891 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.838939 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.838948 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.838965 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.838976 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:18Z","lastTransitionTime":"2026-03-20T13:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.941953 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.942002 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.942017 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.942036 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.942049 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:18Z","lastTransitionTime":"2026-03-20T13:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.950211 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.950218 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:18 crc kubenswrapper[4973]: E0320 13:23:18.950352 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.950232 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:18 crc kubenswrapper[4973]: I0320 13:23:18.950222 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:18 crc kubenswrapper[4973]: E0320 13:23:18.950487 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:18 crc kubenswrapper[4973]: E0320 13:23:18.950648 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:18 crc kubenswrapper[4973]: E0320 13:23:18.950705 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.044097 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.044154 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.044171 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.044197 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.044213 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:19Z","lastTransitionTime":"2026-03-20T13:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.147396 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.147468 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.147491 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.147522 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.147544 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:19Z","lastTransitionTime":"2026-03-20T13:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.250314 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.250412 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.250429 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.250452 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.250469 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:19Z","lastTransitionTime":"2026-03-20T13:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.354473 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.354525 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.354534 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.354552 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.354564 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:19Z","lastTransitionTime":"2026-03-20T13:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.456689 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.456770 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.456797 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.456830 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.456853 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:19Z","lastTransitionTime":"2026-03-20T13:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.559661 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.559711 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.559727 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.559748 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.559765 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:19Z","lastTransitionTime":"2026-03-20T13:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.662545 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.662619 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.662644 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.662672 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.662692 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:19Z","lastTransitionTime":"2026-03-20T13:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.765109 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.765171 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.765188 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.765215 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.765231 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:19Z","lastTransitionTime":"2026-03-20T13:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.868164 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.868234 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.868259 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.868292 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.868314 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:19Z","lastTransitionTime":"2026-03-20T13:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:19 crc kubenswrapper[4973]: E0320 13:23:19.968949 4973 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.972612 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:19Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:19 crc kubenswrapper[4973]: I0320 13:23:19.993187 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:19Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:20 crc kubenswrapper[4973]: I0320 13:23:20.012388 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:20Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:20 crc kubenswrapper[4973]: I0320 13:23:20.033411 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:20Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:20 crc kubenswrapper[4973]: E0320 13:23:20.044776 4973 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:23:20 crc kubenswrapper[4973]: I0320 13:23:20.047911 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f420bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:20Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:20 crc kubenswrapper[4973]: I0320 13:23:20.062053 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:20Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:20 crc 
kubenswrapper[4973]: I0320 13:23:20.074962 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:20Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:20 crc kubenswrapper[4973]: I0320 13:23:20.105076 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ad1dd43-c2a7-4e78-9205-34f9a5f38dc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f147336d21c520b769cc2412032e87f34909104d9249047521ad2e72f489564b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15cc75da411a896570582ba77693c099b590adc36f8a41d5f43e0b94c5d6e97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed276b5c92b41fc0f7761442a51338cd71a913e99cc605598fc379cbc1451651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4c87775e72595ceaeaf9ec3c2f6136b579c1f0b2f7b1542525e14da6b424ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dba5b3c826c4891871a963069a09e7b3dceac4295aa4173204d50da0a23b2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:20Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:20 crc kubenswrapper[4973]: I0320 13:23:20.123634 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:23:12Z\\\",\\\"message\\\":\\\"g{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.140\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 13:23:12.813392 7144 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:23:12.813373 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5
fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:20Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:20 crc kubenswrapper[4973]: I0320 13:23:20.134804 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:20Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:20 crc kubenswrapper[4973]: I0320 13:23:20.147388 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:20Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:20 crc kubenswrapper[4973]: I0320 13:23:20.158091 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:20Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:20 crc kubenswrapper[4973]: I0320 13:23:20.170787 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:23:20Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:20 crc kubenswrapper[4973]: I0320 13:23:20.187536 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-r
elease\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T13:23:20Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:20 crc kubenswrapper[4973]: I0320 13:23:20.202452 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:20Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:20 crc kubenswrapper[4973]: I0320 13:23:20.216042 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:20Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:20 crc kubenswrapper[4973]: I0320 13:23:20.227506 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:20Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:20 crc kubenswrapper[4973]: I0320 13:23:20.949970 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:20 crc kubenswrapper[4973]: I0320 13:23:20.950103 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:20 crc kubenswrapper[4973]: I0320 13:23:20.950132 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:20 crc kubenswrapper[4973]: I0320 13:23:20.950197 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:20 crc kubenswrapper[4973]: E0320 13:23:20.951085 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:20 crc kubenswrapper[4973]: E0320 13:23:20.951229 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:20 crc kubenswrapper[4973]: E0320 13:23:20.951360 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:20 crc kubenswrapper[4973]: E0320 13:23:20.951675 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.166291 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.166329 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.166363 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.166377 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.166386 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:21Z","lastTransitionTime":"2026-03-20T13:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:21 crc kubenswrapper[4973]: E0320 13:23:21.184257 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.189154 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.189181 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.189190 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.189204 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.189214 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:21Z","lastTransitionTime":"2026-03-20T13:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:21 crc kubenswrapper[4973]: E0320 13:23:21.204497 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.209408 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.209449 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.209472 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.209491 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.209505 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:21Z","lastTransitionTime":"2026-03-20T13:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:21 crc kubenswrapper[4973]: E0320 13:23:21.227793 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.232100 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.232155 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.232168 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.232185 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.232196 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:21Z","lastTransitionTime":"2026-03-20T13:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:21 crc kubenswrapper[4973]: E0320 13:23:21.249583 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.253629 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.253680 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.253692 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.253710 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:21 crc kubenswrapper[4973]: I0320 13:23:21.253722 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:21Z","lastTransitionTime":"2026-03-20T13:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:21 crc kubenswrapper[4973]: E0320 13:23:21.269082 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:21Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:21 crc kubenswrapper[4973]: E0320 13:23:21.269192 4973 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:23:22 crc kubenswrapper[4973]: I0320 13:23:22.950572 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:22 crc kubenswrapper[4973]: I0320 13:23:22.950617 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:22 crc kubenswrapper[4973]: I0320 13:23:22.950659 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:22 crc kubenswrapper[4973]: I0320 13:23:22.950625 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:22 crc kubenswrapper[4973]: E0320 13:23:22.950782 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:22 crc kubenswrapper[4973]: E0320 13:23:22.950910 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:22 crc kubenswrapper[4973]: E0320 13:23:22.951069 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:22 crc kubenswrapper[4973]: E0320 13:23:22.951225 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:24 crc kubenswrapper[4973]: I0320 13:23:24.949938 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:24 crc kubenswrapper[4973]: I0320 13:23:24.949986 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:24 crc kubenswrapper[4973]: I0320 13:23:24.949916 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:24 crc kubenswrapper[4973]: I0320 13:23:24.949943 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:24 crc kubenswrapper[4973]: E0320 13:23:24.950114 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:24 crc kubenswrapper[4973]: E0320 13:23:24.950247 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:24 crc kubenswrapper[4973]: E0320 13:23:24.950512 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:24 crc kubenswrapper[4973]: E0320 13:23:24.950606 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:25 crc kubenswrapper[4973]: E0320 13:23:25.045829 4973 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:23:25 crc kubenswrapper[4973]: I0320 13:23:25.305783 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:23:25 crc kubenswrapper[4973]: I0320 13:23:25.321993 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:25 crc kubenswrapper[4973]: I0320 13:23:25.335547 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:25 crc kubenswrapper[4973]: I0320 13:23:25.344704 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:25 crc kubenswrapper[4973]: I0320 13:23:25.364048 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:25 crc kubenswrapper[4973]: I0320 13:23:25.384801 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1a
fba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\
"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a
546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:25 crc kubenswrapper[4973]: I0320 13:23:25.406026 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bc
fba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:25 crc kubenswrapper[4973]: I0320 13:23:25.417204 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:25 crc kubenswrapper[4973]: I0320 13:23:25.429766 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
3:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:25 crc kubenswrapper[4973]: I0320 13:23:25.445806 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:25 crc kubenswrapper[4973]: I0320 13:23:25.479839 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:25 crc kubenswrapper[4973]: I0320 13:23:25.506636 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:25 crc kubenswrapper[4973]: I0320 13:23:25.524410 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:25 crc kubenswrapper[4973]: I0320 13:23:25.535950 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:25 crc 
kubenswrapper[4973]: I0320 13:23:25.546322 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:25 crc kubenswrapper[4973]: I0320 13:23:25.567183 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ad1dd43-c2a7-4e78-9205-34f9a5f38dc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f147336d21c520b769cc2412032e87f34909104d9249047521ad2e72f489564b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15cc75da411a896570582ba77693c099b590adc36f8a41d5f43e0b94c5d6e97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed276b5c92b41fc0f7761442a51338cd71a913e99cc605598fc379cbc1451651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4c87775e72595ceaeaf9ec3c2f6136b579c1f0b2f7b1542525e14da6b424ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dba5b3c826c4891871a963069a09e7b3dceac4295aa4173204d50da0a23b2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:25 crc kubenswrapper[4973]: I0320 13:23:25.579711 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:25 crc kubenswrapper[4973]: I0320 13:23:25.599873 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:23:12Z\\\",\\\"message\\\":\\\"g{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.140\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 13:23:12.813392 7144 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:23:12.813373 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5
fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:25Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:25 crc kubenswrapper[4973]: I0320 13:23:25.951175 4973 scope.go:117] "RemoveContainer" containerID="523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35" Mar 20 13:23:25 crc kubenswrapper[4973]: E0320 13:23:25.951361 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" Mar 20 13:23:26 crc kubenswrapper[4973]: I0320 13:23:26.950562 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:26 crc kubenswrapper[4973]: I0320 13:23:26.950635 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:26 crc kubenswrapper[4973]: I0320 13:23:26.950578 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:26 crc kubenswrapper[4973]: I0320 13:23:26.950578 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:26 crc kubenswrapper[4973]: E0320 13:23:26.950751 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:26 crc kubenswrapper[4973]: E0320 13:23:26.950881 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:26 crc kubenswrapper[4973]: E0320 13:23:26.951012 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:26 crc kubenswrapper[4973]: E0320 13:23:26.951083 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:28 crc kubenswrapper[4973]: I0320 13:23:28.950018 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:28 crc kubenswrapper[4973]: I0320 13:23:28.950068 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:28 crc kubenswrapper[4973]: I0320 13:23:28.950124 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:28 crc kubenswrapper[4973]: E0320 13:23:28.950161 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:28 crc kubenswrapper[4973]: I0320 13:23:28.950018 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:28 crc kubenswrapper[4973]: E0320 13:23:28.950276 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:28 crc kubenswrapper[4973]: E0320 13:23:28.950377 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:28 crc kubenswrapper[4973]: E0320 13:23:28.950444 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:29 crc kubenswrapper[4973]: I0320 13:23:29.966549 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath
\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-ap
iserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:29 crc kubenswrapper[4973]: I0320 13:23:29.977786 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:29 crc kubenswrapper[4973]: I0320 13:23:29.991497 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
3:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:29Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:30 crc kubenswrapper[4973]: I0320 13:23:30.004455 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:30 crc kubenswrapper[4973]: I0320 13:23:30.019978 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d
1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:
22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:30 crc kubenswrapper[4973]: I0320 13:23:30.039150 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:30 crc kubenswrapper[4973]: E0320 13:23:30.046636 4973 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:23:30 crc kubenswrapper[4973]: I0320 13:23:30.054429 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secr
ets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f420bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:30 crc kubenswrapper[4973]: I0320 13:23:30.066623 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:30 crc 
kubenswrapper[4973]: I0320 13:23:30.079818 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:30 crc kubenswrapper[4973]: I0320 13:23:30.103083 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ad1dd43-c2a7-4e78-9205-34f9a5f38dc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f147336d21c520b769cc2412032e87f34909104d9249047521ad2e72f489564b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15cc75da411a896570582ba77693c099b590adc36f8a41d5f43e0b94c5d6e97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed276b5c92b41fc0f7761442a51338cd71a913e99cc605598fc379cbc1451651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4c87775e72595ceaeaf9ec3c2f6136b579c1f0b2f7b1542525e14da6b424ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dba5b3c826c4891871a963069a09e7b3dceac4295aa4173204d50da0a23b2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:30 crc kubenswrapper[4973]: I0320 13:23:30.116468 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:30 crc kubenswrapper[4973]: I0320 13:23:30.129998 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:30 crc kubenswrapper[4973]: I0320 13:23:30.140935 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:30 crc kubenswrapper[4973]: I0320 13:23:30.163391 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:23:12Z\\\",\\\"message\\\":\\\"g{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.140\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 13:23:12.813392 7144 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:23:12.813373 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5
fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:30 crc kubenswrapper[4973]: I0320 13:23:30.173790 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:30 crc kubenswrapper[4973]: I0320 13:23:30.183964 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:30 crc kubenswrapper[4973]: I0320 13:23:30.196607 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:30Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:30 crc kubenswrapper[4973]: I0320 13:23:30.950429 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:30 crc kubenswrapper[4973]: I0320 13:23:30.950485 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:30 crc kubenswrapper[4973]: E0320 13:23:30.950580 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:30 crc kubenswrapper[4973]: I0320 13:23:30.950441 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:30 crc kubenswrapper[4973]: I0320 13:23:30.950624 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:30 crc kubenswrapper[4973]: E0320 13:23:30.950705 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:30 crc kubenswrapper[4973]: E0320 13:23:30.950797 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:30 crc kubenswrapper[4973]: E0320 13:23:30.950856 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.424982 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.425463 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.425482 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.425503 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.425518 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:31Z","lastTransitionTime":"2026-03-20T13:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:31 crc kubenswrapper[4973]: E0320 13:23:31.442420 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.449182 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.449229 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.449244 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.449269 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.449284 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:31Z","lastTransitionTime":"2026-03-20T13:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:31 crc kubenswrapper[4973]: E0320 13:23:31.469702 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.474530 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.474619 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.474631 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.474649 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.474689 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:31Z","lastTransitionTime":"2026-03-20T13:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:31 crc kubenswrapper[4973]: E0320 13:23:31.494099 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.498795 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.498854 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.498873 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.498899 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.498916 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:31Z","lastTransitionTime":"2026-03-20T13:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:31 crc kubenswrapper[4973]: E0320 13:23:31.518917 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.523778 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.523899 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.523913 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.523981 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.523994 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:31Z","lastTransitionTime":"2026-03-20T13:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:31 crc kubenswrapper[4973]: E0320 13:23:31.546538 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: E0320 13:23:31.546869 4973 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.589140 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-57hnn_35802646-2926-42b8-913a-986001818f97/kube-multus/0.log" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.589223 4973 generic.go:334] "Generic (PLEG): container finished" podID="35802646-2926-42b8-913a-986001818f97" containerID="410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225" exitCode=1 Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.589276 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-57hnn" event={"ID":"35802646-2926-42b8-913a-986001818f97","Type":"ContainerDied","Data":"410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225"} Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.589932 4973 scope.go:117] "RemoveContainer" containerID="410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.609430 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.634726 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ad1dd43-c2a7-4e78-9205-34f9a5f38dc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f147336d21c520b769cc2412032e87f34909104d9249047521ad2e72f489564b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15cc75da411a896570582ba77693c099b590adc36f8a41d5f43e0b94c5d6e97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed276b5c92b41fc0f7761442a51338cd71a913e99cc605598fc379cbc1451651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4c87775e72595ceaeaf9ec3c2f6136b579c1f0b2f7b1542525e14da6b424ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dba5b3c826c4891871a963069a09e7b3dceac4295aa4173204d50da0a23b2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.656241 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.675526 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.697839 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.713632 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.729509 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc 
kubenswrapper[4973]: I0320 13:23:31.743162 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.761915 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:23:12Z\\\",\\\"message\\\":\\\"g{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.140\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 13:23:12.813392 7144 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:23:12.813373 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5
fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.773525 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.782556 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.794989 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.807960 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bc
fba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.818892 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.830578 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
3:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.845026 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"2026-03-20T13:22:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a426a4cc-b0a1-4217-8aaf-9af3a933f09f\\\\n2026-03-20T13:22:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a426a4cc-b0a1-4217-8aaf-9af3a933f09f to /host/opt/cni/bin/\\\\n2026-03-20T13:22:46Z [verbose] multus-daemon started\\\\n2026-03-20T13:22:46Z [verbose] Readiness Indicator file check\\\\n2026-03-20T13:23:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:31 crc kubenswrapper[4973]: I0320 13:23:31.858497 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"
kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddf
bb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\
\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:31Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.595586 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-57hnn_35802646-2926-42b8-913a-986001818f97/kube-multus/0.log" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.595678 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-57hnn" event={"ID":"35802646-2926-42b8-913a-986001818f97","Type":"ContainerStarted","Data":"47617dca1598ac1606a2c6a9c1e00cd3d4acb516976b6b2d685ae48fa382baf1"} Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.618563 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bc
fba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.662284 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.684427 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
3:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.703461 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47617dca1598ac1606a2c6a9c1e00cd3d4acb516976b6b2d685ae48fa382baf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"2026-03-20T13:22:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a426a4cc-b0a1-4217-8aaf-9af3a933f09f\\\\n2026-03-20T13:22:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a426a4cc-b0a1-4217-8aaf-9af3a933f09f to /host/opt/cni/bin/\\\\n2026-03-20T13:22:46Z [verbose] multus-daemon started\\\\n2026-03-20T13:22:46Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:23:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.732413 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72
006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.751555 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.769651 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:32 crc 
kubenswrapper[4973]: I0320 13:23:32.786597 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.814791 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ad1dd43-c2a7-4e78-9205-34f9a5f38dc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f147336d21c520b769cc2412032e87f34909104d9249047521ad2e72f489564b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15cc75da411a896570582ba77693c099b590adc36f8a41d5f43e0b94c5d6e97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed276b5c92b41fc0f7761442a51338cd71a913e99cc605598fc379cbc1451651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4c87775e72595ceaeaf9ec3c2f6136b579c1f0b2f7b1542525e14da6b424ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dba5b3c826c4891871a963069a09e7b3dceac4295aa4173204d50da0a23b2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.835025 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.854480 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.872558 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.887935 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.918167 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:23:12Z\\\",\\\"message\\\":\\\"g{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.140\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 13:23:12.813392 7144 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:23:12.813373 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5
fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.930893 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.947253 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.950330 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.950416 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.950378 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.950522 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:32 crc kubenswrapper[4973]: E0320 13:23:32.950702 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:32 crc kubenswrapper[4973]: E0320 13:23:32.951068 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:32 crc kubenswrapper[4973]: E0320 13:23:32.951247 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:32 crc kubenswrapper[4973]: E0320 13:23:32.951368 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:32 crc kubenswrapper[4973]: I0320 13:23:32.963681 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a5
3703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:32Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:34 crc kubenswrapper[4973]: I0320 13:23:34.950006 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:34 crc kubenswrapper[4973]: I0320 13:23:34.950106 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:34 crc kubenswrapper[4973]: I0320 13:23:34.950025 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:34 crc kubenswrapper[4973]: E0320 13:23:34.950128 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:34 crc kubenswrapper[4973]: I0320 13:23:34.950006 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:34 crc kubenswrapper[4973]: E0320 13:23:34.950379 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:34 crc kubenswrapper[4973]: E0320 13:23:34.950468 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:34 crc kubenswrapper[4973]: E0320 13:23:34.950575 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:34 crc kubenswrapper[4973]: I0320 13:23:34.961029 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 13:23:35 crc kubenswrapper[4973]: E0320 13:23:35.047507 4973 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:23:36 crc kubenswrapper[4973]: I0320 13:23:36.950498 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:36 crc kubenswrapper[4973]: E0320 13:23:36.951556 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:36 crc kubenswrapper[4973]: I0320 13:23:36.950593 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:36 crc kubenswrapper[4973]: E0320 13:23:36.951691 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:36 crc kubenswrapper[4973]: I0320 13:23:36.951145 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:36 crc kubenswrapper[4973]: E0320 13:23:36.951794 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:36 crc kubenswrapper[4973]: I0320 13:23:36.950555 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:36 crc kubenswrapper[4973]: E0320 13:23:36.952593 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:36 crc kubenswrapper[4973]: I0320 13:23:36.961646 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 13:23:38 crc kubenswrapper[4973]: I0320 13:23:38.949912 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:38 crc kubenswrapper[4973]: I0320 13:23:38.949933 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:38 crc kubenswrapper[4973]: I0320 13:23:38.949965 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:38 crc kubenswrapper[4973]: I0320 13:23:38.950384 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:38 crc kubenswrapper[4973]: E0320 13:23:38.950326 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:38 crc kubenswrapper[4973]: E0320 13:23:38.950576 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:38 crc kubenswrapper[4973]: E0320 13:23:38.950601 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:38 crc kubenswrapper[4973]: E0320 13:23:38.950733 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:38 crc kubenswrapper[4973]: I0320 13:23:38.951708 4973 scope.go:117] "RemoveContainer" containerID="523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.621489 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovnkube-controller/2.log" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.623725 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerStarted","Data":"39e07ba1d363ef9be20296cc96d9e4c8166d43a08ded9d617e7f41dab15dbab3"} Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.624134 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:23:39 crc 
kubenswrapper[4973]: I0320 13:23:39.638265 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.12
6.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f
7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/ru
n/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\
":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\
\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.650226 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/st
atic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery 
information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"host
IPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.659818 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.669295 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
3:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.680212 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47617dca1598ac1606a2c6a9c1e00cd3d4acb516976b6b2d685ae48fa382baf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"2026-03-20T13:22:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a426a4cc-b0a1-4217-8aaf-9af3a933f09f\\\\n2026-03-20T13:22:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a426a4cc-b0a1-4217-8aaf-9af3a933f09f to /host/opt/cni/bin/\\\\n2026-03-20T13:22:46Z [verbose] multus-daemon started\\\\n2026-03-20T13:22:46Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:23:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.690707 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.700872 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.711934 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.721573 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc 
kubenswrapper[4973]: I0320 13:23:39.731052 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.746817 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ad1dd43-c2a7-4e78-9205-34f9a5f38dc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f147336d21c520b769cc2412032e87f34909104d9249047521ad2e72f489564b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15cc75da411a896570582ba77693c099b590adc36f8a41d5f43e0b94c5d6e97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed276b5c92b41fc0f7761442a51338cd71a913e99cc605598fc379cbc1451651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4c87775e72595ceaeaf9ec3c2f6136b579c1f0b2f7b1542525e14da6b424ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dba5b3c826c4891871a963069a09e7b3dceac4295aa4173204d50da0a23b2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.759512 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb39ce9-8aab-40c4-83ca-67d6aeaa3261\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e67ab19704e074f39eba08b8d61e69ddca313f486f07b8f8aa307aa35f3931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0da83b4a45a3b65af0d17acc529fb5991668c1603f5f6b88b1476f579ad029e0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:21:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:21:22.128927 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 13:21:22.134975 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:21:22.169407 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:21:22.174062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:21:50.608438 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:21:50.608529 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:50Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787de60574f16b2026ae38a99d60b46b6329107ce49531d094fb400a8b010e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49856b1129ff41ad2951ef95db8bc9cc9faff376293559651f05b9905fb3c11c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49fb5f33c67890448955d4851c677b66d07065ef5711938d3c96aae65082166d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.773147 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.786090 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.804254 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e07ba1d363ef9be20296cc96d9e4c8166d43a08ded9d617e7f41dab15dbab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:23:12Z\\\",\\\"message\\\":\\\"g{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.140\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 13:23:12.813392 7144 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:23:12.813373 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.822433 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d64e13c0-e87e-4e1e-b47d-c01a84e81e97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2599b537b19ffc88d8144a6613b7812f3ab5e582d3ae0706db4919c06465c1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6a3b9c865bf4d2dbabc06a94edba659a61fe90590cb2ced9f0284d6b204bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73942ac042b1d677693f81c2a365d5c6922c4843073c5754aee0c6007a87814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58875c22f02f77b2e02162cc91ff1ccf330bae82634752c372e67bdfb78e06ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://58875c22f02f77b2e02162cc91ff1ccf330bae82634752c372e67bdfb78e06ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.832421 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.842400 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.855513 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.967526 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d64e13c0-e87e-4e1e-b47d-c01a84e81e97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2599b537b19ffc88d8144a6613b7812f3ab5e582d3ae0706db4919c06465c1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6a3b9c865bf4d2dbabc06a94edba659a61fe90590cb2ced9f0284d6b204bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73942ac042b1d677693f81c2a365d5c6922c4843073c5754aee0c6007a87814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58875c22f02f77b2e02162cc91ff1ccf330bae82634752c372e67bdfb78e06ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://58875c22f02f77b2e02162cc91ff1ccf330bae82634752c372e67bdfb78e06ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.976854 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.986442 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:39 crc kubenswrapper[4973]: I0320 13:23:39.996848 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:39Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.009766 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bc
fba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.020756 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.030848 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
3:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.041740 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47617dca1598ac1606a2c6a9c1e00cd3d4acb516976b6b2d685ae48fa382baf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"2026-03-20T13:22:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a426a4cc-b0a1-4217-8aaf-9af3a933f09f\\\\n2026-03-20T13:22:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a426a4cc-b0a1-4217-8aaf-9af3a933f09f to /host/opt/cni/bin/\\\\n2026-03-20T13:22:46Z [verbose] multus-daemon started\\\\n2026-03-20T13:22:46Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:23:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: E0320 13:23:40.048057 4973 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.064701 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostI
P\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fb
fa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.075312 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.089439 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc 
kubenswrapper[4973]: I0320 13:23:40.113273 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.148490 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ad1dd43-c2a7-4e78-9205-34f9a5f38dc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f147336d21c520b769cc2412032e87f34909104d9249047521ad2e72f489564b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15cc75da411a896570582ba77693c099b590adc36f8a41d5f43e0b94c5d6e97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed276b5c92b41fc0f7761442a51338cd71a913e99cc605598fc379cbc1451651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4c87775e72595ceaeaf9ec3c2f6136b579c1f0b2f7b1542525e14da6b424ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dba5b3c826c4891871a963069a09e7b3dceac4295aa4173204d50da0a23b2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.162595 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb39ce9-8aab-40c4-83ca-67d6aeaa3261\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e67ab19704e074f39eba08b8d61e69ddca313f486f07b8f8aa307aa35f3931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0da83b4a45a3b65af0d17acc529fb5991668c1603f5f6b88b1476f579ad029e0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:21:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:21:22.128927 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 13:21:22.134975 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:21:22.169407 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:21:22.174062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:21:50.608438 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:21:50.608529 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:50Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787de60574f16b2026ae38a99d60b46b6329107ce49531d094fb400a8b010e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49856b1129ff41ad2951ef95db8bc9cc9faff376293559651f05b9905fb3c11c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49fb5f33c67890448955d4851c677b66d07065ef5711938d3c96aae65082166d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.177245 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.190222 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.204264 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.216477 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.234159 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e07ba1d363ef9be20296cc96d9e4c8166d43a08ded9d617e7f41dab15dbab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:23:12Z\\\",\\\"message\\\":\\\"g{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.140\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 13:23:12.813392 7144 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:23:12.813373 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.630464 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovnkube-controller/3.log" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.631119 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovnkube-controller/2.log" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.634275 4973 generic.go:334] "Generic (PLEG): container finished" podID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerID="39e07ba1d363ef9be20296cc96d9e4c8166d43a08ded9d617e7f41dab15dbab3" exitCode=1 Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.634329 4973 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerDied","Data":"39e07ba1d363ef9be20296cc96d9e4c8166d43a08ded9d617e7f41dab15dbab3"} Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.634412 4973 scope.go:117] "RemoveContainer" containerID="523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.634990 4973 scope.go:117] "RemoveContainer" containerID="39e07ba1d363ef9be20296cc96d9e4c8166d43a08ded9d617e7f41dab15dbab3" Mar 20 13:23:40 crc kubenswrapper[4973]: E0320 13:23:40.635161 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.654439 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.668571 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.682537 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.698460 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f42
0bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.710820 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc 
kubenswrapper[4973]: I0320 13:23:40.725912 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.
io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.746249 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ad1dd43-c2a7-4e78-9205-34f9a5f38dc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f147336d21c520b769cc2412032e87f34909104d9249047521ad2e72f489564b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15cc75da411a896570582ba77693c099b590adc36f8a41d5f43e0b94c5d6e97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed276b5c92b41fc0f7761442a51338cd71a913e99cc605598fc379cbc1451651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4c87775e72595ceaeaf9ec3c2f6136b579c1f0b2f7b1542525e14da6b424ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dba5b3c826c4891871a963069a09e7b3dceac4295aa4173204d50da0a23b2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.764747 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb39ce9-8aab-40c4-83ca-67d6aeaa3261\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e67ab19704e074f39eba08b8d61e69ddca313f486f07b8f8aa307aa35f3931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0da83b4a45a3b65af0d17acc529fb5991668c1603f5f6b88b1476f579ad029e0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:21:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 13:21:22.128927 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 13:21:22.134975 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:21:22.169407 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:21:22.174062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:21:50.608438 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:21:50.608529 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:50Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787de60574f16b2026ae38a99d60b46b6329107ce49531d094fb400a8b010e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49856b1129ff41ad2951ef95db8bc9cc9faff376293559651f05b9905fb3c11c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49fb5f33c67890448955d4851c677b66d07065ef5711938d3c96aae65082166d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.778394 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.796313 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e07ba1d363ef9be20296cc96d9e4c8166d43a08ded9d617e7f41dab15dbab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://523ff44d0d4e2a76f995db507faacddad44fc13c73598200d0aa665c37f4cb35\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:23:12Z\\\",\\\"message\\\":\\\"g{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/redhat-marketplace\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.140\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0320 13:23:12.813392 7144 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:12Z is after 2025-08-24T17:21:41Z]\\\\nI0320 13:23:12.813373 \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e07ba1d363ef9be20296cc96d9e4c8166d43a08ded9d617e7f41dab15dbab3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:23:39Z\\\",\\\"message\\\":\\\"cer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd/etcd]} name:Service_openshift-etcd/etcd_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.253:2379: 10.217.5.253:9979:]}] Rows:[] 
Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {de17f0de-cfb1-4534-bb42-c40f5e050c73}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:23:39.794764 7509 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd/etcd]} name:Service_openshift-etcd/etcd_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.253:2379: 10.217.5.253:9979:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {de17f0de-cfb1-4534-bb42-c40f5e050c73}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 13:23:39.794034 7509 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to 
create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fe
e13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.808945 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.821475 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d64e13c0-e87e-4e1e-b47d-c01a84e81e97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2599b537b19ffc88d8144a6613b7812f3ab5e582d3ae0706db4919c06465c1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6a3b9c865bf4d2dbabc06a94edba659a61fe90590cb2ced9f0284d6b204bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73942ac042b1d677693f81c2a365d5c6922c4843073c5754aee0c6007a87814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58875c22f02f77b2e02162cc91ff1ccf330bae82634752c372e67bdfb78e06ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://58875c22f02f77b2e02162cc91ff1ccf330bae82634752c372e67bdfb78e06ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.832938 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.843814 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.856736 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47617dca1598ac1606a2c6a9c1e00cd3d4acb516976b6b2d685ae48fa382baf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"2026-03-20T13:22:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a426a4cc-b0a1-4217-8aaf-9af3a933f09f\\\\n2026-03-20T13:22:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a426a4cc-b0a1-4217-8aaf-9af3a933f09f to /host/opt/cni/bin/\\\\n2026-03-20T13:22:46Z [verbose] multus-daemon started\\\\n2026-03-20T13:22:46Z [verbose] Readiness Indicator file check\\\\n2026-03-20T13:23:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\
\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.873828 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b
93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.888834 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\",\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 
secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.899017 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.914138 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
3:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:40Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.950419 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.950432 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.950463 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:40 crc kubenswrapper[4973]: I0320 13:23:40.950493 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:40 crc kubenswrapper[4973]: E0320 13:23:40.950999 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:40 crc kubenswrapper[4973]: E0320 13:23:40.951090 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:40 crc kubenswrapper[4973]: E0320 13:23:40.951206 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:40 crc kubenswrapper[4973]: E0320 13:23:40.951304 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.641004 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovnkube-controller/3.log" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.644660 4973 scope.go:117] "RemoveContainer" containerID="39e07ba1d363ef9be20296cc96d9e4c8166d43a08ded9d617e7f41dab15dbab3" Mar 20 13:23:41 crc kubenswrapper[4973]: E0320 13:23:41.644789 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.659263 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-qqncz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6e2d3006-c203-45e9-875b-8b8210a85409\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0dfc50bede5f92f2d8d57338b99bc8a053134ca82220d6a83e9e20f8ea48dcf7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gnk94\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-qqncz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.674475 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://53e230d1e11fd46a64bfeb4ff42a21952433c067ccbe64263126b15fdea20ec7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T1
3:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.695218 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-57hnn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"35802646-2926-42b8-913a-986001818f97\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47617dca1598ac1606a2c6a9c1e00cd3d4acb516976b6b2d685ae48fa382baf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:23:31Z\\\",\\\"message\\\":\\\"2026-03-20T13:22:45+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_a426a4cc-b0a1-4217-8aaf-9af3a933f09f\\\\n2026-03-20T13:22:45+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_a426a4cc-b0a1-4217-8aaf-9af3a933f09f to /host/opt/cni/bin/\\\\n2026-03-20T13:22:46Z [verbose] multus-daemon started\\\\n2026-03-20T13:22:46Z [verbose] 
Readiness Indicator file check\\\\n2026-03-20T13:23:31Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h9hbz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-57hnn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.711585 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b85f66bb-77a7-4c4c-8d36-6a94a52c90dd\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a72
006c12565eceb6bcd910f7a84aefb14a881325ea4617bed4ccffdd31e356c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc49f4b11f2366f8c06688e35154cfa1d3d2605496d29948d1a5f52749453455\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2e44bb6fbfa8b3a535d917334a61f7c150765702e5ff9785bad55c89c6c2b5a4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0067b4acfe443c0ee671fcecea5d9cbef3ba7b533455fa3fbad69172f174b1d6\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5bb2b93a21889dd0f69f832554a584b6704811ccfb1dc2346fdd2104ff5acfb5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4f340d2a2e03a9455446c4c53997d15afa1203a546285471b3253ef1b8ec6f17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://945c42e542e98f8da986c6f38e6691a74336ea88eb4d537700aaa2ebcb33d242\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-48dqg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmj8d\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.730774 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d18fe6e1-563f-476a-8193-275b6f92839b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:22:28Z\\\"
,\\\"message\\\":\\\"file observer\\\\nW0320 13:22:28.478898 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 13:22:28.479165 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 13:22:28.481417 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-635279581/tls.crt::/tmp/serving-cert-635279581/tls.key\\\\\\\"\\\\nI0320 13:22:28.936816 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 13:22:28.940001 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 13:22:28.940026 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 13:22:28.940051 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 13:22:28.940058 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 13:22:28.943652 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0320 13:22:28.943671 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0320 13:22:28.943683 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943691 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 13:22:28.943697 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 13:22:28.943701 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 13:22:28.943706 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 13:22:28.943710 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0320 13:22:28.946207 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:28Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:23:13Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fe303d2f53978ed1f6cff9a3e12ae1ae4bc
fba0c2a682cdf887b441d4842e989\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.741920 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.753706 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"cdb39ce9-8aab-40c4-83ca-67d6aeaa3261\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0e67ab19704e074f39eba08b8d61e69ddca313f486f07b8f8aa307aa35f3931\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0da83b4a45a3b65af0d17acc529fb5991668c1603f5f6b88b1476f579ad029e0\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T13:21:50Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0320 13:21:22.128927 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 13:21:22.134975 1 observer_polling.go:159] Starting file observer\\\\nI0320 13:21:22.169407 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 13:21:22.174062 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 13:21:50.608438 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 13:21:50.608529 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:21:50Z is after 
2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://787de60574f16b2026ae38a99d60b46b6329107ce49531d094fb400a8b010e67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49856b1129ff41ad2951ef95db8bc9cc9faff376293559651f05b9905fb3c11c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49fb5f33c67890448955d4851c677b66d07065ef5711938d3c96aae65082166d\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.765222 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.765265 4973 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.765303 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.765320 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.765330 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:41Z","lastTransitionTime":"2026-03-20T13:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.767030 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://40005d394a173ec7e0b9675cc5f814b885594df92f5ef6c0783f196c9c916374\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: E0320 13:23:41.777100 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.779296 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.779991 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.780042 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.780054 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.780076 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.780090 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:41Z","lastTransitionTime":"2026-03-20T13:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.789685 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: E0320 13:23:41.793068 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.795739 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.795758 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.795765 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.795777 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.795786 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:41Z","lastTransitionTime":"2026-03-20T13:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.801598 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"70745a45-4eff-4e56-b9ab-efa4a7c83306\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ec84203b7228943aded78fdcd0f77771bc9c9cedcf1202d76c19221963eb6560\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://abcd771297c72ebf201123700b92083f01a58f420bf04337fc21af73f83a8fbc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9wxk4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-qlztx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: E0320 13:23:41.806974 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.809950 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.809984 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.809994 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.810010 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.810021 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:41Z","lastTransitionTime":"2026-03-20T13:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.813117 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-7kszd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"93c5ad90-87bf-4668-9d87-34e676b15783\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x94sv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-7kszd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc 
kubenswrapper[4973]: E0320 13:23:41.819494 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.822505 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.822557 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.822569 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.822586 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.822599 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:41Z","lastTransitionTime":"2026-03-20T13:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.823305 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"b606d037-c146-4c4b-985a-8cea73f83da5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://60887fda1803fffa234dcb1b2c1a9f2de370b57a9263a1b69e412b762fb36f25\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aa81f915792e14f645a7e1021689c3b9a563585eab9609f82492d56ab74a642a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ws7x5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-sp2rb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: E0320 13:23:41.836806 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T13:23:41Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"b12408c6-1427-4b07-880e-3523bdf11c11\\\",\\\"systemUUID\\\":\\\"bbb3e27f-a5bd-49db-8577-6a161b4912bb\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: E0320 13:23:41.836924 4973 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.844304 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ad1dd43-c2a7-4e78-9205-34f9a5f38dc9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f147336d21c520b769cc2412032e87f34909104d9249047521ad2e72f489564b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://15cc75da411a896570582ba77693c099b590adc36f8a41d5f43e0b94c5d6e97d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed276b5c92b41fc0f7761442a51338cd71a913e99cc605598fc379cbc1451651\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[
{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ab4c87775e72595ceaeaf9ec3c2f6136b579c1f0b2f7b1542525e14da6b424ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3dba5b3c826c4891871a963069a09e7b3dceac4295aa4173204d50da0a23b2e3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainer
Statuses\\\":[{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://aab3e36eece91c1b512d6aa1b98132e236a385db108e476e218da2a5c5e92ed9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b6446d3fdacd16099c121eb5e54061c31c22e2a69054c7a3a4ea9fb6a473206e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev
@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://997bf6df487cf7edbaa89a9e4e90ba84a91f8f7d23292e95c88c114b4eb206c3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:23Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.864238 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"774edfed-7d45-4b69-b9d7-a3a914cbca04\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://39e07ba1d363ef9be20296cc96d9e4c8166d43a08ded9d617e7f41dab15dbab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://39e07ba1d363ef9be20296cc96d9e4c8166d43a08ded9d617e7f41dab15dbab3\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T13:23:39Z\\\",\\\"message\\\":\\\"cer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd/etcd]} name:Service_openshift-etcd/etcd_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} 
vips:{GoMap:map[10.217.5.253:2379: 10.217.5.253:9979:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {de17f0de-cfb1-4534-bb42-c40f5e050c73}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0320 13:23:39.794764 7509 transact.go:42] Configuring OVN: [{Op:update Table:Load_Balancer Row:map[external_ids:{GoMap:map[k8s.ovn.org/kind:Service k8s.ovn.org/owner:openshift-etcd/etcd]} name:Service_openshift-etcd/etcd_TCP_cluster options:{GoMap:map[event:false hairpin_snat_ip:169.254.0.5 fd69::5 neighbor_responder:none reject:true skip_snat:false]} protocol:{GoSet:[tcp]} selection_fields:{GoSet:[]} vips:{GoMap:map[10.217.5.253:2379: 10.217.5.253:9979:]}] Rows:[] Columns:[] Mutations:[] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {de17f0de-cfb1-4534-bb42-c40f5e050c73}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nF0320 13:23:39.794034 7509 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T13:23:39Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e5fe72df8627143ff5
fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lx76c\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-jllfx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.872574 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4eac37bd-3af0-4afa-a134-c6b2f86e061a\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:22Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0ba8a144101821fa71067c8073f7d0771b7217fea755bc51e4b4f588dbec09da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b1c617537b23c2c59f0b78a02744f7b743e074317d6513b8f1682d77767af003\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.881072 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-7qcsb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2747a19a-a33a-458e-bc5d-bda5c13a2bf1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f61365381e1296edb3b4d23550f28425ec52579c8fe07a093bc1841419e11325\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7sgs6\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:22:42Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-7qcsb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.890738 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:44Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:45Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3fbf7331ec2ae5e3142ccb0b94078c837127036be0680390dae1ed033e1e8401\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\
\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea110afb9a53703e9287e75a72c0e9c122bdaab85bc7d29a0c1cb95a5e62a8a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:41 crc kubenswrapper[4973]: I0320 13:23:41.902498 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d64e13c0-e87e-4e1e-b47d-c01a84e81e97\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T13:21:20Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2599b537b19ffc88d8144a6613b7812f3ab5e582d3ae0706db4919c06465c1e1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0b6a3b9c865bf4d2dbabc06a94edba659a61fe90590cb2ced9f0284d6b204bd1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b73942ac042b1d677693f81c2a365d5c6922c4843073c5754aee0c6007a87814\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T13:21:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://58875c22f02f77b2e02162cc91ff1ccf330bae82634752c372e67bdfb78e06ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://58875c22f02f77b2e02162cc91ff1ccf330bae82634752c372e67bdfb78e06ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T13:21:21Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T13:21:21Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T13:21:20Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:41Z is after 2025-08-24T17:21:41Z" Mar 20 13:23:42 crc kubenswrapper[4973]: I0320 13:23:42.950581 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:42 crc kubenswrapper[4973]: I0320 13:23:42.950652 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:42 crc kubenswrapper[4973]: I0320 13:23:42.950648 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:42 crc kubenswrapper[4973]: I0320 13:23:42.950834 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:42 crc kubenswrapper[4973]: E0320 13:23:42.950828 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:42 crc kubenswrapper[4973]: E0320 13:23:42.950926 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:42 crc kubenswrapper[4973]: E0320 13:23:42.951027 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:42 crc kubenswrapper[4973]: E0320 13:23:42.951131 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:44 crc kubenswrapper[4973]: I0320 13:23:44.949994 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:44 crc kubenswrapper[4973]: I0320 13:23:44.950051 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:44 crc kubenswrapper[4973]: I0320 13:23:44.949999 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:44 crc kubenswrapper[4973]: E0320 13:23:44.950148 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:44 crc kubenswrapper[4973]: I0320 13:23:44.949995 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:44 crc kubenswrapper[4973]: E0320 13:23:44.950286 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:44 crc kubenswrapper[4973]: E0320 13:23:44.950524 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:23:44 crc kubenswrapper[4973]: E0320 13:23:44.950999 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:45 crc kubenswrapper[4973]: E0320 13:23:45.049112 4973 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:23:46 crc kubenswrapper[4973]: I0320 13:23:46.843472 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:23:46 crc kubenswrapper[4973]: E0320 13:23:46.843767 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:50.843726619 +0000 UTC m=+211.587396423 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:23:46 crc kubenswrapper[4973]: I0320 13:23:46.844746 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:46 crc kubenswrapper[4973]: I0320 13:23:46.844970 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:46 crc kubenswrapper[4973]: I0320 13:23:46.845156 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:46 crc kubenswrapper[4973]: E0320 13:23:46.845534 4973 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 
20 13:23:46 crc kubenswrapper[4973]: E0320 13:23:46.845756 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:50.845732002 +0000 UTC m=+211.589401776 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 13:23:46 crc kubenswrapper[4973]: E0320 13:23:46.846543 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 13:23:46 crc kubenswrapper[4973]: E0320 13:23:46.846721 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 13:23:46 crc kubenswrapper[4973]: E0320 13:23:46.846871 4973 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:46 crc kubenswrapper[4973]: E0320 13:23:46.847056 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:50.847034267 +0000 UTC m=+211.590704041 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 13:23:46 crc kubenswrapper[4973]: E0320 13:23:46.847561 4973 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 13:23:46 crc kubenswrapper[4973]: E0320 13:23:46.847788 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:50.847767887 +0000 UTC m=+211.591437671 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 13:23:46 crc kubenswrapper[4973]: I0320 13:23:46.947028 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs\") pod \"network-metrics-daemon-7kszd\" (UID: \"93c5ad90-87bf-4668-9d87-34e676b15783\") " pod="openshift-multus/network-metrics-daemon-7kszd"
Mar 20 13:23:46 crc kubenswrapper[4973]: I0320 13:23:46.947123 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:23:46 crc kubenswrapper[4973]: E0320 13:23:46.947326 4973 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 13:23:46 crc kubenswrapper[4973]: E0320 13:23:46.947458 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 13:23:46 crc kubenswrapper[4973]: E0320 13:23:46.947480 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs podName:93c5ad90-87bf-4668-9d87-34e676b15783 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:50.947441408 +0000 UTC m=+211.691111342 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs") pod "network-metrics-daemon-7kszd" (UID: "93c5ad90-87bf-4668-9d87-34e676b15783") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 20 13:23:46 crc kubenswrapper[4973]: E0320 13:23:46.947490 4973 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 13:23:46 crc kubenswrapper[4973]: E0320 13:23:46.947517 4973 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 13:23:46 crc kubenswrapper[4973]: E0320 13:23:46.947605 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 13:24:50.947577322 +0000 UTC m=+211.691247106 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 13:23:46 crc kubenswrapper[4973]: I0320 13:23:46.949595 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:23:46 crc kubenswrapper[4973]: I0320 13:23:46.949655 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:23:46 crc kubenswrapper[4973]: I0320 13:23:46.949684 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:23:46 crc kubenswrapper[4973]: I0320 13:23:46.949614 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd"
Mar 20 13:23:46 crc kubenswrapper[4973]: E0320 13:23:46.949827 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 13:23:46 crc kubenswrapper[4973]: E0320 13:23:46.950006 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 13:23:46 crc kubenswrapper[4973]: E0320 13:23:46.950227 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783"
Mar 20 13:23:46 crc kubenswrapper[4973]: E0320 13:23:46.950465 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 13:23:48 crc kubenswrapper[4973]: I0320 13:23:48.949542 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:23:48 crc kubenswrapper[4973]: I0320 13:23:48.949573 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd"
Mar 20 13:23:48 crc kubenswrapper[4973]: I0320 13:23:48.949573 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:23:48 crc kubenswrapper[4973]: I0320 13:23:48.949627 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:23:48 crc kubenswrapper[4973]: E0320 13:23:48.949772 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 13:23:48 crc kubenswrapper[4973]: E0320 13:23:48.949891 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783"
Mar 20 13:23:48 crc kubenswrapper[4973]: E0320 13:23:48.950014 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 13:23:48 crc kubenswrapper[4973]: E0320 13:23:48.950310 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 13:23:49 crc kubenswrapper[4973]: I0320 13:23:49.972550 4973 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T13:22:42Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T13:23:49Z is after 2025-08-24T17:21:41Z"
Mar 20 13:23:50 crc kubenswrapper[4973]: I0320 13:23:50.040747 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podStartSLOduration=106.040712731 podStartE2EDuration="1m46.040712731s" podCreationTimestamp="2026-03-20 13:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:50.040122886 +0000 UTC m=+150.783792640" watchObservedRunningTime="2026-03-20 13:23:50.040712731 +0000 UTC m=+150.784382515"
Mar 20 13:23:50 crc kubenswrapper[4973]: E0320 13:23:50.049532 4973 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 13:23:50 crc kubenswrapper[4973]: I0320 13:23:50.103143 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-sp2rb" podStartSLOduration=105.103121698 podStartE2EDuration="1m45.103121698s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:50.071974366 +0000 UTC m=+150.815644120" watchObservedRunningTime="2026-03-20 13:23:50.103121698 +0000 UTC m=+150.846791452"
Mar 20 13:23:50 crc kubenswrapper[4973]: I0320 13:23:50.103515 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=47.103508078 podStartE2EDuration="47.103508078s" podCreationTimestamp="2026-03-20 13:23:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:50.102544092 +0000 UTC m=+150.846213886" watchObservedRunningTime="2026-03-20 13:23:50.103508078 +0000 UTC m=+150.847177832"
Mar 20 13:23:50 crc kubenswrapper[4973]: I0320 13:23:50.122383 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=16.122367212 podStartE2EDuration="16.122367212s" podCreationTimestamp="2026-03-20 13:23:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:50.122064833 +0000 UTC m=+150.865734587" watchObservedRunningTime="2026-03-20 13:23:50.122367212 +0000 UTC m=+150.866036956"
Mar 20 13:23:50 crc kubenswrapper[4973]: I0320 13:23:50.205745 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=14.205721687 podStartE2EDuration="14.205721687s" podCreationTimestamp="2026-03-20 13:23:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:50.205403228 +0000 UTC m=+150.949073012" watchObservedRunningTime="2026-03-20 13:23:50.205721687 +0000 UTC m=+150.949391441"
Mar 20 13:23:50 crc kubenswrapper[4973]: I0320 13:23:50.216005 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=65.21598847 podStartE2EDuration="1m5.21598847s" podCreationTimestamp="2026-03-20 13:22:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:50.215007715 +0000 UTC m=+150.958677469" watchObservedRunningTime="2026-03-20 13:23:50.21598847 +0000 UTC m=+150.959658214"
Mar 20 13:23:50 crc kubenswrapper[4973]: I0320 13:23:50.226398 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-7qcsb" podStartSLOduration=106.226379408 podStartE2EDuration="1m46.226379408s" podCreationTimestamp="2026-03-20 13:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:50.224728874 +0000 UTC m=+150.968398618" watchObservedRunningTime="2026-03-20 13:23:50.226379408 +0000 UTC m=+150.970049162"
Mar 20 13:23:50 crc kubenswrapper[4973]: I0320 13:23:50.238091 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-57hnn" podStartSLOduration=105.23807322 podStartE2EDuration="1m45.23807322s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:50.237851214 +0000 UTC m=+150.981520978" watchObservedRunningTime="2026-03-20 13:23:50.23807322 +0000 UTC m=+150.981742964"
Mar 20 13:23:50 crc kubenswrapper[4973]: I0320 13:23:50.255363 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tmj8d" podStartSLOduration=105.25532034 podStartE2EDuration="1m45.25532034s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:50.255141956 +0000 UTC m=+150.998811700" watchObservedRunningTime="2026-03-20 13:23:50.25532034 +0000 UTC m=+150.998990084"
Mar 20 13:23:50 crc kubenswrapper[4973]: I0320 13:23:50.273128 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=64.273111826 podStartE2EDuration="1m4.273111826s" podCreationTimestamp="2026-03-20 13:22:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:50.272718205 +0000 UTC m=+151.016387959" watchObservedRunningTime="2026-03-20 13:23:50.273111826 +0000 UTC m=+151.016781570"
Mar 20 13:23:50 crc kubenswrapper[4973]: I0320 13:23:50.282477 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-qqncz" podStartSLOduration=106.282462065 podStartE2EDuration="1m46.282462065s" podCreationTimestamp="2026-03-20 13:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:50.282026254 +0000 UTC m=+151.025695998" watchObservedRunningTime="2026-03-20 13:23:50.282462065 +0000 UTC m=+151.026131809"
Mar 20 13:23:50 crc kubenswrapper[4973]: I0320 13:23:50.949963 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd"
Mar 20 13:23:50 crc kubenswrapper[4973]: I0320 13:23:50.950040 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:23:50 crc kubenswrapper[4973]: I0320 13:23:50.949969 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:23:50 crc kubenswrapper[4973]: I0320 13:23:50.949969 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:23:50 crc kubenswrapper[4973]: E0320 13:23:50.950203 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783"
Mar 20 13:23:50 crc kubenswrapper[4973]: E0320 13:23:50.950289 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 13:23:50 crc kubenswrapper[4973]: E0320 13:23:50.950379 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 13:23:50 crc kubenswrapper[4973]: E0320 13:23:50.950551 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.025456 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.025518 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.025539 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.025569 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.025591 4973 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T13:23:52Z","lastTransitionTime":"2026-03-20T13:23:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.073892 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5"]
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.074226 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.076600 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.082613 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.083707 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.083830 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.092108 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4535ed0f-5988-4a4b-ae98-7484da5ad915-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-965f5\" (UID: \"4535ed0f-5988-4a4b-ae98-7484da5ad915\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.092189 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4535ed0f-5988-4a4b-ae98-7484da5ad915-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-965f5\" (UID: \"4535ed0f-5988-4a4b-ae98-7484da5ad915\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.092230 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4535ed0f-5988-4a4b-ae98-7484da5ad915-service-ca\") pod \"cluster-version-operator-5c965bbfc6-965f5\" (UID: \"4535ed0f-5988-4a4b-ae98-7484da5ad915\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.092254 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4535ed0f-5988-4a4b-ae98-7484da5ad915-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-965f5\" (UID: \"4535ed0f-5988-4a4b-ae98-7484da5ad915\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.092292 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4535ed0f-5988-4a4b-ae98-7484da5ad915-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-965f5\" (UID: \"4535ed0f-5988-4a4b-ae98-7484da5ad915\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.193701 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4535ed0f-5988-4a4b-ae98-7484da5ad915-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-965f5\" (UID: \"4535ed0f-5988-4a4b-ae98-7484da5ad915\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.193754 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4535ed0f-5988-4a4b-ae98-7484da5ad915-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-965f5\" (UID: \"4535ed0f-5988-4a4b-ae98-7484da5ad915\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.193780 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4535ed0f-5988-4a4b-ae98-7484da5ad915-service-ca\") pod \"cluster-version-operator-5c965bbfc6-965f5\" (UID: \"4535ed0f-5988-4a4b-ae98-7484da5ad915\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.193812 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4535ed0f-5988-4a4b-ae98-7484da5ad915-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-965f5\" (UID: \"4535ed0f-5988-4a4b-ae98-7484da5ad915\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.193838 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4535ed0f-5988-4a4b-ae98-7484da5ad915-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-965f5\" (UID: \"4535ed0f-5988-4a4b-ae98-7484da5ad915\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.193854 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/4535ed0f-5988-4a4b-ae98-7484da5ad915-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-965f5\" (UID: \"4535ed0f-5988-4a4b-ae98-7484da5ad915\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.194072 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/4535ed0f-5988-4a4b-ae98-7484da5ad915-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-965f5\" (UID: \"4535ed0f-5988-4a4b-ae98-7484da5ad915\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.194895 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4535ed0f-5988-4a4b-ae98-7484da5ad915-service-ca\") pod \"cluster-version-operator-5c965bbfc6-965f5\" (UID: \"4535ed0f-5988-4a4b-ae98-7484da5ad915\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.201219 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4535ed0f-5988-4a4b-ae98-7484da5ad915-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-965f5\" (UID: \"4535ed0f-5988-4a4b-ae98-7484da5ad915\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.216237 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4535ed0f-5988-4a4b-ae98-7484da5ad915-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-965f5\" (UID: \"4535ed0f-5988-4a4b-ae98-7484da5ad915\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.390493 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.683230 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5" event={"ID":"4535ed0f-5988-4a4b-ae98-7484da5ad915","Type":"ContainerStarted","Data":"b1f23c6c05309cb834ed744a9b729c5d2bf1b1c422579836fb5cbd77c1363239"}
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.683283 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5" event={"ID":"4535ed0f-5988-4a4b-ae98-7484da5ad915","Type":"ContainerStarted","Data":"a9e4e759067152bcad4a5c6cf9f31ab31356d2fb2763d066d52fca4366e24246"}
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.698093 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-965f5" podStartSLOduration=108.698067747 podStartE2EDuration="1m48.698067747s" podCreationTimestamp="2026-03-20 13:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:23:52.698047386 +0000 UTC m=+153.441717130" watchObservedRunningTime="2026-03-20 13:23:52.698067747 +0000 UTC m=+153.441737521"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.949900 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.949994 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:23:52 crc kubenswrapper[4973]: E0320 13:23:52.950082 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.950328 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.950400 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:23:52 crc kubenswrapper[4973]: E0320 13:23:52.950915 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783"
Mar 20 13:23:52 crc kubenswrapper[4973]: E0320 13:23:52.951129 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 13:23:52 crc kubenswrapper[4973]: E0320 13:23:52.951202 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.951710 4973 scope.go:117] "RemoveContainer" containerID="39e07ba1d363ef9be20296cc96d9e4c8166d43a08ded9d617e7f41dab15dbab3"
Mar 20 13:23:52 crc kubenswrapper[4973]: E0320 13:23:52.952034 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04"
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.986334 4973 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 20 13:23:52 crc kubenswrapper[4973]: I0320 13:23:52.998316 4973 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 20 13:23:54 crc kubenswrapper[4973]: I0320 13:23:54.950038 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:23:54 crc kubenswrapper[4973]: E0320 13:23:54.950195 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 13:23:54 crc kubenswrapper[4973]: I0320 13:23:54.950272 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd"
Mar 20 13:23:54 crc kubenswrapper[4973]: I0320 13:23:54.950272 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:23:54 crc kubenswrapper[4973]: I0320 13:23:54.950269 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:23:54 crc kubenswrapper[4973]: E0320 13:23:54.950440 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783"
Mar 20 13:23:54 crc kubenswrapper[4973]: E0320 13:23:54.950535 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 13:23:54 crc kubenswrapper[4973]: E0320 13:23:54.950595 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 13:23:55 crc kubenswrapper[4973]: E0320 13:23:55.050853 4973 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 20 13:23:56 crc kubenswrapper[4973]: I0320 13:23:56.950482 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 13:23:56 crc kubenswrapper[4973]: I0320 13:23:56.950492 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd"
Mar 20 13:23:56 crc kubenswrapper[4973]: E0320 13:23:56.950906 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 13:23:56 crc kubenswrapper[4973]: I0320 13:23:56.950592 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 13:23:56 crc kubenswrapper[4973]: I0320 13:23:56.950507 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 13:23:56 crc kubenswrapper[4973]: E0320 13:23:56.951028 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783"
Mar 20 13:23:56 crc kubenswrapper[4973]: E0320 13:23:56.951130 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 13:23:56 crc kubenswrapper[4973]: E0320 13:23:56.951167 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 13:23:58 crc kubenswrapper[4973]: I0320 13:23:58.949741 4973 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:23:58 crc kubenswrapper[4973]: I0320 13:23:58.949835 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:23:58 crc kubenswrapper[4973]: I0320 13:23:58.949881 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:23:58 crc kubenswrapper[4973]: I0320 13:23:58.949963 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:23:58 crc kubenswrapper[4973]: E0320 13:23:58.949878 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:23:58 crc kubenswrapper[4973]: E0320 13:23:58.950061 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:23:58 crc kubenswrapper[4973]: E0320 13:23:58.950220 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:23:58 crc kubenswrapper[4973]: E0320 13:23:58.950301 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:00 crc kubenswrapper[4973]: E0320 13:24:00.051322 4973 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:24:00 crc kubenswrapper[4973]: I0320 13:24:00.949852 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:00 crc kubenswrapper[4973]: I0320 13:24:00.949889 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:00 crc kubenswrapper[4973]: I0320 13:24:00.949889 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:00 crc kubenswrapper[4973]: I0320 13:24:00.950015 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:00 crc kubenswrapper[4973]: E0320 13:24:00.950203 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:24:00 crc kubenswrapper[4973]: E0320 13:24:00.950282 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:00 crc kubenswrapper[4973]: E0320 13:24:00.950378 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:00 crc kubenswrapper[4973]: E0320 13:24:00.950445 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:02 crc kubenswrapper[4973]: I0320 13:24:02.949528 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:02 crc kubenswrapper[4973]: I0320 13:24:02.949542 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:02 crc kubenswrapper[4973]: I0320 13:24:02.949572 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:02 crc kubenswrapper[4973]: E0320 13:24:02.949881 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:02 crc kubenswrapper[4973]: E0320 13:24:02.949741 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:24:02 crc kubenswrapper[4973]: I0320 13:24:02.949605 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:02 crc kubenswrapper[4973]: E0320 13:24:02.949956 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:02 crc kubenswrapper[4973]: E0320 13:24:02.950017 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:04 crc kubenswrapper[4973]: I0320 13:24:04.950500 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:04 crc kubenswrapper[4973]: E0320 13:24:04.950630 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:04 crc kubenswrapper[4973]: I0320 13:24:04.950683 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:04 crc kubenswrapper[4973]: I0320 13:24:04.950712 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:04 crc kubenswrapper[4973]: I0320 13:24:04.950689 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:04 crc kubenswrapper[4973]: E0320 13:24:04.950979 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:04 crc kubenswrapper[4973]: E0320 13:24:04.951057 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:04 crc kubenswrapper[4973]: E0320 13:24:04.951208 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:24:05 crc kubenswrapper[4973]: E0320 13:24:05.052435 4973 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:24:05 crc kubenswrapper[4973]: I0320 13:24:05.951046 4973 scope.go:117] "RemoveContainer" containerID="39e07ba1d363ef9be20296cc96d9e4c8166d43a08ded9d617e7f41dab15dbab3" Mar 20 13:24:05 crc kubenswrapper[4973]: E0320 13:24:05.951449 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" Mar 20 13:24:06 crc kubenswrapper[4973]: I0320 13:24:06.950259 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:06 crc kubenswrapper[4973]: I0320 13:24:06.950380 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:06 crc kubenswrapper[4973]: I0320 13:24:06.950477 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:06 crc kubenswrapper[4973]: I0320 13:24:06.950283 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:06 crc kubenswrapper[4973]: E0320 13:24:06.950486 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:06 crc kubenswrapper[4973]: E0320 13:24:06.950648 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:06 crc kubenswrapper[4973]: E0320 13:24:06.950763 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:06 crc kubenswrapper[4973]: E0320 13:24:06.950857 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:24:08 crc kubenswrapper[4973]: I0320 13:24:08.950264 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:08 crc kubenswrapper[4973]: I0320 13:24:08.950375 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:08 crc kubenswrapper[4973]: I0320 13:24:08.950375 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:08 crc kubenswrapper[4973]: I0320 13:24:08.950397 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:08 crc kubenswrapper[4973]: E0320 13:24:08.950678 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:08 crc kubenswrapper[4973]: E0320 13:24:08.950813 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:08 crc kubenswrapper[4973]: E0320 13:24:08.950874 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:24:08 crc kubenswrapper[4973]: E0320 13:24:08.950924 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:10 crc kubenswrapper[4973]: E0320 13:24:10.054051 4973 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:24:10 crc kubenswrapper[4973]: I0320 13:24:10.950067 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:10 crc kubenswrapper[4973]: I0320 13:24:10.950140 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:10 crc kubenswrapper[4973]: E0320 13:24:10.950252 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:24:10 crc kubenswrapper[4973]: I0320 13:24:10.950136 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:10 crc kubenswrapper[4973]: I0320 13:24:10.950557 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:10 crc kubenswrapper[4973]: E0320 13:24:10.950706 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:10 crc kubenswrapper[4973]: E0320 13:24:10.950952 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:10 crc kubenswrapper[4973]: E0320 13:24:10.951030 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:12 crc kubenswrapper[4973]: I0320 13:24:12.949907 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:12 crc kubenswrapper[4973]: I0320 13:24:12.949989 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:12 crc kubenswrapper[4973]: I0320 13:24:12.950055 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:12 crc kubenswrapper[4973]: E0320 13:24:12.950127 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:12 crc kubenswrapper[4973]: I0320 13:24:12.949936 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:12 crc kubenswrapper[4973]: E0320 13:24:12.950522 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:12 crc kubenswrapper[4973]: E0320 13:24:12.950635 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:12 crc kubenswrapper[4973]: E0320 13:24:12.950822 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:24:14 crc kubenswrapper[4973]: I0320 13:24:14.950049 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:14 crc kubenswrapper[4973]: I0320 13:24:14.950099 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:14 crc kubenswrapper[4973]: I0320 13:24:14.950185 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:14 crc kubenswrapper[4973]: I0320 13:24:14.950326 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:14 crc kubenswrapper[4973]: E0320 13:24:14.950319 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:24:14 crc kubenswrapper[4973]: E0320 13:24:14.950485 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:14 crc kubenswrapper[4973]: E0320 13:24:14.950540 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:14 crc kubenswrapper[4973]: E0320 13:24:14.950625 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:15 crc kubenswrapper[4973]: E0320 13:24:15.056432 4973 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:24:16 crc kubenswrapper[4973]: I0320 13:24:16.950060 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:16 crc kubenswrapper[4973]: I0320 13:24:16.950152 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:16 crc kubenswrapper[4973]: I0320 13:24:16.950187 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:16 crc kubenswrapper[4973]: I0320 13:24:16.950227 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:16 crc kubenswrapper[4973]: E0320 13:24:16.950255 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:16 crc kubenswrapper[4973]: E0320 13:24:16.950325 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:16 crc kubenswrapper[4973]: E0320 13:24:16.950467 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:24:16 crc kubenswrapper[4973]: E0320 13:24:16.950598 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:17 crc kubenswrapper[4973]: I0320 13:24:17.769837 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-57hnn_35802646-2926-42b8-913a-986001818f97/kube-multus/1.log" Mar 20 13:24:17 crc kubenswrapper[4973]: I0320 13:24:17.770761 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-57hnn_35802646-2926-42b8-913a-986001818f97/kube-multus/0.log" Mar 20 13:24:17 crc kubenswrapper[4973]: I0320 13:24:17.770849 4973 generic.go:334] "Generic (PLEG): container finished" podID="35802646-2926-42b8-913a-986001818f97" containerID="47617dca1598ac1606a2c6a9c1e00cd3d4acb516976b6b2d685ae48fa382baf1" exitCode=1 Mar 20 13:24:17 crc kubenswrapper[4973]: I0320 13:24:17.770899 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-57hnn" event={"ID":"35802646-2926-42b8-913a-986001818f97","Type":"ContainerDied","Data":"47617dca1598ac1606a2c6a9c1e00cd3d4acb516976b6b2d685ae48fa382baf1"} Mar 20 13:24:17 crc kubenswrapper[4973]: I0320 13:24:17.770954 4973 scope.go:117] "RemoveContainer" containerID="410021eb68c40f5c11d11c0ac2e8981c68a0a6ab6a68b093dda05b215b75c225" Mar 20 13:24:17 crc kubenswrapper[4973]: I0320 13:24:17.771596 4973 scope.go:117] "RemoveContainer" containerID="47617dca1598ac1606a2c6a9c1e00cd3d4acb516976b6b2d685ae48fa382baf1" Mar 20 13:24:17 crc kubenswrapper[4973]: E0320 13:24:17.771877 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-57hnn_openshift-multus(35802646-2926-42b8-913a-986001818f97)\"" pod="openshift-multus/multus-57hnn" podUID="35802646-2926-42b8-913a-986001818f97" Mar 20 13:24:18 crc kubenswrapper[4973]: I0320 13:24:18.779025 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-57hnn_35802646-2926-42b8-913a-986001818f97/kube-multus/1.log" Mar 20 13:24:18 crc kubenswrapper[4973]: I0320 13:24:18.949780 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:18 crc kubenswrapper[4973]: I0320 13:24:18.949811 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:18 crc kubenswrapper[4973]: E0320 13:24:18.949944 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:18 crc kubenswrapper[4973]: I0320 13:24:18.950002 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:18 crc kubenswrapper[4973]: I0320 13:24:18.950045 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:18 crc kubenswrapper[4973]: E0320 13:24:18.950051 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:18 crc kubenswrapper[4973]: E0320 13:24:18.950484 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:24:18 crc kubenswrapper[4973]: I0320 13:24:18.950683 4973 scope.go:117] "RemoveContainer" containerID="39e07ba1d363ef9be20296cc96d9e4c8166d43a08ded9d617e7f41dab15dbab3" Mar 20 13:24:18 crc kubenswrapper[4973]: E0320 13:24:18.950767 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:18 crc kubenswrapper[4973]: E0320 13:24:18.950923 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-jllfx_openshift-ovn-kubernetes(774edfed-7d45-4b69-b9d7-a3a914cbca04)\"" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" Mar 20 13:24:20 crc kubenswrapper[4973]: E0320 13:24:20.057055 4973 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:24:20 crc kubenswrapper[4973]: I0320 13:24:20.950331 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:20 crc kubenswrapper[4973]: I0320 13:24:20.950375 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:20 crc kubenswrapper[4973]: I0320 13:24:20.950458 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:20 crc kubenswrapper[4973]: E0320 13:24:20.950468 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:20 crc kubenswrapper[4973]: E0320 13:24:20.950527 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:20 crc kubenswrapper[4973]: I0320 13:24:20.950605 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:20 crc kubenswrapper[4973]: E0320 13:24:20.950732 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:20 crc kubenswrapper[4973]: E0320 13:24:20.950810 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:24:22 crc kubenswrapper[4973]: I0320 13:24:22.949897 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:22 crc kubenswrapper[4973]: I0320 13:24:22.949921 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:22 crc kubenswrapper[4973]: E0320 13:24:22.950073 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:22 crc kubenswrapper[4973]: E0320 13:24:22.950334 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:22 crc kubenswrapper[4973]: I0320 13:24:22.950470 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:22 crc kubenswrapper[4973]: E0320 13:24:22.950568 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:22 crc kubenswrapper[4973]: I0320 13:24:22.950625 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:22 crc kubenswrapper[4973]: E0320 13:24:22.950713 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:24:24 crc kubenswrapper[4973]: I0320 13:24:24.949570 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:24 crc kubenswrapper[4973]: I0320 13:24:24.949676 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:24 crc kubenswrapper[4973]: E0320 13:24:24.949724 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:24 crc kubenswrapper[4973]: I0320 13:24:24.949570 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:24 crc kubenswrapper[4973]: E0320 13:24:24.949904 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:24 crc kubenswrapper[4973]: E0320 13:24:24.949967 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:24:24 crc kubenswrapper[4973]: I0320 13:24:24.949599 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:24 crc kubenswrapper[4973]: E0320 13:24:24.950046 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:25 crc kubenswrapper[4973]: E0320 13:24:25.058992 4973 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:24:26 crc kubenswrapper[4973]: I0320 13:24:26.950152 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:26 crc kubenswrapper[4973]: I0320 13:24:26.950227 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:26 crc kubenswrapper[4973]: I0320 13:24:26.950242 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:26 crc kubenswrapper[4973]: I0320 13:24:26.950282 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:26 crc kubenswrapper[4973]: E0320 13:24:26.950538 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:26 crc kubenswrapper[4973]: E0320 13:24:26.950726 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:26 crc kubenswrapper[4973]: E0320 13:24:26.950812 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:24:26 crc kubenswrapper[4973]: E0320 13:24:26.950994 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:28 crc kubenswrapper[4973]: I0320 13:24:28.949559 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:28 crc kubenswrapper[4973]: I0320 13:24:28.949652 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:28 crc kubenswrapper[4973]: I0320 13:24:28.949673 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:28 crc kubenswrapper[4973]: E0320 13:24:28.949803 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:28 crc kubenswrapper[4973]: I0320 13:24:28.949888 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:28 crc kubenswrapper[4973]: E0320 13:24:28.950310 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:28 crc kubenswrapper[4973]: E0320 13:24:28.950582 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:24:28 crc kubenswrapper[4973]: E0320 13:24:28.950306 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:29 crc kubenswrapper[4973]: I0320 13:24:29.952194 4973 scope.go:117] "RemoveContainer" containerID="39e07ba1d363ef9be20296cc96d9e4c8166d43a08ded9d617e7f41dab15dbab3" Mar 20 13:24:30 crc kubenswrapper[4973]: E0320 13:24:30.059600 4973 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:24:30 crc kubenswrapper[4973]: I0320 13:24:30.777793 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7kszd"] Mar 20 13:24:30 crc kubenswrapper[4973]: I0320 13:24:30.777918 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:30 crc kubenswrapper[4973]: E0320 13:24:30.778017 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:24:30 crc kubenswrapper[4973]: I0320 13:24:30.824842 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovnkube-controller/3.log" Mar 20 13:24:30 crc kubenswrapper[4973]: I0320 13:24:30.826758 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerStarted","Data":"30ea7d81ac52428fd43b39fb40295daa70c1f8c01fc9159a5997250355e48b36"} Mar 20 13:24:30 crc kubenswrapper[4973]: I0320 13:24:30.827981 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:24:30 crc kubenswrapper[4973]: I0320 13:24:30.853164 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podStartSLOduration=145.853147077 podStartE2EDuration="2m25.853147077s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:30.852260553 +0000 UTC m=+191.595930317" watchObservedRunningTime="2026-03-20 13:24:30.853147077 +0000 UTC m=+191.596816821" Mar 20 13:24:30 crc kubenswrapper[4973]: I0320 13:24:30.950450 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:30 crc kubenswrapper[4973]: I0320 13:24:30.950926 4973 scope.go:117] "RemoveContainer" containerID="47617dca1598ac1606a2c6a9c1e00cd3d4acb516976b6b2d685ae48fa382baf1" Mar 20 13:24:30 crc kubenswrapper[4973]: E0320 13:24:30.950998 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:30 crc kubenswrapper[4973]: I0320 13:24:30.950459 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:30 crc kubenswrapper[4973]: I0320 13:24:30.950701 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:30 crc kubenswrapper[4973]: E0320 13:24:30.951256 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:30 crc kubenswrapper[4973]: E0320 13:24:30.951558 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:31 crc kubenswrapper[4973]: I0320 13:24:31.832272 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-57hnn_35802646-2926-42b8-913a-986001818f97/kube-multus/1.log" Mar 20 13:24:31 crc kubenswrapper[4973]: I0320 13:24:31.832421 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-57hnn" event={"ID":"35802646-2926-42b8-913a-986001818f97","Type":"ContainerStarted","Data":"f0c63468c8d0dbcc605d699d587c5443a1c5e7b884fa8bb415694f7c6679b7c6"} Mar 20 13:24:32 crc kubenswrapper[4973]: I0320 13:24:32.949862 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:32 crc kubenswrapper[4973]: I0320 13:24:32.949902 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:32 crc kubenswrapper[4973]: I0320 13:24:32.949916 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:32 crc kubenswrapper[4973]: I0320 13:24:32.949963 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:32 crc kubenswrapper[4973]: E0320 13:24:32.950046 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:32 crc kubenswrapper[4973]: E0320 13:24:32.951144 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:32 crc kubenswrapper[4973]: E0320 13:24:32.951225 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:32 crc kubenswrapper[4973]: E0320 13:24:32.951723 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:24:34 crc kubenswrapper[4973]: I0320 13:24:34.950701 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:34 crc kubenswrapper[4973]: E0320 13:24:34.951460 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 13:24:34 crc kubenswrapper[4973]: I0320 13:24:34.950760 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:34 crc kubenswrapper[4973]: E0320 13:24:34.951605 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 13:24:34 crc kubenswrapper[4973]: I0320 13:24:34.950761 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:34 crc kubenswrapper[4973]: E0320 13:24:34.951719 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7kszd" podUID="93c5ad90-87bf-4668-9d87-34e676b15783" Mar 20 13:24:34 crc kubenswrapper[4973]: I0320 13:24:34.950746 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:34 crc kubenswrapper[4973]: E0320 13:24:34.951807 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 13:24:36 crc kubenswrapper[4973]: I0320 13:24:36.950299 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:36 crc kubenswrapper[4973]: I0320 13:24:36.950397 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:36 crc kubenswrapper[4973]: I0320 13:24:36.950479 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:36 crc kubenswrapper[4973]: I0320 13:24:36.950693 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:36 crc kubenswrapper[4973]: I0320 13:24:36.952476 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 13:24:36 crc kubenswrapper[4973]: I0320 13:24:36.953890 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 13:24:36 crc kubenswrapper[4973]: I0320 13:24:36.954440 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 13:24:36 crc kubenswrapper[4973]: I0320 13:24:36.954903 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 13:24:36 crc kubenswrapper[4973]: I0320 13:24:36.955155 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 13:24:36 crc kubenswrapper[4973]: I0320 13:24:36.956430 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.029509 4973 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.060215 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-k9jzg"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.060726 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.061183 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jt5vz"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.061739 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jt5vz" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.065924 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jl6jn"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.075041 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jl6jn" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.075308 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.075552 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.075563 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.075590 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.075590 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.075644 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 13:24:43 crc 
kubenswrapper[4973]: I0320 13:24:43.075675 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.075722 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.075775 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.075801 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.075874 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.075949 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.076014 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.076085 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.076282 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.076406 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.075063 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mtrbp"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.081714 4973 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-7qt22"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.082180 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.082547 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.082805 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tl6hw"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.083072 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdnm6"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.083100 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7qt22" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.083414 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6grs"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.081826 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.083568 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdnm6" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.083631 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.083992 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.084052 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.084411 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4bk2w"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.085164 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6grs" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.085658 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.086073 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-k7krj"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.086750 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.094460 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-xt4hf"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.094952 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.094987 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-xt4hf" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.095321 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.097722 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.098663 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.101987 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-c4gcj"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.102576 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.102800 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5dqnp"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.103287 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5dqnp" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.104500 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r57q7"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.104829 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.121358 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.121823 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.122129 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.122199 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.122313 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.122655 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.122685 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.126543 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.126543 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.126740 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.126712 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.126839 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.126915 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.127002 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.127139 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.127233 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.127311 4973 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.127396 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.127777 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.127967 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.128134 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.129583 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.131503 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.132772 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.132796 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.132821 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.133062 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 13:24:43 
crc kubenswrapper[4973]: I0320 13:24:43.133192 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.133209 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.133412 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.133492 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.133790 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.134166 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.134280 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.134394 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.134518 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.134623 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.134662 4973 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"serving-cert" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.134521 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.134666 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.134667 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.134881 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.133414 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.134637 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.135371 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.135157 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.135658 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.140488 4973 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.140765 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.141569 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mhgwt"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.142611 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.143278 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.143498 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.143773 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.144038 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.145123 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.133800 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.153964 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.154416 
4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.155020 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.156098 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g8xx\" (UniqueName: \"kubernetes.io/projected/0b6717fd-e636-4be6-8c04-c9b46924a3b2-kube-api-access-9g8xx\") pod \"machine-api-operator-5694c8668f-jt5vz\" (UID: \"0b6717fd-e636-4be6-8c04-c9b46924a3b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jt5vz" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.156186 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b6717fd-e636-4be6-8c04-c9b46924a3b2-config\") pod \"machine-api-operator-5694c8668f-jt5vz\" (UID: \"0b6717fd-e636-4be6-8c04-c9b46924a3b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jt5vz" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.156209 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de8d912e-7616-42ee-a688-b43d5b85dc44-console-oauth-config\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.156233 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1337d1ae-1f3f-4d65-aa7d-1e1f97e26e90-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h6grs\" (UID: \"1337d1ae-1f3f-4d65-aa7d-1e1f97e26e90\") " 
pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6grs" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.156261 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de8d912e-7616-42ee-a688-b43d5b85dc44-console-serving-cert\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.156289 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-service-ca\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.156316 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7cq4\" (UniqueName: \"kubernetes.io/projected/1337d1ae-1f3f-4d65-aa7d-1e1f97e26e90-kube-api-access-k7cq4\") pod \"cluster-samples-operator-665b6dd947-h6grs\" (UID: \"1337d1ae-1f3f-4d65-aa7d-1e1f97e26e90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6grs" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.156405 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wvh9\" (UniqueName: \"kubernetes.io/projected/de8d912e-7616-42ee-a688-b43d5b85dc44-kube-api-access-4wvh9\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.156458 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/0b6717fd-e636-4be6-8c04-c9b46924a3b2-images\") pod \"machine-api-operator-5694c8668f-jt5vz\" (UID: \"0b6717fd-e636-4be6-8c04-c9b46924a3b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jt5vz" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.156484 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b6717fd-e636-4be6-8c04-c9b46924a3b2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jt5vz\" (UID: \"0b6717fd-e636-4be6-8c04-c9b46924a3b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jt5vz" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.156505 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr8k6\" (UniqueName: \"kubernetes.io/projected/6e84a900-8f09-438e-a365-60d6b9fc835b-kube-api-access-mr8k6\") pod \"downloads-7954f5f757-xt4hf\" (UID: \"6e84a900-8f09-438e-a365-60d6b9fc835b\") " pod="openshift-console/downloads-7954f5f757-xt4hf" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.156527 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-oauth-serving-cert\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.156552 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a60cb615-f335-45fa-86dd-ddf121e62737-config\") pod \"route-controller-manager-6576b87f9c-b7jm8\" (UID: \"a60cb615-f335-45fa-86dd-ddf121e62737\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" Mar 20 13:24:43 crc 
kubenswrapper[4973]: I0320 13:24:43.156581 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-trusted-ca-bundle\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.156611 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vmbj\" (UniqueName: \"kubernetes.io/projected/a60cb615-f335-45fa-86dd-ddf121e62737-kube-api-access-6vmbj\") pod \"route-controller-manager-6576b87f9c-b7jm8\" (UID: \"a60cb615-f335-45fa-86dd-ddf121e62737\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.156640 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-console-config\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.156675 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a60cb615-f335-45fa-86dd-ddf121e62737-client-ca\") pod \"route-controller-manager-6576b87f9c-b7jm8\" (UID: \"a60cb615-f335-45fa-86dd-ddf121e62737\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.156703 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60cb615-f335-45fa-86dd-ddf121e62737-serving-cert\") pod 
\"route-controller-manager-6576b87f9c-b7jm8\" (UID: \"a60cb615-f335-45fa-86dd-ddf121e62737\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.157691 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.157790 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.157838 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.157946 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.158022 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.158137 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.158487 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.158850 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.158971 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.159084 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.159096 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.159135 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.159653 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.159189 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.159228 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.159387 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.159429 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.159857 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.159504 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.160098 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.160223 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.160316 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.160330 4973 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.160430 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.160444 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.163399 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7qwpx"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.163582 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.163929 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.164002 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7qwpx" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.164321 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljb4c"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.164644 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljb4c" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.165461 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.165679 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.165881 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgmzv"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.166640 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgmzv" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.167223 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.168769 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.170515 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.175816 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wkc4c"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.176395 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.179614 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wkc4c" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.215611 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zg96g"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.216783 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jt5vz"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.216819 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmxvj"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.217464 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.218859 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zg96g" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.219153 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmxvj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.219479 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.224122 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w8bf7"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.239254 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-26mgc"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.239772 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.240165 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566884-j8dqd"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.240589 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.240978 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-wgtgl"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.241384 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l7crv"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.241707 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.241770 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8w6wp"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.241795 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566884-j8dqd" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.241857 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.241863 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.241927 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.242015 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-26mgc" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.242070 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8bf7" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.242613 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-56g5d"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.242773 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8w6wp" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.242973 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bw9k5"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.243086 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-56g5d" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.243442 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mtrbp"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.243499 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bw9k5" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.245396 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jl6jn"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.249900 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-k9jzg"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.250723 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xt4hf"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.255315 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.255783 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.259326 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.259477 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4bk2w"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.259541 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 
13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.259561 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.260379 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdnm6"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.260678 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-oauth-serving-cert\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.260711 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a60cb615-f335-45fa-86dd-ddf121e62737-config\") pod \"route-controller-manager-6576b87f9c-b7jm8\" (UID: \"a60cb615-f335-45fa-86dd-ddf121e62737\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.260729 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-trusted-ca-bundle\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.260745 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vmbj\" (UniqueName: \"kubernetes.io/projected/a60cb615-f335-45fa-86dd-ddf121e62737-kube-api-access-6vmbj\") pod \"route-controller-manager-6576b87f9c-b7jm8\" (UID: \"a60cb615-f335-45fa-86dd-ddf121e62737\") " 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.260764 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-console-config\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.260778 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a60cb615-f335-45fa-86dd-ddf121e62737-client-ca\") pod \"route-controller-manager-6576b87f9c-b7jm8\" (UID: \"a60cb615-f335-45fa-86dd-ddf121e62737\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.260793 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60cb615-f335-45fa-86dd-ddf121e62737-serving-cert\") pod \"route-controller-manager-6576b87f9c-b7jm8\" (UID: \"a60cb615-f335-45fa-86dd-ddf121e62737\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.260815 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g8xx\" (UniqueName: \"kubernetes.io/projected/0b6717fd-e636-4be6-8c04-c9b46924a3b2-kube-api-access-9g8xx\") pod \"machine-api-operator-5694c8668f-jt5vz\" (UID: \"0b6717fd-e636-4be6-8c04-c9b46924a3b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jt5vz" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.260840 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0b6717fd-e636-4be6-8c04-c9b46924a3b2-config\") pod \"machine-api-operator-5694c8668f-jt5vz\" (UID: \"0b6717fd-e636-4be6-8c04-c9b46924a3b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jt5vz" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.260854 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de8d912e-7616-42ee-a688-b43d5b85dc44-console-oauth-config\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.260869 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1337d1ae-1f3f-4d65-aa7d-1e1f97e26e90-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h6grs\" (UID: \"1337d1ae-1f3f-4d65-aa7d-1e1f97e26e90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6grs" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.260887 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de8d912e-7616-42ee-a688-b43d5b85dc44-console-serving-cert\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.260902 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-service-ca\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.260920 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-k7cq4\" (UniqueName: \"kubernetes.io/projected/1337d1ae-1f3f-4d65-aa7d-1e1f97e26e90-kube-api-access-k7cq4\") pod \"cluster-samples-operator-665b6dd947-h6grs\" (UID: \"1337d1ae-1f3f-4d65-aa7d-1e1f97e26e90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6grs" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.260938 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wvh9\" (UniqueName: \"kubernetes.io/projected/de8d912e-7616-42ee-a688-b43d5b85dc44-kube-api-access-4wvh9\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.260972 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0b6717fd-e636-4be6-8c04-c9b46924a3b2-images\") pod \"machine-api-operator-5694c8668f-jt5vz\" (UID: \"0b6717fd-e636-4be6-8c04-c9b46924a3b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jt5vz" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.260991 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b6717fd-e636-4be6-8c04-c9b46924a3b2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jt5vz\" (UID: \"0b6717fd-e636-4be6-8c04-c9b46924a3b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jt5vz" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.261007 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr8k6\" (UniqueName: \"kubernetes.io/projected/6e84a900-8f09-438e-a365-60d6b9fc835b-kube-api-access-mr8k6\") pod \"downloads-7954f5f757-xt4hf\" (UID: \"6e84a900-8f09-438e-a365-60d6b9fc835b\") " pod="openshift-console/downloads-7954f5f757-xt4hf" Mar 20 13:24:43 crc kubenswrapper[4973]: 
I0320 13:24:43.262894 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0b6717fd-e636-4be6-8c04-c9b46924a3b2-config\") pod \"machine-api-operator-5694c8668f-jt5vz\" (UID: \"0b6717fd-e636-4be6-8c04-c9b46924a3b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jt5vz" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.263887 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-oauth-serving-cert\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.264601 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0b6717fd-e636-4be6-8c04-c9b46924a3b2-images\") pod \"machine-api-operator-5694c8668f-jt5vz\" (UID: \"0b6717fd-e636-4be6-8c04-c9b46924a3b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jt5vz" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.265074 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a60cb615-f335-45fa-86dd-ddf121e62737-config\") pod \"route-controller-manager-6576b87f9c-b7jm8\" (UID: \"a60cb615-f335-45fa-86dd-ddf121e62737\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.266561 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a60cb615-f335-45fa-86dd-ddf121e62737-client-ca\") pod \"route-controller-manager-6576b87f9c-b7jm8\" (UID: \"a60cb615-f335-45fa-86dd-ddf121e62737\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" Mar 20 13:24:43 crc 
kubenswrapper[4973]: I0320 13:24:43.266776 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.266945 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-service-ca\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.267023 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-console-config\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.268984 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-xd95h"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.269609 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.269691 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-xd95h" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.270202 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60cb615-f335-45fa-86dd-ddf121e62737-serving-cert\") pod \"route-controller-manager-6576b87f9c-b7jm8\" (UID: \"a60cb615-f335-45fa-86dd-ddf121e62737\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.271314 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.271455 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de8d912e-7616-42ee-a688-b43d5b85dc44-console-oauth-config\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.271574 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.272431 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de8d912e-7616-42ee-a688-b43d5b85dc44-console-serving-cert\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.272481 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.273383 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-console/console-f9d7485db-k7krj"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.276168 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-trusted-ca-bundle\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.276229 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-c4gcj"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.276258 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5dqnp"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.277109 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0b6717fd-e636-4be6-8c04-c9b46924a3b2-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-jt5vz\" (UID: \"0b6717fd-e636-4be6-8c04-c9b46924a3b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jt5vz" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.277142 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6grs"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.281043 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r57q7"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.281079 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljb4c"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.282194 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.283626 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.283761 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.283785 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgmzv"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.284730 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.285659 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zg96g"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.286511 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/1337d1ae-1f3f-4d65-aa7d-1e1f97e26e90-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-h6grs\" (UID: \"1337d1ae-1f3f-4d65-aa7d-1e1f97e26e90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6grs" Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.288392 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566884-j8dqd"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.290441 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.290479 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-etcd-operator/etcd-operator-b45778765-mhgwt"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.291420 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7qwpx"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.292489 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-56g5d"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.294695 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tl6hw"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.295803 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8w6wp"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.297109 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bw9k5"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.297890 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l7crv"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.298873 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-26mgc"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.299838 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.301004 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff"] Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.303786 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 13:24:43 crc 
kubenswrapper[4973]: I0320 13:24:43.303820 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmxvj"]
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.303854 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xdmgm"]
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.304613 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wkc4c"]
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.304696 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xdmgm"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.310220 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mpgxt"]
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.317677 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd"]
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.317840 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-mpgxt"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.319413 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w8bf7"]
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.321703 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xdmgm"]
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.322887 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mpgxt"]
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.322966 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.328377 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-6v2ms"]
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.329752 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-6v2ms"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.337824 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6v2ms"]
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.343399 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.362651 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.382828 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.392606 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.411488 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.423225 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.463602 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.483649 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.503020 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.522623 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.543723 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.563323 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.582439 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.602723 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.622086 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.642918 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.662824 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.683700 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.703017 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.723019 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.743646 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.762618 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.783226 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.803557 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.822837 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.842960 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.863251 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.883027 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.902446 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.923168 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.943173 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.962902 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 20 13:24:43 crc kubenswrapper[4973]: I0320 13:24:43.982813 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.002976 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.023257 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.043059 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.064434 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.082765 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.103611 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.124190 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.142904 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.163394 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.184101 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.203060 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.222090 4973 request.go:700] Waited for 1.001567353s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmco-proxy-tls&limit=500&resourceVersion=0
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.223624 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.242938 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.264086 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.282557 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.303135 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.330279 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.342849 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.362751 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.382701 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.403194 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.423189 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.443257 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.462945 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.483050 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.503253 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.522998 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.543898 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.563662 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.583570 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.602834 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.623730 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.644472 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.663830 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.684688 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.704050 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.724208 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.743700 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.764098 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.784645 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.804006 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.824305 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.844533 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.863382 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.931671 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr8k6\" (UniqueName: \"kubernetes.io/projected/6e84a900-8f09-438e-a365-60d6b9fc835b-kube-api-access-mr8k6\") pod \"downloads-7954f5f757-xt4hf\" (UID: \"6e84a900-8f09-438e-a365-60d6b9fc835b\") " pod="openshift-console/downloads-7954f5f757-xt4hf"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.958968 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7cq4\" (UniqueName: \"kubernetes.io/projected/1337d1ae-1f3f-4d65-aa7d-1e1f97e26e90-kube-api-access-k7cq4\") pod \"cluster-samples-operator-665b6dd947-h6grs\" (UID: \"1337d1ae-1f3f-4d65-aa7d-1e1f97e26e90\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6grs"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.973071 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wvh9\" (UniqueName: \"kubernetes.io/projected/de8d912e-7616-42ee-a688-b43d5b85dc44-kube-api-access-4wvh9\") pod \"console-f9d7485db-k7krj\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " pod="openshift-console/console-f9d7485db-k7krj"
Mar 20 13:24:44 crc kubenswrapper[4973]: I0320 13:24:44.991074 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vmbj\" (UniqueName: \"kubernetes.io/projected/a60cb615-f335-45fa-86dd-ddf121e62737-kube-api-access-6vmbj\") pod \"route-controller-manager-6576b87f9c-b7jm8\" (UID: \"a60cb615-f335-45fa-86dd-ddf121e62737\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.007182 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.012224 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g8xx\" (UniqueName: \"kubernetes.io/projected/0b6717fd-e636-4be6-8c04-c9b46924a3b2-kube-api-access-9g8xx\") pod \"machine-api-operator-5694c8668f-jt5vz\" (UID: \"0b6717fd-e636-4be6-8c04-c9b46924a3b2\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-jt5vz"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.024727 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.043675 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.050266 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6grs"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.062607 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-k7krj"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.064171 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.086714 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.103708 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.108128 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-xt4hf"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.118866 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.123913 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.144516 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.164145 4973 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.197489 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.200476 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-jt5vz"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.205431 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.223547 4973 request.go:700] Waited for 1.892539624s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-dns/secrets?fieldSelector=metadata.name%3Ddns-dockercfg-jwfmh&limit=500&resourceVersion=0
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.225861 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.243546 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295369 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/391ef260-a6ea-4cab-bca3-280435898381-etcd-client\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295418 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dce229d-701a-4a70-9c44-5f99d4c6fe79-serving-cert\") pod \"openshift-config-operator-7777fb866f-2zrvx\" (UID: \"2dce229d-701a-4a70-9c44-5f99d4c6fe79\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295444 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47aff328-d38c-426c-8462-12c6b98a82fd-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jl6jn\" (UID: \"47aff328-d38c-426c-8462-12c6b98a82fd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jl6jn"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295467 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295514 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295537 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/391ef260-a6ea-4cab-bca3-280435898381-node-pullsecrets\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295558 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/69bb010a-a13d-4458-8118-80c5aebb6e65-encryption-config\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295582 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2dce229d-701a-4a70-9c44-5f99d4c6fe79-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2zrvx\" (UID: \"2dce229d-701a-4a70-9c44-5f99d4c6fe79\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295605 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8521b3b0-4fb2-45b2-90b5-7080e766aafa-config\") pod \"controller-manager-879f6c89f-mtrbp\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295628 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295654 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97449\" (UniqueName: \"kubernetes.io/projected/47aff328-d38c-426c-8462-12c6b98a82fd-kube-api-access-97449\") pod \"openshift-apiserver-operator-796bbdcf4f-jl6jn\" (UID: \"47aff328-d38c-426c-8462-12c6b98a82fd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jl6jn"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295675 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/391ef260-a6ea-4cab-bca3-280435898381-encryption-config\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295694 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b1fe291e-3490-49c0-9443-e5b0f03db19c-audit-dir\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295715 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lg7ck\" (UniqueName: \"kubernetes.io/projected/5f512921-f02c-464b-af06-d65fb95f0071-kube-api-access-lg7ck\") pod \"console-operator-58897d9998-4bk2w\" (UID: \"5f512921-f02c-464b-af06-d65fb95f0071\") " pod="openshift-console-operator/console-operator-58897d9998-4bk2w"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295736 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/330a4463-669f-4f2e-aa4b-614ab0654579-trusted-ca\") pod \"ingress-operator-5b745b69d9-8hngw\" (UID: \"330a4463-669f-4f2e-aa4b-614ab0654579\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295769 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d16677da-48c3-4fd4-9e59-b0013daa4825-auth-proxy-config\") pod \"machine-approver-56656f9798-7qt22\" (UID: \"d16677da-48c3-4fd4-9e59-b0013daa4825\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7qt22"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295791 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx7wc\" (UniqueName: \"kubernetes.io/projected/8521b3b0-4fb2-45b2-90b5-7080e766aafa-kube-api-access-vx7wc\") pod \"controller-manager-879f6c89f-mtrbp\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295814 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr5bs\" (UniqueName: \"kubernetes.io/projected/b1fe291e-3490-49c0-9443-e5b0f03db19c-kube-api-access-jr5bs\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295835 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5dlv\" (UniqueName: \"kubernetes.io/projected/0bf18c8f-c77f-4208-b464-19772b0221f4-kube-api-access-l5dlv\") pod \"openshift-controller-manager-operator-756b6f6bc6-tdnm6\" (UID: \"0bf18c8f-c77f-4208-b464-19772b0221f4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdnm6"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295863 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/391ef260-a6ea-4cab-bca3-280435898381-serving-cert\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295907 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/391ef260-a6ea-4cab-bca3-280435898381-audit-dir\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.295976 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296010 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296033 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f512921-f02c-464b-af06-d65fb95f0071-serving-cert\") pod \"console-operator-58897d9998-4bk2w\" (UID: \"5f512921-f02c-464b-af06-d65fb95f0071\") " pod="openshift-console-operator/console-operator-58897d9998-4bk2w"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296068 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxv7v\" (UniqueName: \"kubernetes.io/projected/322233fd-b71f-4ef5-931f-58e98326386a-kube-api-access-xxv7v\") pod \"authentication-operator-69f744f599-c4gcj\" (UID: \"322233fd-b71f-4ef5-931f-58e98326386a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296126 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296155 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2gv9\" (UniqueName: \"kubernetes.io/projected/d16677da-48c3-4fd4-9e59-b0013daa4825-kube-api-access-r2gv9\") pod \"machine-approver-56656f9798-7qt22\" (UID: \"d16677da-48c3-4fd4-9e59-b0013daa4825\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7qt22"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296192 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a4e363de-fd5c-4f76-8943-ae3c56f3765b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296219 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/391ef260-a6ea-4cab-bca3-280435898381-audit\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296239 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lzxb\" (UniqueName: \"kubernetes.io/projected/391ef260-a6ea-4cab-bca3-280435898381-kube-api-access-6lzxb\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296265 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8521b3b0-4fb2-45b2-90b5-7080e766aafa-serving-cert\") pod \"controller-manager-879f6c89f-mtrbp\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296286 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/322233fd-b71f-4ef5-931f-58e98326386a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-c4gcj\" (UID: \"322233fd-b71f-4ef5-931f-58e98326386a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296369 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69bb010a-a13d-4458-8118-80c5aebb6e65-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296526 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4e363de-fd5c-4f76-8943-ae3c56f3765b-trusted-ca\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296581 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw"
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296656 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296680 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296747 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a4e363de-fd5c-4f76-8943-ae3c56f3765b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296770 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/69bb010a-a13d-4458-8118-80c5aebb6e65-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296790 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296817 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f512921-f02c-464b-af06-d65fb95f0071-config\") pod \"console-operator-58897d9998-4bk2w\" (UID: \"5f512921-f02c-464b-af06-d65fb95f0071\") " pod="openshift-console-operator/console-operator-58897d9998-4bk2w" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296838 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f512921-f02c-464b-af06-d65fb95f0071-trusted-ca\") pod \"console-operator-58897d9998-4bk2w\" (UID: \"5f512921-f02c-464b-af06-d65fb95f0071\") " pod="openshift-console-operator/console-operator-58897d9998-4bk2w" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296858 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dq4gn\" (UniqueName: \"kubernetes.io/projected/95609887-af52-4179-88e3-7f3730642377-kube-api-access-dq4gn\") pod \"cluster-image-registry-operator-dc59b4c8b-46h7s\" (UID: \"95609887-af52-4179-88e3-7f3730642377\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296891 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69bb010a-a13d-4458-8118-80c5aebb6e65-serving-cert\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296948 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2aab7bd5-eca7-4faa-a4fc-d83be6cc6c3c-metrics-tls\") pod \"dns-operator-744455d44c-5dqnp\" (UID: \"2aab7bd5-eca7-4faa-a4fc-d83be6cc6c3c\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-5dqnp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.296993 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4e363de-fd5c-4f76-8943-ae3c56f3765b-registry-tls\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.297016 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/391ef260-a6ea-4cab-bca3-280435898381-etcd-serving-ca\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.297036 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/69bb010a-a13d-4458-8118-80c5aebb6e65-audit-dir\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.297093 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpb8n\" (UniqueName: \"kubernetes.io/projected/2aab7bd5-eca7-4faa-a4fc-d83be6cc6c3c-kube-api-access-lpb8n\") pod \"dns-operator-744455d44c-5dqnp\" (UID: \"2aab7bd5-eca7-4faa-a4fc-d83be6cc6c3c\") " pod="openshift-dns-operator/dns-operator-744455d44c-5dqnp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.297116 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dr4d\" (UniqueName: 
\"kubernetes.io/projected/330a4463-669f-4f2e-aa4b-614ab0654579-kube-api-access-4dr4d\") pod \"ingress-operator-5b745b69d9-8hngw\" (UID: \"330a4463-669f-4f2e-aa4b-614ab0654579\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw" Mar 20 13:24:45 crc kubenswrapper[4973]: E0320 13:24:45.297709 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:45.797688999 +0000 UTC m=+206.541358753 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298134 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a4e363de-fd5c-4f76-8943-ae3c56f3765b-registry-certificates\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298170 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/69bb010a-a13d-4458-8118-80c5aebb6e65-etcd-client\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298194 4973 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d16677da-48c3-4fd4-9e59-b0013daa4825-machine-approver-tls\") pod \"machine-approver-56656f9798-7qt22\" (UID: \"d16677da-48c3-4fd4-9e59-b0013daa4825\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7qt22" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298231 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/322233fd-b71f-4ef5-931f-58e98326386a-config\") pod \"authentication-operator-69f744f599-c4gcj\" (UID: \"322233fd-b71f-4ef5-931f-58e98326386a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298282 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf18c8f-c77f-4208-b464-19772b0221f4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tdnm6\" (UID: \"0bf18c8f-c77f-4208-b464-19772b0221f4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdnm6" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298312 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/322233fd-b71f-4ef5-931f-58e98326386a-serving-cert\") pod \"authentication-operator-69f744f599-c4gcj\" (UID: \"322233fd-b71f-4ef5-931f-58e98326386a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298479 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjt9x\" (UniqueName: 
\"kubernetes.io/projected/2dce229d-701a-4a70-9c44-5f99d4c6fe79-kube-api-access-gjt9x\") pod \"openshift-config-operator-7777fb866f-2zrvx\" (UID: \"2dce229d-701a-4a70-9c44-5f99d4c6fe79\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298509 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298529 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/322233fd-b71f-4ef5-931f-58e98326386a-service-ca-bundle\") pod \"authentication-operator-69f744f599-c4gcj\" (UID: \"322233fd-b71f-4ef5-931f-58e98326386a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298558 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95609887-af52-4179-88e3-7f3730642377-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-46h7s\" (UID: \"95609887-af52-4179-88e3-7f3730642377\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298578 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-template-provider-selection\") pod 
\"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298666 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95609887-af52-4179-88e3-7f3730642377-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-46h7s\" (UID: \"95609887-af52-4179-88e3-7f3730642377\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298706 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz7fv\" (UniqueName: \"kubernetes.io/projected/a4e363de-fd5c-4f76-8943-ae3c56f3765b-kube-api-access-xz7fv\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298725 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/391ef260-a6ea-4cab-bca3-280435898381-config\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298742 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/95609887-af52-4179-88e3-7f3730642377-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-46h7s\" (UID: \"95609887-af52-4179-88e3-7f3730642377\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298767 
4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/391ef260-a6ea-4cab-bca3-280435898381-trusted-ca-bundle\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298785 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8521b3b0-4fb2-45b2-90b5-7080e766aafa-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mtrbp\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298814 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/391ef260-a6ea-4cab-bca3-280435898381-image-import-ca\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298829 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/69bb010a-a13d-4458-8118-80c5aebb6e65-audit-policies\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298852 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d16677da-48c3-4fd4-9e59-b0013daa4825-config\") pod \"machine-approver-56656f9798-7qt22\" (UID: \"d16677da-48c3-4fd4-9e59-b0013daa4825\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7qt22" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298868 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bf18c8f-c77f-4208-b464-19772b0221f4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tdnm6\" (UID: \"0bf18c8f-c77f-4208-b464-19772b0221f4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdnm6" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298883 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/330a4463-669f-4f2e-aa4b-614ab0654579-metrics-tls\") pod \"ingress-operator-5b745b69d9-8hngw\" (UID: \"330a4463-669f-4f2e-aa4b-614ab0654579\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298900 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkmln\" (UniqueName: \"kubernetes.io/projected/69bb010a-a13d-4458-8118-80c5aebb6e65-kube-api-access-zkmln\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298923 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8521b3b0-4fb2-45b2-90b5-7080e766aafa-client-ca\") pod \"controller-manager-879f6c89f-mtrbp\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298957 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-audit-policies\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298975 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/330a4463-669f-4f2e-aa4b-614ab0654579-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8hngw\" (UID: \"330a4463-669f-4f2e-aa4b-614ab0654579\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.298996 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4e363de-fd5c-4f76-8943-ae3c56f3765b-bound-sa-token\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.299012 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47aff328-d38c-426c-8462-12c6b98a82fd-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jl6jn\" (UID: \"47aff328-d38c-426c-8462-12c6b98a82fd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jl6jn" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.338237 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-k7krj"] Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.378126 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xt4hf"] Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.399601 4973 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.399805 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.399827 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f512921-f02c-464b-af06-d65fb95f0071-serving-cert\") pod \"console-operator-58897d9998-4bk2w\" (UID: \"5f512921-f02c-464b-af06-d65fb95f0071\") " pod="openshift-console-operator/console-operator-58897d9998-4bk2w" Mar 20 13:24:45 crc kubenswrapper[4973]: E0320 13:24:45.399858 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:45.899830537 +0000 UTC m=+206.643500281 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.399896 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6847acbc-29ec-4939-a6aa-4617b8e438e7-cert\") pod \"ingress-canary-xdmgm\" (UID: \"6847acbc-29ec-4939-a6aa-4617b8e438e7\") " pod="openshift-ingress-canary/ingress-canary-xdmgm" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.399935 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/17fc0166-9183-4bbd-a091-644b431349e1-plugins-dir\") pod \"csi-hostpathplugin-mpgxt\" (UID: \"17fc0166-9183-4bbd-a091-644b431349e1\") " pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.399961 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2gv9\" (UniqueName: \"kubernetes.io/projected/d16677da-48c3-4fd4-9e59-b0013daa4825-kube-api-access-r2gv9\") pod \"machine-approver-56656f9798-7qt22\" (UID: \"d16677da-48c3-4fd4-9e59-b0013daa4825\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7qt22" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.399978 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/391ef260-a6ea-4cab-bca3-280435898381-audit\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " 
pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.399993 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lzxb\" (UniqueName: \"kubernetes.io/projected/391ef260-a6ea-4cab-bca3-280435898381-kube-api-access-6lzxb\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400010 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8521b3b0-4fb2-45b2-90b5-7080e766aafa-serving-cert\") pod \"controller-manager-879f6c89f-mtrbp\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400027 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a4e363de-fd5c-4f76-8943-ae3c56f3765b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400048 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a86206-8505-4c7d-82aa-2b482c0eb08b-serving-cert\") pod \"service-ca-operator-777779d784-56g5d\" (UID: \"94a86206-8505-4c7d-82aa-2b482c0eb08b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-56g5d" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400064 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e-apiservice-cert\") pod \"packageserver-d55dfcdfc-z2rch\" (UID: \"925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400081 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4e363de-fd5c-4f76-8943-ae3c56f3765b-trusted-ca\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400098 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/17fc0166-9183-4bbd-a091-644b431349e1-registration-dir\") pod \"csi-hostpathplugin-mpgxt\" (UID: \"17fc0166-9183-4bbd-a091-644b431349e1\") " pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400126 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400143 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 
13:24:45.400159 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e880f23-4bef-4e96-bf00-c94dc4551c5a-metrics-certs\") pod \"router-default-5444994796-wgtgl\" (UID: \"0e880f23-4bef-4e96-bf00-c94dc4551c5a\") " pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400176 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/69bb010a-a13d-4458-8118-80c5aebb6e65-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400193 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400208 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f512921-f02c-464b-af06-d65fb95f0071-config\") pod \"console-operator-58897d9998-4bk2w\" (UID: \"5f512921-f02c-464b-af06-d65fb95f0071\") " pod="openshift-console-operator/console-operator-58897d9998-4bk2w" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400226 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f512921-f02c-464b-af06-d65fb95f0071-trusted-ca\") pod \"console-operator-58897d9998-4bk2w\" (UID: \"5f512921-f02c-464b-af06-d65fb95f0071\") " 
pod="openshift-console-operator/console-operator-58897d9998-4bk2w" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400242 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dq4gn\" (UniqueName: \"kubernetes.io/projected/95609887-af52-4179-88e3-7f3730642377-kube-api-access-dq4gn\") pod \"cluster-image-registry-operator-dc59b4c8b-46h7s\" (UID: \"95609887-af52-4179-88e3-7f3730642377\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400264 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt9hg\" (UniqueName: \"kubernetes.io/projected/94a86206-8505-4c7d-82aa-2b482c0eb08b-kube-api-access-lt9hg\") pod \"service-ca-operator-777779d784-56g5d\" (UID: \"94a86206-8505-4c7d-82aa-2b482c0eb08b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-56g5d" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400282 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb200682-a8ee-406a-9c09-d881f40842e7-metrics-tls\") pod \"dns-default-6v2ms\" (UID: \"cb200682-a8ee-406a-9c09-d881f40842e7\") " pod="openshift-dns/dns-default-6v2ms" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400299 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4e363de-fd5c-4f76-8943-ae3c56f3765b-registry-tls\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400314 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/391ef260-a6ea-4cab-bca3-280435898381-etcd-serving-ca\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400330 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lpb8n\" (UniqueName: \"kubernetes.io/projected/2aab7bd5-eca7-4faa-a4fc-d83be6cc6c3c-kube-api-access-lpb8n\") pod \"dns-operator-744455d44c-5dqnp\" (UID: \"2aab7bd5-eca7-4faa-a4fc-d83be6cc6c3c\") " pod="openshift-dns-operator/dns-operator-744455d44c-5dqnp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400371 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d16677da-48c3-4fd4-9e59-b0013daa4825-machine-approver-tls\") pod \"machine-approver-56656f9798-7qt22\" (UID: \"d16677da-48c3-4fd4-9e59-b0013daa4825\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7qt22" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400389 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/83d5e1a7-2827-4dfd-9feb-3b9630a62515-etcd-ca\") pod \"etcd-operator-b45778765-mhgwt\" (UID: \"83d5e1a7-2827-4dfd-9feb-3b9630a62515\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400406 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5e47b273-3bc0-4f32-8bdb-aa283db4d8a1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-kz2ff\" (UID: \"5e47b273-3bc0-4f32-8bdb-aa283db4d8a1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400433 4973 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a4e363de-fd5c-4f76-8943-ae3c56f3765b-registry-certificates\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400451 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b0cbdcce-514f-4b72-8c8c-17029b7217a8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-l7crv\" (UID: \"b0cbdcce-514f-4b72-8c8c-17029b7217a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400467 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5-srv-cert\") pod \"catalog-operator-68c6474976-4ddcl\" (UID: \"6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400484 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb200682-a8ee-406a-9c09-d881f40842e7-config-volume\") pod \"dns-default-6v2ms\" (UID: \"cb200682-a8ee-406a-9c09-d881f40842e7\") " pod="openshift-dns/dns-default-6v2ms" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400501 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjt9x\" (UniqueName: \"kubernetes.io/projected/2dce229d-701a-4a70-9c44-5f99d4c6fe79-kube-api-access-gjt9x\") pod \"openshift-config-operator-7777fb866f-2zrvx\" (UID: 
\"2dce229d-701a-4a70-9c44-5f99d4c6fe79\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400520 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400537 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/322233fd-b71f-4ef5-931f-58e98326386a-service-ca-bundle\") pod \"authentication-operator-69f744f599-c4gcj\" (UID: \"322233fd-b71f-4ef5-931f-58e98326386a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400556 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400571 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/17fc0166-9183-4bbd-a091-644b431349e1-socket-dir\") pod \"csi-hostpathplugin-mpgxt\" (UID: \"17fc0166-9183-4bbd-a091-644b431349e1\") " pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400589 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e-tmpfs\") pod \"packageserver-d55dfcdfc-z2rch\" (UID: \"925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 13:24:45 crc kubenswrapper[4973]: E0320 13:24:45.400614 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:45.90059541 +0000 UTC m=+206.644265154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400640 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e-webhook-cert\") pod \"packageserver-d55dfcdfc-z2rch\" (UID: \"925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400661 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e880f23-4bef-4e96-bf00-c94dc4551c5a-service-ca-bundle\") pod \"router-default-5444994796-wgtgl\" (UID: \"0e880f23-4bef-4e96-bf00-c94dc4551c5a\") " pod="openshift-ingress/router-default-5444994796-wgtgl" 
Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400692 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/95609887-af52-4179-88e3-7f3730642377-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-46h7s\" (UID: \"95609887-af52-4179-88e3-7f3730642377\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400710 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0cbdcce-514f-4b72-8c8c-17029b7217a8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-l7crv\" (UID: \"b0cbdcce-514f-4b72-8c8c-17029b7217a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400736 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/391ef260-a6ea-4cab-bca3-280435898381-config\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400754 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8521b3b0-4fb2-45b2-90b5-7080e766aafa-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mtrbp\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400771 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/ef043a74-5704-48d8-abc6-4a1afef82b9c-node-bootstrap-token\") pod \"machine-config-server-xd95h\" (UID: \"ef043a74-5704-48d8-abc6-4a1afef82b9c\") " pod="openshift-machine-config-operator/machine-config-server-xd95h" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400787 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/391ef260-a6ea-4cab-bca3-280435898381-image-import-ca\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400802 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/330a4463-669f-4f2e-aa4b-614ab0654579-metrics-tls\") pod \"ingress-operator-5b745b69d9-8hngw\" (UID: \"330a4463-669f-4f2e-aa4b-614ab0654579\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400851 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9d6f338-6a7a-4ba2-b1c9-12485bb30937-config-volume\") pod \"collect-profiles-29566875-2dssd\" (UID: \"b9d6f338-6a7a-4ba2-b1c9-12485bb30937\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400868 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/330a4463-669f-4f2e-aa4b-614ab0654579-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8hngw\" (UID: \"330a4463-669f-4f2e-aa4b-614ab0654579\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400883 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m92n\" (UniqueName: \"kubernetes.io/projected/cb200682-a8ee-406a-9c09-d881f40842e7-kube-api-access-7m92n\") pod \"dns-default-6v2ms\" (UID: \"cb200682-a8ee-406a-9c09-d881f40842e7\") " pod="openshift-dns/dns-default-6v2ms" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400900 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j55fz\" (UniqueName: \"kubernetes.io/projected/b22a43e3-90de-4609-bf64-006de1716ae3-kube-api-access-j55fz\") pod \"control-plane-machine-set-operator-78cbb6b69f-wkc4c\" (UID: \"b22a43e3-90de-4609-bf64-006de1716ae3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wkc4c" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400916 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83d5e1a7-2827-4dfd-9feb-3b9630a62515-config\") pod \"etcd-operator-b45778765-mhgwt\" (UID: \"83d5e1a7-2827-4dfd-9feb-3b9630a62515\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400932 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxnnr\" (UniqueName: \"kubernetes.io/projected/a56cf4d4-faa0-469e-b856-d1c030dd19d9-kube-api-access-mxnnr\") pod \"service-ca-9c57cc56f-8w6wp\" (UID: \"a56cf4d4-faa0-469e-b856-d1c030dd19d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-8w6wp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400947 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47aff328-d38c-426c-8462-12c6b98a82fd-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jl6jn\" (UID: \"47aff328-d38c-426c-8462-12c6b98a82fd\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jl6jn" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400964 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4e363de-fd5c-4f76-8943-ae3c56f3765b-bound-sa-token\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400982 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w8bf7\" (UID: \"bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8bf7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.400999 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/391ef260-a6ea-4cab-bca3-280435898381-etcd-client\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401016 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dce229d-701a-4a70-9c44-5f99d4c6fe79-serving-cert\") pod \"openshift-config-operator-7777fb866f-2zrvx\" (UID: \"2dce229d-701a-4a70-9c44-5f99d4c6fe79\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401032 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401050 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh8zf\" (UniqueName: \"kubernetes.io/projected/b0cbdcce-514f-4b72-8c8c-17029b7217a8-kube-api-access-vh8zf\") pod \"marketplace-operator-79b997595-l7crv\" (UID: \"b0cbdcce-514f-4b72-8c8c-17029b7217a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401067 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401083 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/69bb010a-a13d-4458-8118-80c5aebb6e65-encryption-config\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401098 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401119 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/94e24fab-c452-4121-a617-9d5f02b8ba1b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmxvj\" (UID: \"94e24fab-c452-4121-a617-9d5f02b8ba1b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmxvj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401137 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/391ef260-a6ea-4cab-bca3-280435898381-node-pullsecrets\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401153 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-97449\" (UniqueName: \"kubernetes.io/projected/47aff328-d38c-426c-8462-12c6b98a82fd-kube-api-access-97449\") pod \"openshift-apiserver-operator-796bbdcf4f-jl6jn\" (UID: \"47aff328-d38c-426c-8462-12c6b98a82fd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jl6jn" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401169 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b1fe291e-3490-49c0-9443-e5b0f03db19c-audit-dir\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401187 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/b9d6f338-6a7a-4ba2-b1c9-12485bb30937-secret-volume\") pod \"collect-profiles-29566875-2dssd\" (UID: \"b9d6f338-6a7a-4ba2-b1c9-12485bb30937\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401184 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/69bb010a-a13d-4458-8118-80c5aebb6e65-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401203 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94e24fab-c452-4121-a617-9d5f02b8ba1b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmxvj\" (UID: \"94e24fab-c452-4121-a617-9d5f02b8ba1b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmxvj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401279 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb5tr\" (UniqueName: \"kubernetes.io/projected/596b91f0-06b9-4e89-9816-da8049dad9e3-kube-api-access-tb5tr\") pod \"machine-config-operator-74547568cd-hvdx7\" (UID: \"596b91f0-06b9-4e89-9816-da8049dad9e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401316 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx7wc\" (UniqueName: \"kubernetes.io/projected/8521b3b0-4fb2-45b2-90b5-7080e766aafa-kube-api-access-vx7wc\") pod \"controller-manager-879f6c89f-mtrbp\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401380 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9aeba530-5722-4f41-9082-3f9316f06505-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wgmzv\" (UID: \"9aeba530-5722-4f41-9082-3f9316f06505\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgmzv" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401408 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37071a22-82fc-4b04-bc09-6535395faae6-config\") pod \"kube-controller-manager-operator-78b949d7b-7qwpx\" (UID: \"37071a22-82fc-4b04-bc09-6535395faae6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7qwpx" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401432 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a56cf4d4-faa0-469e-b856-d1c030dd19d9-signing-key\") pod \"service-ca-9c57cc56f-8w6wp\" (UID: \"a56cf4d4-faa0-469e-b856-d1c030dd19d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-8w6wp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401462 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/391ef260-a6ea-4cab-bca3-280435898381-audit\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401476 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/391ef260-a6ea-4cab-bca3-280435898381-serving-cert\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401499 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fca0553c-03be-49c4-ba2a-dced5bc62586-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-26mgc\" (UID: \"fca0553c-03be-49c4-ba2a-dced5bc62586\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-26mgc" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401521 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37071a22-82fc-4b04-bc09-6535395faae6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7qwpx\" (UID: \"37071a22-82fc-4b04-bc09-6535395faae6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7qwpx" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401544 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/596b91f0-06b9-4e89-9816-da8049dad9e3-images\") pod \"machine-config-operator-74547568cd-hvdx7\" (UID: \"596b91f0-06b9-4e89-9816-da8049dad9e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401570 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401596 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxv7v\" (UniqueName: \"kubernetes.io/projected/322233fd-b71f-4ef5-931f-58e98326386a-kube-api-access-xxv7v\") pod \"authentication-operator-69f744f599-c4gcj\" (UID: \"322233fd-b71f-4ef5-931f-58e98326386a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401623 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401649 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99dkz\" (UniqueName: \"kubernetes.io/projected/0e880f23-4bef-4e96-bf00-c94dc4551c5a-kube-api-access-99dkz\") pod \"router-default-5444994796-wgtgl\" (UID: \"0e880f23-4bef-4e96-bf00-c94dc4551c5a\") " pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401693 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/322233fd-b71f-4ef5-931f-58e98326386a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-c4gcj\" (UID: \"322233fd-b71f-4ef5-931f-58e98326386a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401715 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/9aeba530-5722-4f41-9082-3f9316f06505-config\") pod \"kube-apiserver-operator-766d6c64bb-wgmzv\" (UID: \"9aeba530-5722-4f41-9082-3f9316f06505\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgmzv" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401736 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jct5c\" (UniqueName: \"kubernetes.io/projected/66c28cfe-8a0b-459a-bbab-59053fe226b8-kube-api-access-jct5c\") pod \"auto-csr-approver-29566884-j8dqd\" (UID: \"66c28cfe-8a0b-459a-bbab-59053fe226b8\") " pod="openshift-infra/auto-csr-approver-29566884-j8dqd" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401793 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5f512921-f02c-464b-af06-d65fb95f0071-trusted-ca\") pod \"console-operator-58897d9998-4bk2w\" (UID: \"5f512921-f02c-464b-af06-d65fb95f0071\") " pod="openshift-console-operator/console-operator-58897d9998-4bk2w" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401802 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69bb010a-a13d-4458-8118-80c5aebb6e65-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401859 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d6d6c1c-878b-47a0-9475-ac3ec45c17b0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ljb4c\" (UID: \"4d6d6c1c-878b-47a0-9475-ac3ec45c17b0\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljb4c" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401884 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzvrj\" (UniqueName: \"kubernetes.io/projected/6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5-kube-api-access-mzvrj\") pod \"catalog-operator-68c6474976-4ddcl\" (UID: \"6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401937 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5e47b273-3bc0-4f32-8bdb-aa283db4d8a1-srv-cert\") pod \"olm-operator-6b444d44fb-kz2ff\" (UID: \"5e47b273-3bc0-4f32-8bdb-aa283db4d8a1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401961 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.401978 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5-profile-collector-cert\") pod \"catalog-operator-68c6474976-4ddcl\" (UID: \"6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402032 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/596b91f0-06b9-4e89-9816-da8049dad9e3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hvdx7\" (UID: \"596b91f0-06b9-4e89-9816-da8049dad9e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402048 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/83d5e1a7-2827-4dfd-9feb-3b9630a62515-etcd-client\") pod \"etcd-operator-b45778765-mhgwt\" (UID: \"83d5e1a7-2827-4dfd-9feb-3b9630a62515\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402066 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a4e363de-fd5c-4f76-8943-ae3c56f3765b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402085 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xmgd\" (UniqueName: \"kubernetes.io/projected/4d6d6c1c-878b-47a0-9475-ac3ec45c17b0-kube-api-access-9xmgd\") pod \"kube-storage-version-migrator-operator-b67b599dd-ljb4c\" (UID: \"4d6d6c1c-878b-47a0-9475-ac3ec45c17b0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljb4c" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402105 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2aab7bd5-eca7-4faa-a4fc-d83be6cc6c3c-metrics-tls\") pod \"dns-operator-744455d44c-5dqnp\" 
(UID: \"2aab7bd5-eca7-4faa-a4fc-d83be6cc6c3c\") " pod="openshift-dns-operator/dns-operator-744455d44c-5dqnp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402124 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2x56\" (UniqueName: \"kubernetes.io/projected/6847acbc-29ec-4939-a6aa-4617b8e438e7-kube-api-access-g2x56\") pod \"ingress-canary-xdmgm\" (UID: \"6847acbc-29ec-4939-a6aa-4617b8e438e7\") " pod="openshift-ingress-canary/ingress-canary-xdmgm" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402144 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pn6xj\" (UniqueName: \"kubernetes.io/projected/ef043a74-5704-48d8-abc6-4a1afef82b9c-kube-api-access-pn6xj\") pod \"machine-config-server-xd95h\" (UID: \"ef043a74-5704-48d8-abc6-4a1afef82b9c\") " pod="openshift-machine-config-operator/machine-config-server-xd95h" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402164 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69bb010a-a13d-4458-8118-80c5aebb6e65-serving-cert\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402184 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/69bb010a-a13d-4458-8118-80c5aebb6e65-audit-dir\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402199 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dr4d\" (UniqueName: 
\"kubernetes.io/projected/330a4463-669f-4f2e-aa4b-614ab0654579-kube-api-access-4dr4d\") pod \"ingress-operator-5b745b69d9-8hngw\" (UID: \"330a4463-669f-4f2e-aa4b-614ab0654579\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402216 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xn5g\" (UniqueName: \"kubernetes.io/projected/b9d6f338-6a7a-4ba2-b1c9-12485bb30937-kube-api-access-4xn5g\") pod \"collect-profiles-29566875-2dssd\" (UID: \"b9d6f338-6a7a-4ba2-b1c9-12485bb30937\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402235 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/17fc0166-9183-4bbd-a091-644b431349e1-csi-data-dir\") pod \"csi-hostpathplugin-mpgxt\" (UID: \"17fc0166-9183-4bbd-a091-644b431349e1\") " pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402239 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/69bb010a-a13d-4458-8118-80c5aebb6e65-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402253 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/69bb010a-a13d-4458-8118-80c5aebb6e65-etcd-client\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402288 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qw7n\" (UniqueName: \"kubernetes.io/projected/3c73e93c-a062-4361-bc15-95eb55598666-kube-api-access-9qw7n\") pod \"migrator-59844c95c7-zg96g\" (UID: \"3c73e93c-a062-4361-bc15-95eb55598666\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zg96g" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402316 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/322233fd-b71f-4ef5-931f-58e98326386a-config\") pod \"authentication-operator-69f744f599-c4gcj\" (UID: \"322233fd-b71f-4ef5-931f-58e98326386a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402365 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf18c8f-c77f-4208-b464-19772b0221f4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tdnm6\" (UID: \"0bf18c8f-c77f-4208-b464-19772b0221f4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdnm6" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402391 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/322233fd-b71f-4ef5-931f-58e98326386a-serving-cert\") pod \"authentication-operator-69f744f599-c4gcj\" (UID: \"322233fd-b71f-4ef5-931f-58e98326386a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402414 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9aeba530-5722-4f41-9082-3f9316f06505-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wgmzv\" (UID: 
\"9aeba530-5722-4f41-9082-3f9316f06505\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgmzv" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402433 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f512921-f02c-464b-af06-d65fb95f0071-config\") pod \"console-operator-58897d9998-4bk2w\" (UID: \"5f512921-f02c-464b-af06-d65fb95f0071\") " pod="openshift-console-operator/console-operator-58897d9998-4bk2w" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402478 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhqn4\" (UniqueName: \"kubernetes.io/projected/17fc0166-9183-4bbd-a091-644b431349e1-kube-api-access-nhqn4\") pod \"csi-hostpathplugin-mpgxt\" (UID: \"17fc0166-9183-4bbd-a091-644b431349e1\") " pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402555 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d6d6c1c-878b-47a0-9475-ac3ec45c17b0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ljb4c\" (UID: \"4d6d6c1c-878b-47a0-9475-ac3ec45c17b0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljb4c" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402826 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79nf2\" (UniqueName: \"kubernetes.io/projected/bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4-kube-api-access-79nf2\") pod \"machine-config-controller-84d6567774-w8bf7\" (UID: \"bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8bf7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402883 4973 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a56cf4d4-faa0-469e-b856-d1c030dd19d9-signing-cabundle\") pod \"service-ca-9c57cc56f-8w6wp\" (UID: \"a56cf4d4-faa0-469e-b856-d1c030dd19d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-8w6wp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402926 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/05c127a2-f6b5-4d71-8646-e29396ea7971-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bw9k5\" (UID: \"05c127a2-f6b5-4d71-8646-e29396ea7971\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bw9k5" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402974 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/596b91f0-06b9-4e89-9816-da8049dad9e3-proxy-tls\") pod \"machine-config-operator-74547568cd-hvdx7\" (UID: \"596b91f0-06b9-4e89-9816-da8049dad9e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.402997 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95609887-af52-4179-88e3-7f3730642377-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-46h7s\" (UID: \"95609887-af52-4179-88e3-7f3730642377\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403080 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95609887-af52-4179-88e3-7f3730642377-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-46h7s\" (UID: \"95609887-af52-4179-88e3-7f3730642377\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403109 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krg4j\" (UniqueName: \"kubernetes.io/projected/fca0553c-03be-49c4-ba2a-dced5bc62586-kube-api-access-krg4j\") pod \"multus-admission-controller-857f4d67dd-26mgc\" (UID: \"fca0553c-03be-49c4-ba2a-dced5bc62586\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-26mgc" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403156 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37071a22-82fc-4b04-bc09-6535395faae6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7qwpx\" (UID: \"37071a22-82fc-4b04-bc09-6535395faae6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7qwpx" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403181 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knqbb\" (UniqueName: \"kubernetes.io/projected/05c127a2-f6b5-4d71-8646-e29396ea7971-kube-api-access-knqbb\") pod \"package-server-manager-789f6589d5-bw9k5\" (UID: \"05c127a2-f6b5-4d71-8646-e29396ea7971\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bw9k5" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403203 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e24fab-c452-4121-a617-9d5f02b8ba1b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmxvj\" (UID: \"94e24fab-c452-4121-a617-9d5f02b8ba1b\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmxvj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403235 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz7fv\" (UniqueName: \"kubernetes.io/projected/a4e363de-fd5c-4f76-8943-ae3c56f3765b-kube-api-access-xz7fv\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403262 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0e880f23-4bef-4e96-bf00-c94dc4551c5a-stats-auth\") pod \"router-default-5444994796-wgtgl\" (UID: \"0e880f23-4bef-4e96-bf00-c94dc4551c5a\") " pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403286 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/69bb010a-a13d-4458-8118-80c5aebb6e65-audit-dir\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403290 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/391ef260-a6ea-4cab-bca3-280435898381-trusted-ca-bundle\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403400 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0e880f23-4bef-4e96-bf00-c94dc4551c5a-default-certificate\") pod 
\"router-default-5444994796-wgtgl\" (UID: \"0e880f23-4bef-4e96-bf00-c94dc4551c5a\") " pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403435 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/69bb010a-a13d-4458-8118-80c5aebb6e65-audit-policies\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403445 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a4e363de-fd5c-4f76-8943-ae3c56f3765b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403462 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d16677da-48c3-4fd4-9e59-b0013daa4825-config\") pod \"machine-approver-56656f9798-7qt22\" (UID: \"d16677da-48c3-4fd4-9e59-b0013daa4825\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7qt22" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403480 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bf18c8f-c77f-4208-b464-19772b0221f4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tdnm6\" (UID: \"0bf18c8f-c77f-4208-b464-19772b0221f4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdnm6" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403501 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" 
(UniqueName: \"kubernetes.io/host-path/17fc0166-9183-4bbd-a091-644b431349e1-mountpoint-dir\") pod \"csi-hostpathplugin-mpgxt\" (UID: \"17fc0166-9183-4bbd-a091-644b431349e1\") " pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403531 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8521b3b0-4fb2-45b2-90b5-7080e766aafa-client-ca\") pod \"controller-manager-879f6c89f-mtrbp\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403550 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-audit-policies\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403566 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a86206-8505-4c7d-82aa-2b482c0eb08b-config\") pod \"service-ca-operator-777779d784-56g5d\" (UID: \"94a86206-8505-4c7d-82aa-2b482c0eb08b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-56g5d" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403596 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkmln\" (UniqueName: \"kubernetes.io/projected/69bb010a-a13d-4458-8118-80c5aebb6e65-kube-api-access-zkmln\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403613 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4-proxy-tls\") pod \"machine-config-controller-84d6567774-w8bf7\" (UID: \"bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8bf7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403631 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b22a43e3-90de-4609-bf64-006de1716ae3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wkc4c\" (UID: \"b22a43e3-90de-4609-bf64-006de1716ae3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wkc4c" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403677 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83d5e1a7-2827-4dfd-9feb-3b9630a62515-serving-cert\") pod \"etcd-operator-b45778765-mhgwt\" (UID: \"83d5e1a7-2827-4dfd-9feb-3b9630a62515\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403694 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/83d5e1a7-2827-4dfd-9feb-3b9630a62515-etcd-service-ca\") pod \"etcd-operator-b45778765-mhgwt\" (UID: \"83d5e1a7-2827-4dfd-9feb-3b9630a62515\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403710 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fbnq\" (UniqueName: \"kubernetes.io/projected/83d5e1a7-2827-4dfd-9feb-3b9630a62515-kube-api-access-2fbnq\") 
pod \"etcd-operator-b45778765-mhgwt\" (UID: \"83d5e1a7-2827-4dfd-9feb-3b9630a62515\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403817 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtkq9\" (UniqueName: \"kubernetes.io/projected/925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e-kube-api-access-vtkq9\") pod \"packageserver-d55dfcdfc-z2rch\" (UID: \"925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403838 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47aff328-d38c-426c-8462-12c6b98a82fd-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jl6jn\" (UID: \"47aff328-d38c-426c-8462-12c6b98a82fd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jl6jn" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403871 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2dce229d-701a-4a70-9c44-5f99d4c6fe79-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2zrvx\" (UID: \"2dce229d-701a-4a70-9c44-5f99d4c6fe79\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403889 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8521b3b0-4fb2-45b2-90b5-7080e766aafa-config\") pod \"controller-manager-879f6c89f-mtrbp\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403905 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ef043a74-5704-48d8-abc6-4a1afef82b9c-certs\") pod \"machine-config-server-xd95h\" (UID: \"ef043a74-5704-48d8-abc6-4a1afef82b9c\") " pod="openshift-machine-config-operator/machine-config-server-xd95h" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.403923 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kx9b\" (UniqueName: \"kubernetes.io/projected/5e47b273-3bc0-4f32-8bdb-aa283db4d8a1-kube-api-access-9kx9b\") pod \"olm-operator-6b444d44fb-kz2ff\" (UID: \"5e47b273-3bc0-4f32-8bdb-aa283db4d8a1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.404314 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/391ef260-a6ea-4cab-bca3-280435898381-trusted-ca-bundle\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.404396 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/69bb010a-a13d-4458-8118-80c5aebb6e65-audit-policies\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.404663 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8521b3b0-4fb2-45b2-90b5-7080e766aafa-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-mtrbp\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" Mar 20 13:24:45 crc 
kubenswrapper[4973]: I0320 13:24:45.404776 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d16677da-48c3-4fd4-9e59-b0013daa4825-config\") pod \"machine-approver-56656f9798-7qt22\" (UID: \"d16677da-48c3-4fd4-9e59-b0013daa4825\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7qt22" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.405290 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a4e363de-fd5c-4f76-8943-ae3c56f3765b-registry-certificates\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.405520 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/391ef260-a6ea-4cab-bca3-280435898381-node-pullsecrets\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.405528 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b1fe291e-3490-49c0-9443-e5b0f03db19c-audit-dir\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.406118 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4e363de-fd5c-4f76-8943-ae3c56f3765b-registry-tls\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc 
kubenswrapper[4973]: I0320 13:24:45.406149 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/330a4463-669f-4f2e-aa4b-614ab0654579-trusted-ca\") pod \"ingress-operator-5b745b69d9-8hngw\" (UID: \"330a4463-669f-4f2e-aa4b-614ab0654579\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.406194 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/391ef260-a6ea-4cab-bca3-280435898381-encryption-config\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.406222 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lg7ck\" (UniqueName: \"kubernetes.io/projected/5f512921-f02c-464b-af06-d65fb95f0071-kube-api-access-lg7ck\") pod \"console-operator-58897d9998-4bk2w\" (UID: \"5f512921-f02c-464b-af06-d65fb95f0071\") " pod="openshift-console-operator/console-operator-58897d9998-4bk2w" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.406253 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr5bs\" (UniqueName: \"kubernetes.io/projected/b1fe291e-3490-49c0-9443-e5b0f03db19c-kube-api-access-jr5bs\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.406280 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5dlv\" (UniqueName: \"kubernetes.io/projected/0bf18c8f-c77f-4208-b464-19772b0221f4-kube-api-access-l5dlv\") pod \"openshift-controller-manager-operator-756b6f6bc6-tdnm6\" (UID: 
\"0bf18c8f-c77f-4208-b464-19772b0221f4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdnm6" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.406308 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d16677da-48c3-4fd4-9e59-b0013daa4825-auth-proxy-config\") pod \"machine-approver-56656f9798-7qt22\" (UID: \"d16677da-48c3-4fd4-9e59-b0013daa4825\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7qt22" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.406353 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/391ef260-a6ea-4cab-bca3-280435898381-audit-dir\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.408651 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/2dce229d-701a-4a70-9c44-5f99d4c6fe79-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2zrvx\" (UID: \"2dce229d-701a-4a70-9c44-5f99d4c6fe79\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.408936 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/391ef260-a6ea-4cab-bca3-280435898381-audit-dir\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.409043 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5f512921-f02c-464b-af06-d65fb95f0071-serving-cert\") pod \"console-operator-58897d9998-4bk2w\" (UID: \"5f512921-f02c-464b-af06-d65fb95f0071\") " pod="openshift-console-operator/console-operator-58897d9998-4bk2w" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.409173 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/d16677da-48c3-4fd4-9e59-b0013daa4825-machine-approver-tls\") pod \"machine-approver-56656f9798-7qt22\" (UID: \"d16677da-48c3-4fd4-9e59-b0013daa4825\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7qt22" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.409402 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2dce229d-701a-4a70-9c44-5f99d4c6fe79-serving-cert\") pod \"openshift-config-operator-7777fb866f-2zrvx\" (UID: \"2dce229d-701a-4a70-9c44-5f99d4c6fe79\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.409426 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d16677da-48c3-4fd4-9e59-b0013daa4825-auth-proxy-config\") pod \"machine-approver-56656f9798-7qt22\" (UID: \"d16677da-48c3-4fd4-9e59-b0013daa4825\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7qt22" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.410862 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.410908 4973 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/69bb010a-a13d-4458-8118-80c5aebb6e65-serving-cert\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.411114 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/391ef260-a6ea-4cab-bca3-280435898381-etcd-serving-ca\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.411315 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/69bb010a-a13d-4458-8118-80c5aebb6e65-etcd-client\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.411519 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-audit-policies\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.416802 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4e363de-fd5c-4f76-8943-ae3c56f3765b-trusted-ca\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.418745 4973 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/391ef260-a6ea-4cab-bca3-280435898381-etcd-client\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.418863 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/47aff328-d38c-426c-8462-12c6b98a82fd-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-jl6jn\" (UID: \"47aff328-d38c-426c-8462-12c6b98a82fd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jl6jn" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.419011 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.419304 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0bf18c8f-c77f-4208-b464-19772b0221f4-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-tdnm6\" (UID: \"0bf18c8f-c77f-4208-b464-19772b0221f4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdnm6" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.419515 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.419584 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/391ef260-a6ea-4cab-bca3-280435898381-image-import-ca\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.420256 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/322233fd-b71f-4ef5-931f-58e98326386a-service-ca-bundle\") pod \"authentication-operator-69f744f599-c4gcj\" (UID: \"322233fd-b71f-4ef5-931f-58e98326386a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.420318 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/322233fd-b71f-4ef5-931f-58e98326386a-config\") pod \"authentication-operator-69f744f599-c4gcj\" (UID: \"322233fd-b71f-4ef5-931f-58e98326386a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.420822 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8521b3b0-4fb2-45b2-90b5-7080e766aafa-serving-cert\") pod \"controller-manager-879f6c89f-mtrbp\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.422292 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/330a4463-669f-4f2e-aa4b-614ab0654579-metrics-tls\") pod \"ingress-operator-5b745b69d9-8hngw\" (UID: 
\"330a4463-669f-4f2e-aa4b-614ab0654579\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.422319 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/69bb010a-a13d-4458-8118-80c5aebb6e65-encryption-config\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.422454 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8521b3b0-4fb2-45b2-90b5-7080e766aafa-config\") pod \"controller-manager-879f6c89f-mtrbp\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.422922 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/330a4463-669f-4f2e-aa4b-614ab0654579-trusted-ca\") pod \"ingress-operator-5b745b69d9-8hngw\" (UID: \"330a4463-669f-4f2e-aa4b-614ab0654579\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.424328 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.424478 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.425276 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.425392 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.425730 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a4e363de-fd5c-4f76-8943-ae3c56f3765b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.426033 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.426095 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.426364 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/391ef260-a6ea-4cab-bca3-280435898381-serving-cert\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.426431 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.426557 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/95609887-af52-4179-88e3-7f3730642377-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-46h7s\" (UID: \"95609887-af52-4179-88e3-7f3730642377\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.426655 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/391ef260-a6ea-4cab-bca3-280435898381-encryption-config\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.428432 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/391ef260-a6ea-4cab-bca3-280435898381-config\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.428700 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/47aff328-d38c-426c-8462-12c6b98a82fd-config\") pod \"openshift-apiserver-operator-796bbdcf4f-jl6jn\" (UID: \"47aff328-d38c-426c-8462-12c6b98a82fd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jl6jn" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.428783 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0bf18c8f-c77f-4208-b464-19772b0221f4-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-tdnm6\" (UID: \"0bf18c8f-c77f-4208-b464-19772b0221f4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdnm6" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.429429 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/322233fd-b71f-4ef5-931f-58e98326386a-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-c4gcj\" (UID: \"322233fd-b71f-4ef5-931f-58e98326386a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.429560 4973 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.429852 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/95609887-af52-4179-88e3-7f3730642377-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-46h7s\" (UID: \"95609887-af52-4179-88e3-7f3730642377\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.431296 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/322233fd-b71f-4ef5-931f-58e98326386a-serving-cert\") pod \"authentication-operator-69f744f599-c4gcj\" (UID: \"322233fd-b71f-4ef5-931f-58e98326386a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.433030 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8521b3b0-4fb2-45b2-90b5-7080e766aafa-client-ca\") pod \"controller-manager-879f6c89f-mtrbp\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.433278 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2aab7bd5-eca7-4faa-a4fc-d83be6cc6c3c-metrics-tls\") pod \"dns-operator-744455d44c-5dqnp\" (UID: \"2aab7bd5-eca7-4faa-a4fc-d83be6cc6c3c\") " pod="openshift-dns-operator/dns-operator-744455d44c-5dqnp" Mar 20 13:24:45 crc 
kubenswrapper[4973]: I0320 13:24:45.435984 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-jt5vz"] Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.438107 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dq4gn\" (UniqueName: \"kubernetes.io/projected/95609887-af52-4179-88e3-7f3730642377-kube-api-access-dq4gn\") pod \"cluster-image-registry-operator-dc59b4c8b-46h7s\" (UID: \"95609887-af52-4179-88e3-7f3730642377\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.457952 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpb8n\" (UniqueName: \"kubernetes.io/projected/2aab7bd5-eca7-4faa-a4fc-d83be6cc6c3c-kube-api-access-lpb8n\") pod \"dns-operator-744455d44c-5dqnp\" (UID: \"2aab7bd5-eca7-4faa-a4fc-d83be6cc6c3c\") " pod="openshift-dns-operator/dns-operator-744455d44c-5dqnp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.486702 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2gv9\" (UniqueName: \"kubernetes.io/projected/d16677da-48c3-4fd4-9e59-b0013daa4825-kube-api-access-r2gv9\") pod \"machine-approver-56656f9798-7qt22\" (UID: \"d16677da-48c3-4fd4-9e59-b0013daa4825\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7qt22" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.495042 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lzxb\" (UniqueName: \"kubernetes.io/projected/391ef260-a6ea-4cab-bca3-280435898381-kube-api-access-6lzxb\") pod \"apiserver-76f77b778f-k9jzg\" (UID: \"391ef260-a6ea-4cab-bca3-280435898381\") " pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507169 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507362 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5e47b273-3bc0-4f32-8bdb-aa283db4d8a1-srv-cert\") pod \"olm-operator-6b444d44fb-kz2ff\" (UID: \"5e47b273-3bc0-4f32-8bdb-aa283db4d8a1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507404 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5-profile-collector-cert\") pod \"catalog-operator-68c6474976-4ddcl\" (UID: \"6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507438 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/596b91f0-06b9-4e89-9816-da8049dad9e3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hvdx7\" (UID: \"596b91f0-06b9-4e89-9816-da8049dad9e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507462 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xmgd\" (UniqueName: \"kubernetes.io/projected/4d6d6c1c-878b-47a0-9475-ac3ec45c17b0-kube-api-access-9xmgd\") pod \"kube-storage-version-migrator-operator-b67b599dd-ljb4c\" (UID: \"4d6d6c1c-878b-47a0-9475-ac3ec45c17b0\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljb4c" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507484 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/83d5e1a7-2827-4dfd-9feb-3b9630a62515-etcd-client\") pod \"etcd-operator-b45778765-mhgwt\" (UID: \"83d5e1a7-2827-4dfd-9feb-3b9630a62515\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507506 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2x56\" (UniqueName: \"kubernetes.io/projected/6847acbc-29ec-4939-a6aa-4617b8e438e7-kube-api-access-g2x56\") pod \"ingress-canary-xdmgm\" (UID: \"6847acbc-29ec-4939-a6aa-4617b8e438e7\") " pod="openshift-ingress-canary/ingress-canary-xdmgm" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507532 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pn6xj\" (UniqueName: \"kubernetes.io/projected/ef043a74-5704-48d8-abc6-4a1afef82b9c-kube-api-access-pn6xj\") pod \"machine-config-server-xd95h\" (UID: \"ef043a74-5704-48d8-abc6-4a1afef82b9c\") " pod="openshift-machine-config-operator/machine-config-server-xd95h" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507561 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xn5g\" (UniqueName: \"kubernetes.io/projected/b9d6f338-6a7a-4ba2-b1c9-12485bb30937-kube-api-access-4xn5g\") pod \"collect-profiles-29566875-2dssd\" (UID: \"b9d6f338-6a7a-4ba2-b1c9-12485bb30937\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507586 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qw7n\" (UniqueName: 
\"kubernetes.io/projected/3c73e93c-a062-4361-bc15-95eb55598666-kube-api-access-9qw7n\") pod \"migrator-59844c95c7-zg96g\" (UID: \"3c73e93c-a062-4361-bc15-95eb55598666\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zg96g" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507608 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/17fc0166-9183-4bbd-a091-644b431349e1-csi-data-dir\") pod \"csi-hostpathplugin-mpgxt\" (UID: \"17fc0166-9183-4bbd-a091-644b431349e1\") " pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507631 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhqn4\" (UniqueName: \"kubernetes.io/projected/17fc0166-9183-4bbd-a091-644b431349e1-kube-api-access-nhqn4\") pod \"csi-hostpathplugin-mpgxt\" (UID: \"17fc0166-9183-4bbd-a091-644b431349e1\") " pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507656 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9aeba530-5722-4f41-9082-3f9316f06505-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wgmzv\" (UID: \"9aeba530-5722-4f41-9082-3f9316f06505\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgmzv" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507678 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/05c127a2-f6b5-4d71-8646-e29396ea7971-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bw9k5\" (UID: \"05c127a2-f6b5-4d71-8646-e29396ea7971\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bw9k5" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 
13:24:45.507700 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/596b91f0-06b9-4e89-9816-da8049dad9e3-proxy-tls\") pod \"machine-config-operator-74547568cd-hvdx7\" (UID: \"596b91f0-06b9-4e89-9816-da8049dad9e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507723 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d6d6c1c-878b-47a0-9475-ac3ec45c17b0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ljb4c\" (UID: \"4d6d6c1c-878b-47a0-9475-ac3ec45c17b0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljb4c" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507746 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79nf2\" (UniqueName: \"kubernetes.io/projected/bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4-kube-api-access-79nf2\") pod \"machine-config-controller-84d6567774-w8bf7\" (UID: \"bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8bf7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507770 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a56cf4d4-faa0-469e-b856-d1c030dd19d9-signing-cabundle\") pod \"service-ca-9c57cc56f-8w6wp\" (UID: \"a56cf4d4-faa0-469e-b856-d1c030dd19d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-8w6wp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507801 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e24fab-c452-4121-a617-9d5f02b8ba1b-config\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-pmxvj\" (UID: \"94e24fab-c452-4121-a617-9d5f02b8ba1b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmxvj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507832 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krg4j\" (UniqueName: \"kubernetes.io/projected/fca0553c-03be-49c4-ba2a-dced5bc62586-kube-api-access-krg4j\") pod \"multus-admission-controller-857f4d67dd-26mgc\" (UID: \"fca0553c-03be-49c4-ba2a-dced5bc62586\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-26mgc" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507853 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37071a22-82fc-4b04-bc09-6535395faae6-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-7qwpx\" (UID: \"37071a22-82fc-4b04-bc09-6535395faae6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7qwpx" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507875 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knqbb\" (UniqueName: \"kubernetes.io/projected/05c127a2-f6b5-4d71-8646-e29396ea7971-kube-api-access-knqbb\") pod \"package-server-manager-789f6589d5-bw9k5\" (UID: \"05c127a2-f6b5-4d71-8646-e29396ea7971\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bw9k5" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507896 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0e880f23-4bef-4e96-bf00-c94dc4551c5a-default-certificate\") pod \"router-default-5444994796-wgtgl\" (UID: \"0e880f23-4bef-4e96-bf00-c94dc4551c5a\") " pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 13:24:45 crc kubenswrapper[4973]: 
I0320 13:24:45.507918 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0e880f23-4bef-4e96-bf00-c94dc4551c5a-stats-auth\") pod \"router-default-5444994796-wgtgl\" (UID: \"0e880f23-4bef-4e96-bf00-c94dc4551c5a\") " pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507941 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/17fc0166-9183-4bbd-a091-644b431349e1-mountpoint-dir\") pod \"csi-hostpathplugin-mpgxt\" (UID: \"17fc0166-9183-4bbd-a091-644b431349e1\") " pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507973 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a86206-8505-4c7d-82aa-2b482c0eb08b-config\") pod \"service-ca-operator-777779d784-56g5d\" (UID: \"94a86206-8505-4c7d-82aa-2b482c0eb08b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-56g5d" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.507993 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4-proxy-tls\") pod \"machine-config-controller-84d6567774-w8bf7\" (UID: \"bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8bf7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508015 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/b22a43e3-90de-4609-bf64-006de1716ae3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wkc4c\" (UID: \"b22a43e3-90de-4609-bf64-006de1716ae3\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wkc4c" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508036 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/83d5e1a7-2827-4dfd-9feb-3b9630a62515-etcd-service-ca\") pod \"etcd-operator-b45778765-mhgwt\" (UID: \"83d5e1a7-2827-4dfd-9feb-3b9630a62515\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508059 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fbnq\" (UniqueName: \"kubernetes.io/projected/83d5e1a7-2827-4dfd-9feb-3b9630a62515-kube-api-access-2fbnq\") pod \"etcd-operator-b45778765-mhgwt\" (UID: \"83d5e1a7-2827-4dfd-9feb-3b9630a62515\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508082 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83d5e1a7-2827-4dfd-9feb-3b9630a62515-serving-cert\") pod \"etcd-operator-b45778765-mhgwt\" (UID: \"83d5e1a7-2827-4dfd-9feb-3b9630a62515\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508104 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtkq9\" (UniqueName: \"kubernetes.io/projected/925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e-kube-api-access-vtkq9\") pod \"packageserver-d55dfcdfc-z2rch\" (UID: \"925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508133 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ef043a74-5704-48d8-abc6-4a1afef82b9c-certs\") pod 
\"machine-config-server-xd95h\" (UID: \"ef043a74-5704-48d8-abc6-4a1afef82b9c\") " pod="openshift-machine-config-operator/machine-config-server-xd95h" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508155 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kx9b\" (UniqueName: \"kubernetes.io/projected/5e47b273-3bc0-4f32-8bdb-aa283db4d8a1-kube-api-access-9kx9b\") pod \"olm-operator-6b444d44fb-kz2ff\" (UID: \"5e47b273-3bc0-4f32-8bdb-aa283db4d8a1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508218 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6847acbc-29ec-4939-a6aa-4617b8e438e7-cert\") pod \"ingress-canary-xdmgm\" (UID: \"6847acbc-29ec-4939-a6aa-4617b8e438e7\") " pod="openshift-ingress-canary/ingress-canary-xdmgm" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508239 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/17fc0166-9183-4bbd-a091-644b431349e1-plugins-dir\") pod \"csi-hostpathplugin-mpgxt\" (UID: \"17fc0166-9183-4bbd-a091-644b431349e1\") " pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508262 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a86206-8505-4c7d-82aa-2b482c0eb08b-serving-cert\") pod \"service-ca-operator-777779d784-56g5d\" (UID: \"94a86206-8505-4c7d-82aa-2b482c0eb08b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-56g5d" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508283 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e-apiservice-cert\") pod \"packageserver-d55dfcdfc-z2rch\" (UID: \"925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508314 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/17fc0166-9183-4bbd-a091-644b431349e1-registration-dir\") pod \"csi-hostpathplugin-mpgxt\" (UID: \"17fc0166-9183-4bbd-a091-644b431349e1\") " pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508362 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e880f23-4bef-4e96-bf00-c94dc4551c5a-metrics-certs\") pod \"router-default-5444994796-wgtgl\" (UID: \"0e880f23-4bef-4e96-bf00-c94dc4551c5a\") " pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508388 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt9hg\" (UniqueName: \"kubernetes.io/projected/94a86206-8505-4c7d-82aa-2b482c0eb08b-kube-api-access-lt9hg\") pod \"service-ca-operator-777779d784-56g5d\" (UID: \"94a86206-8505-4c7d-82aa-2b482c0eb08b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-56g5d" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508411 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb200682-a8ee-406a-9c09-d881f40842e7-metrics-tls\") pod \"dns-default-6v2ms\" (UID: \"cb200682-a8ee-406a-9c09-d881f40842e7\") " pod="openshift-dns/dns-default-6v2ms" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508436 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" 
(UniqueName: \"kubernetes.io/configmap/83d5e1a7-2827-4dfd-9feb-3b9630a62515-etcd-ca\") pod \"etcd-operator-b45778765-mhgwt\" (UID: \"83d5e1a7-2827-4dfd-9feb-3b9630a62515\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508457 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5e47b273-3bc0-4f32-8bdb-aa283db4d8a1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-kz2ff\" (UID: \"5e47b273-3bc0-4f32-8bdb-aa283db4d8a1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508479 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b0cbdcce-514f-4b72-8c8c-17029b7217a8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-l7crv\" (UID: \"b0cbdcce-514f-4b72-8c8c-17029b7217a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508498 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5-srv-cert\") pod \"catalog-operator-68c6474976-4ddcl\" (UID: \"6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508518 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb200682-a8ee-406a-9c09-d881f40842e7-config-volume\") pod \"dns-default-6v2ms\" (UID: \"cb200682-a8ee-406a-9c09-d881f40842e7\") " pod="openshift-dns/dns-default-6v2ms" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508547 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/17fc0166-9183-4bbd-a091-644b431349e1-socket-dir\") pod \"csi-hostpathplugin-mpgxt\" (UID: \"17fc0166-9183-4bbd-a091-644b431349e1\") " pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508593 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e-webhook-cert\") pod \"packageserver-d55dfcdfc-z2rch\" (UID: \"925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508614 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e880f23-4bef-4e96-bf00-c94dc4551c5a-service-ca-bundle\") pod \"router-default-5444994796-wgtgl\" (UID: \"0e880f23-4bef-4e96-bf00-c94dc4551c5a\") " pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508638 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e-tmpfs\") pod \"packageserver-d55dfcdfc-z2rch\" (UID: \"925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508659 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0cbdcce-514f-4b72-8c8c-17029b7217a8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-l7crv\" (UID: \"b0cbdcce-514f-4b72-8c8c-17029b7217a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 
13:24:45.508681 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ef043a74-5704-48d8-abc6-4a1afef82b9c-node-bootstrap-token\") pod \"machine-config-server-xd95h\" (UID: \"ef043a74-5704-48d8-abc6-4a1afef82b9c\") " pod="openshift-machine-config-operator/machine-config-server-xd95h" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508705 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9d6f338-6a7a-4ba2-b1c9-12485bb30937-config-volume\") pod \"collect-profiles-29566875-2dssd\" (UID: \"b9d6f338-6a7a-4ba2-b1c9-12485bb30937\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508732 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m92n\" (UniqueName: \"kubernetes.io/projected/cb200682-a8ee-406a-9c09-d881f40842e7-kube-api-access-7m92n\") pod \"dns-default-6v2ms\" (UID: \"cb200682-a8ee-406a-9c09-d881f40842e7\") " pod="openshift-dns/dns-default-6v2ms" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508754 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxnnr\" (UniqueName: \"kubernetes.io/projected/a56cf4d4-faa0-469e-b856-d1c030dd19d9-kube-api-access-mxnnr\") pod \"service-ca-9c57cc56f-8w6wp\" (UID: \"a56cf4d4-faa0-469e-b856-d1c030dd19d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-8w6wp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508776 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j55fz\" (UniqueName: \"kubernetes.io/projected/b22a43e3-90de-4609-bf64-006de1716ae3-kube-api-access-j55fz\") pod \"control-plane-machine-set-operator-78cbb6b69f-wkc4c\" (UID: \"b22a43e3-90de-4609-bf64-006de1716ae3\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wkc4c" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508796 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83d5e1a7-2827-4dfd-9feb-3b9630a62515-config\") pod \"etcd-operator-b45778765-mhgwt\" (UID: \"83d5e1a7-2827-4dfd-9feb-3b9630a62515\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508823 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w8bf7\" (UID: \"bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8bf7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508847 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh8zf\" (UniqueName: \"kubernetes.io/projected/b0cbdcce-514f-4b72-8c8c-17029b7217a8-kube-api-access-vh8zf\") pod \"marketplace-operator-79b997595-l7crv\" (UID: \"b0cbdcce-514f-4b72-8c8c-17029b7217a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508874 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/94e24fab-c452-4121-a617-9d5f02b8ba1b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmxvj\" (UID: \"94e24fab-c452-4121-a617-9d5f02b8ba1b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmxvj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508903 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb5tr\" (UniqueName: 
\"kubernetes.io/projected/596b91f0-06b9-4e89-9816-da8049dad9e3-kube-api-access-tb5tr\") pod \"machine-config-operator-74547568cd-hvdx7\" (UID: \"596b91f0-06b9-4e89-9816-da8049dad9e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508926 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9d6f338-6a7a-4ba2-b1c9-12485bb30937-secret-volume\") pod \"collect-profiles-29566875-2dssd\" (UID: \"b9d6f338-6a7a-4ba2-b1c9-12485bb30937\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508947 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94e24fab-c452-4121-a617-9d5f02b8ba1b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmxvj\" (UID: \"94e24fab-c452-4121-a617-9d5f02b8ba1b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmxvj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.508986 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a56cf4d4-faa0-469e-b856-d1c030dd19d9-signing-key\") pod \"service-ca-9c57cc56f-8w6wp\" (UID: \"a56cf4d4-faa0-469e-b856-d1c030dd19d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-8w6wp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.509009 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9aeba530-5722-4f41-9082-3f9316f06505-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wgmzv\" (UID: \"9aeba530-5722-4f41-9082-3f9316f06505\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgmzv" Mar 20 13:24:45 crc 
kubenswrapper[4973]: I0320 13:24:45.509031 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37071a22-82fc-4b04-bc09-6535395faae6-config\") pod \"kube-controller-manager-operator-78b949d7b-7qwpx\" (UID: \"37071a22-82fc-4b04-bc09-6535395faae6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7qwpx" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.509052 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fca0553c-03be-49c4-ba2a-dced5bc62586-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-26mgc\" (UID: \"fca0553c-03be-49c4-ba2a-dced5bc62586\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-26mgc" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.509075 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37071a22-82fc-4b04-bc09-6535395faae6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7qwpx\" (UID: \"37071a22-82fc-4b04-bc09-6535395faae6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7qwpx" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.509095 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/596b91f0-06b9-4e89-9816-da8049dad9e3-images\") pod \"machine-config-operator-74547568cd-hvdx7\" (UID: \"596b91f0-06b9-4e89-9816-da8049dad9e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.509125 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99dkz\" (UniqueName: 
\"kubernetes.io/projected/0e880f23-4bef-4e96-bf00-c94dc4551c5a-kube-api-access-99dkz\") pod \"router-default-5444994796-wgtgl\" (UID: \"0e880f23-4bef-4e96-bf00-c94dc4551c5a\") " pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.509150 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jct5c\" (UniqueName: \"kubernetes.io/projected/66c28cfe-8a0b-459a-bbab-59053fe226b8-kube-api-access-jct5c\") pod \"auto-csr-approver-29566884-j8dqd\" (UID: \"66c28cfe-8a0b-459a-bbab-59053fe226b8\") " pod="openshift-infra/auto-csr-approver-29566884-j8dqd" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.509174 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aeba530-5722-4f41-9082-3f9316f06505-config\") pod \"kube-apiserver-operator-766d6c64bb-wgmzv\" (UID: \"9aeba530-5722-4f41-9082-3f9316f06505\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgmzv" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.509196 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzvrj\" (UniqueName: \"kubernetes.io/projected/6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5-kube-api-access-mzvrj\") pod \"catalog-operator-68c6474976-4ddcl\" (UID: \"6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.509218 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d6d6c1c-878b-47a0-9475-ac3ec45c17b0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ljb4c\" (UID: \"4d6d6c1c-878b-47a0-9475-ac3ec45c17b0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljb4c" Mar 20 13:24:45 crc 
kubenswrapper[4973]: I0320 13:24:45.509882 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4d6d6c1c-878b-47a0-9475-ac3ec45c17b0-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-ljb4c\" (UID: \"4d6d6c1c-878b-47a0-9475-ac3ec45c17b0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljb4c" Mar 20 13:24:45 crc kubenswrapper[4973]: E0320 13:24:45.509978 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:46.009962337 +0000 UTC m=+206.753632081 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.511005 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/17fc0166-9183-4bbd-a091-644b431349e1-csi-data-dir\") pod \"csi-hostpathplugin-mpgxt\" (UID: \"17fc0166-9183-4bbd-a091-644b431349e1\") " pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.511274 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/17fc0166-9183-4bbd-a091-644b431349e1-registration-dir\") pod \"csi-hostpathplugin-mpgxt\" (UID: \"17fc0166-9183-4bbd-a091-644b431349e1\") " 
pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.511627 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/596b91f0-06b9-4e89-9816-da8049dad9e3-images\") pod \"machine-config-operator-74547568cd-hvdx7\" (UID: \"596b91f0-06b9-4e89-9816-da8049dad9e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.513950 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/37071a22-82fc-4b04-bc09-6535395faae6-config\") pod \"kube-controller-manager-operator-78b949d7b-7qwpx\" (UID: \"37071a22-82fc-4b04-bc09-6535395faae6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7qwpx" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.513708 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9aeba530-5722-4f41-9082-3f9316f06505-config\") pod \"kube-apiserver-operator-766d6c64bb-wgmzv\" (UID: \"9aeba530-5722-4f41-9082-3f9316f06505\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgmzv" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.512730 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/83d5e1a7-2827-4dfd-9feb-3b9630a62515-etcd-service-ca\") pod \"etcd-operator-b45778765-mhgwt\" (UID: \"83d5e1a7-2827-4dfd-9feb-3b9630a62515\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.514245 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/17fc0166-9183-4bbd-a091-644b431349e1-mountpoint-dir\") pod \"csi-hostpathplugin-mpgxt\" 
(UID: \"17fc0166-9183-4bbd-a091-644b431349e1\") " pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.514795 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5-profile-collector-cert\") pod \"catalog-operator-68c6474976-4ddcl\" (UID: \"6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.515474 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/5e47b273-3bc0-4f32-8bdb-aa283db4d8a1-srv-cert\") pod \"olm-operator-6b444d44fb-kz2ff\" (UID: \"5e47b273-3bc0-4f32-8bdb-aa283db4d8a1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.515967 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94a86206-8505-4c7d-82aa-2b482c0eb08b-config\") pod \"service-ca-operator-777779d784-56g5d\" (UID: \"94a86206-8505-4c7d-82aa-2b482c0eb08b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-56g5d" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.516175 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0e880f23-4bef-4e96-bf00-c94dc4551c5a-metrics-certs\") pod \"router-default-5444994796-wgtgl\" (UID: \"0e880f23-4bef-4e96-bf00-c94dc4551c5a\") " pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.517446 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/37071a22-82fc-4b04-bc09-6535395faae6-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-7qwpx\" (UID: \"37071a22-82fc-4b04-bc09-6535395faae6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7qwpx" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.518693 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9aeba530-5722-4f41-9082-3f9316f06505-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-wgmzv\" (UID: \"9aeba530-5722-4f41-9082-3f9316f06505\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgmzv" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.520196 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/0e880f23-4bef-4e96-bf00-c94dc4551c5a-default-certificate\") pod \"router-default-5444994796-wgtgl\" (UID: \"0e880f23-4bef-4e96-bf00-c94dc4551c5a\") " pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.520834 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjt9x\" (UniqueName: \"kubernetes.io/projected/2dce229d-701a-4a70-9c44-5f99d4c6fe79-kube-api-access-gjt9x\") pod \"openshift-config-operator-7777fb866f-2zrvx\" (UID: \"2dce229d-701a-4a70-9c44-5f99d4c6fe79\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.521448 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/a56cf4d4-faa0-469e-b856-d1c030dd19d9-signing-cabundle\") pod \"service-ca-9c57cc56f-8w6wp\" (UID: \"a56cf4d4-faa0-469e-b856-d1c030dd19d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-8w6wp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.522519 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/83d5e1a7-2827-4dfd-9feb-3b9630a62515-etcd-ca\") pod \"etcd-operator-b45778765-mhgwt\" (UID: \"83d5e1a7-2827-4dfd-9feb-3b9630a62515\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.523452 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7qt22" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.523698 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e-apiservice-cert\") pod \"packageserver-d55dfcdfc-z2rch\" (UID: \"925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.523970 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94e24fab-c452-4121-a617-9d5f02b8ba1b-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmxvj\" (UID: \"94e24fab-c452-4121-a617-9d5f02b8ba1b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmxvj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.525468 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cb200682-a8ee-406a-9c09-d881f40842e7-config-volume\") pod \"dns-default-6v2ms\" (UID: \"cb200682-a8ee-406a-9c09-d881f40842e7\") " pod="openshift-dns/dns-default-6v2ms" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.525657 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/17fc0166-9183-4bbd-a091-644b431349e1-socket-dir\") pod \"csi-hostpathplugin-mpgxt\" (UID: \"17fc0166-9183-4bbd-a091-644b431349e1\") " 
pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.526787 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83d5e1a7-2827-4dfd-9feb-3b9630a62515-config\") pod \"etcd-operator-b45778765-mhgwt\" (UID: \"83d5e1a7-2827-4dfd-9feb-3b9630a62515\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.530705 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/17fc0166-9183-4bbd-a091-644b431349e1-plugins-dir\") pod \"csi-hostpathplugin-mpgxt\" (UID: \"17fc0166-9183-4bbd-a091-644b431349e1\") " pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.536311 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e-tmpfs\") pod \"packageserver-d55dfcdfc-z2rch\" (UID: \"925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.536431 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e-webhook-cert\") pod \"packageserver-d55dfcdfc-z2rch\" (UID: \"925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.536561 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/0e880f23-4bef-4e96-bf00-c94dc4551c5a-stats-auth\") pod \"router-default-5444994796-wgtgl\" (UID: \"0e880f23-4bef-4e96-bf00-c94dc4551c5a\") " pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 
13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.536836 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0e880f23-4bef-4e96-bf00-c94dc4551c5a-service-ca-bundle\") pod \"router-default-5444994796-wgtgl\" (UID: \"0e880f23-4bef-4e96-bf00-c94dc4551c5a\") " pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.537011 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/05c127a2-f6b5-4d71-8646-e29396ea7971-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-bw9k5\" (UID: \"05c127a2-f6b5-4d71-8646-e29396ea7971\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bw9k5" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.537260 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4d6d6c1c-878b-47a0-9475-ac3ec45c17b0-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-ljb4c\" (UID: \"4d6d6c1c-878b-47a0-9475-ac3ec45c17b0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljb4c" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.537793 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/ef043a74-5704-48d8-abc6-4a1afef82b9c-node-bootstrap-token\") pod \"machine-config-server-xd95h\" (UID: \"ef043a74-5704-48d8-abc6-4a1afef82b9c\") " pod="openshift-machine-config-operator/machine-config-server-xd95h" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.538230 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/596b91f0-06b9-4e89-9816-da8049dad9e3-proxy-tls\") pod 
\"machine-config-operator-74547568cd-hvdx7\" (UID: \"596b91f0-06b9-4e89-9816-da8049dad9e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.539873 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4e363de-fd5c-4f76-8943-ae3c56f3765b-bound-sa-token\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.539890 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b0cbdcce-514f-4b72-8c8c-17029b7217a8-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-l7crv\" (UID: \"b0cbdcce-514f-4b72-8c8c-17029b7217a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.542303 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94e24fab-c452-4121-a617-9d5f02b8ba1b-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmxvj\" (UID: \"94e24fab-c452-4121-a617-9d5f02b8ba1b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmxvj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.542331 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cb200682-a8ee-406a-9c09-d881f40842e7-metrics-tls\") pod \"dns-default-6v2ms\" (UID: \"cb200682-a8ee-406a-9c09-d881f40842e7\") " pod="openshift-dns/dns-default-6v2ms" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.543107 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/b22a43e3-90de-4609-bf64-006de1716ae3-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-wkc4c\" (UID: \"b22a43e3-90de-4609-bf64-006de1716ae3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wkc4c" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.545616 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/83d5e1a7-2827-4dfd-9feb-3b9630a62515-etcd-client\") pod \"etcd-operator-b45778765-mhgwt\" (UID: \"83d5e1a7-2827-4dfd-9feb-3b9630a62515\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.546007 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/5e47b273-3bc0-4f32-8bdb-aa283db4d8a1-profile-collector-cert\") pod \"olm-operator-6b444d44fb-kz2ff\" (UID: \"5e47b273-3bc0-4f32-8bdb-aa283db4d8a1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.547051 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5-srv-cert\") pod \"catalog-operator-68c6474976-4ddcl\" (UID: \"6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.547607 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83d5e1a7-2827-4dfd-9feb-3b9630a62515-serving-cert\") pod \"etcd-operator-b45778765-mhgwt\" (UID: \"83d5e1a7-2827-4dfd-9feb-3b9630a62515\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.547706 4973 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8"] Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.548283 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6847acbc-29ec-4939-a6aa-4617b8e438e7-cert\") pod \"ingress-canary-xdmgm\" (UID: \"6847acbc-29ec-4939-a6aa-4617b8e438e7\") " pod="openshift-ingress-canary/ingress-canary-xdmgm" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.549508 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0cbdcce-514f-4b72-8c8c-17029b7217a8-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-l7crv\" (UID: \"b0cbdcce-514f-4b72-8c8c-17029b7217a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.550233 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4-proxy-tls\") pod \"machine-config-controller-84d6567774-w8bf7\" (UID: \"bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8bf7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.550599 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-w8bf7\" (UID: \"bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8bf7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.551797 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/a56cf4d4-faa0-469e-b856-d1c030dd19d9-signing-key\") pod 
\"service-ca-9c57cc56f-8w6wp\" (UID: \"a56cf4d4-faa0-469e-b856-d1c030dd19d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-8w6wp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.552241 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/fca0553c-03be-49c4-ba2a-dced5bc62586-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-26mgc\" (UID: \"fca0553c-03be-49c4-ba2a-dced5bc62586\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-26mgc" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.553011 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/ef043a74-5704-48d8-abc6-4a1afef82b9c-certs\") pod \"machine-config-server-xd95h\" (UID: \"ef043a74-5704-48d8-abc6-4a1afef82b9c\") " pod="openshift-machine-config-operator/machine-config-server-xd95h" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.554581 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94a86206-8505-4c7d-82aa-2b482c0eb08b-serving-cert\") pod \"service-ca-operator-777779d784-56g5d\" (UID: \"94a86206-8505-4c7d-82aa-2b482c0eb08b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-56g5d" Mar 20 13:24:45 crc kubenswrapper[4973]: W0320 13:24:45.555109 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda60cb615_f335_45fa_86dd_ddf121e62737.slice/crio-e7834d592b7d5b99d9d3af1aabbecdddc83511528d6113dd87f34737e85acf66 WatchSource:0}: Error finding container e7834d592b7d5b99d9d3af1aabbecdddc83511528d6113dd87f34737e85acf66: Status 404 returned error can't find the container with id e7834d592b7d5b99d9d3af1aabbecdddc83511528d6113dd87f34737e85acf66 Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.556945 4973 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9d6f338-6a7a-4ba2-b1c9-12485bb30937-config-volume\") pod \"collect-profiles-29566875-2dssd\" (UID: \"b9d6f338-6a7a-4ba2-b1c9-12485bb30937\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.557044 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/596b91f0-06b9-4e89-9816-da8049dad9e3-auth-proxy-config\") pod \"machine-config-operator-74547568cd-hvdx7\" (UID: \"596b91f0-06b9-4e89-9816-da8049dad9e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.558500 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9d6f338-6a7a-4ba2-b1c9-12485bb30937-secret-volume\") pod \"collect-profiles-29566875-2dssd\" (UID: \"b9d6f338-6a7a-4ba2-b1c9-12485bb30937\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.558691 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6grs"] Mar 20 13:24:45 crc kubenswrapper[4973]: W0320 13:24:45.560153 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd16677da_48c3_4fd4_9e59_b0013daa4825.slice/crio-0cd83d74bc01c1f115f60f745c00ee6155a816d886cbc87a5d60b2df2fa511f5 WatchSource:0}: Error finding container 0cd83d74bc01c1f115f60f745c00ee6155a816d886cbc87a5d60b2df2fa511f5: Status 404 returned error can't find the container with id 0cd83d74bc01c1f115f60f745c00ee6155a816d886cbc87a5d60b2df2fa511f5 Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.560262 4973 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4dr4d\" (UniqueName: \"kubernetes.io/projected/330a4463-669f-4f2e-aa4b-614ab0654579-kube-api-access-4dr4d\") pod \"ingress-operator-5b745b69d9-8hngw\" (UID: \"330a4463-669f-4f2e-aa4b-614ab0654579\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.578154 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/330a4463-669f-4f2e-aa4b-614ab0654579-bound-sa-token\") pod \"ingress-operator-5b745b69d9-8hngw\" (UID: \"330a4463-669f-4f2e-aa4b-614ab0654579\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.600617 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-97449\" (UniqueName: \"kubernetes.io/projected/47aff328-d38c-426c-8462-12c6b98a82fd-kube-api-access-97449\") pod \"openshift-apiserver-operator-796bbdcf4f-jl6jn\" (UID: \"47aff328-d38c-426c-8462-12c6b98a82fd\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jl6jn" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.610181 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: E0320 13:24:45.610549 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:46.110538091 +0000 UTC m=+206.854207835 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.615550 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx7wc\" (UniqueName: \"kubernetes.io/projected/8521b3b0-4fb2-45b2-90b5-7080e766aafa-kube-api-access-vx7wc\") pod \"controller-manager-879f6c89f-mtrbp\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.617696 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.639385 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/95609887-af52-4179-88e3-7f3730642377-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-46h7s\" (UID: \"95609887-af52-4179-88e3-7f3730642377\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.666501 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz7fv\" (UniqueName: \"kubernetes.io/projected/a4e363de-fd5c-4f76-8943-ae3c56f3765b-kube-api-access-xz7fv\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 
13:24:45.676561 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkmln\" (UniqueName: \"kubernetes.io/projected/69bb010a-a13d-4458-8118-80c5aebb6e65-kube-api-access-zkmln\") pod \"apiserver-7bbb656c7d-f8vm4\" (UID: \"69bb010a-a13d-4458-8118-80c5aebb6e65\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.699899 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lg7ck\" (UniqueName: \"kubernetes.io/projected/5f512921-f02c-464b-af06-d65fb95f0071-kube-api-access-lg7ck\") pod \"console-operator-58897d9998-4bk2w\" (UID: \"5f512921-f02c-464b-af06-d65fb95f0071\") " pod="openshift-console-operator/console-operator-58897d9998-4bk2w" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.712266 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:45 crc kubenswrapper[4973]: E0320 13:24:45.712449 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:46.212423742 +0000 UTC m=+206.956093486 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.712592 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: E0320 13:24:45.712906 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:46.212899395 +0000 UTC m=+206.956569139 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.719490 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr5bs\" (UniqueName: \"kubernetes.io/projected/b1fe291e-3490-49c0-9443-e5b0f03db19c-kube-api-access-jr5bs\") pod \"oauth-openshift-558db77b4-tl6hw\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") " pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.731213 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.740466 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxv7v\" (UniqueName: \"kubernetes.io/projected/322233fd-b71f-4ef5-931f-58e98326386a-kube-api-access-xxv7v\") pod \"authentication-operator-69f744f599-c4gcj\" (UID: \"322233fd-b71f-4ef5-931f-58e98326386a\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.750744 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-5dqnp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.756381 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5dlv\" (UniqueName: \"kubernetes.io/projected/0bf18c8f-c77f-4208-b464-19772b0221f4-kube-api-access-l5dlv\") pod \"openshift-controller-manager-operator-756b6f6bc6-tdnm6\" (UID: \"0bf18c8f-c77f-4208-b464-19772b0221f4\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdnm6" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.763023 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.776704 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx"] Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.783569 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:45 crc kubenswrapper[4973]: W0320 13:24:45.784083 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2dce229d_701a_4a70_9c44_5f99d4c6fe79.slice/crio-61f91f43411e85042c1aaa57e9dfe17cc067690c516a74b27b2d174fbebd36ee WatchSource:0}: Error finding container 61f91f43411e85042c1aaa57e9dfe17cc067690c516a74b27b2d174fbebd36ee: Status 404 returned error can't find the container with id 61f91f43411e85042c1aaa57e9dfe17cc067690c516a74b27b2d174fbebd36ee Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.799697 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krg4j\" (UniqueName: \"kubernetes.io/projected/fca0553c-03be-49c4-ba2a-dced5bc62586-kube-api-access-krg4j\") pod \"multus-admission-controller-857f4d67dd-26mgc\" (UID: \"fca0553c-03be-49c4-ba2a-dced5bc62586\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-26mgc" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.812574 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jl6jn" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.814459 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:45 crc kubenswrapper[4973]: E0320 13:24:45.814947 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:24:46.314933281 +0000 UTC m=+207.058603015 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.824519 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxnnr\" (UniqueName: \"kubernetes.io/projected/a56cf4d4-faa0-469e-b856-d1c030dd19d9-kube-api-access-mxnnr\") pod \"service-ca-9c57cc56f-8w6wp\" (UID: \"a56cf4d4-faa0-469e-b856-d1c030dd19d9\") " pod="openshift-service-ca/service-ca-9c57cc56f-8w6wp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.837817 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.856220 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdnm6" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.859048 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99dkz\" (UniqueName: \"kubernetes.io/projected/0e880f23-4bef-4e96-bf00-c94dc4551c5a-kube-api-access-99dkz\") pod \"router-default-5444994796-wgtgl\" (UID: \"0e880f23-4bef-4e96-bf00-c94dc4551c5a\") " pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.859840 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9aeba530-5722-4f41-9082-3f9316f06505-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-wgmzv\" (UID: \"9aeba530-5722-4f41-9082-3f9316f06505\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgmzv" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.879156 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jct5c\" (UniqueName: \"kubernetes.io/projected/66c28cfe-8a0b-459a-bbab-59053fe226b8-kube-api-access-jct5c\") pod \"auto-csr-approver-29566884-j8dqd\" (UID: \"66c28cfe-8a0b-459a-bbab-59053fe226b8\") " pod="openshift-infra/auto-csr-approver-29566884-j8dqd" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.879754 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.892489 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.898660 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-26mgc" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.899590 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhqn4\" (UniqueName: \"kubernetes.io/projected/17fc0166-9183-4bbd-a091-644b431349e1-kube-api-access-nhqn4\") pod \"csi-hostpathplugin-mpgxt\" (UID: \"17fc0166-9183-4bbd-a091-644b431349e1\") " pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.915554 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:45 crc kubenswrapper[4973]: E0320 13:24:45.916019 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:46.416007229 +0000 UTC m=+207.159676973 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.918774 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt9hg\" (UniqueName: \"kubernetes.io/projected/94a86206-8505-4c7d-82aa-2b482c0eb08b-kube-api-access-lt9hg\") pod \"service-ca-operator-777779d784-56g5d\" (UID: \"94a86206-8505-4c7d-82aa-2b482c0eb08b\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-56g5d" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.925001 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-8w6wp" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.930291 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-k7krj" event={"ID":"de8d912e-7616-42ee-a688-b43d5b85dc44","Type":"ContainerStarted","Data":"9177ef8fd6fdd2756e298bcec7d63f4104099f4e0bb820eda29fc40835fc0d91"} Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.930353 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-k7krj" event={"ID":"de8d912e-7616-42ee-a688-b43d5b85dc44","Type":"ContainerStarted","Data":"6909a8e8efa8aea6b3c5239315309179c8a21ed6be0e5c3ed0c16fa23a7e9cd3"} Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.931220 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-56g5d" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.951630 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.960544 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" event={"ID":"a60cb615-f335-45fa-86dd-ddf121e62737","Type":"ContainerStarted","Data":"5b4441d53cf53f861345b1c4b9573b80e9b281d00ad7ac3728c5446304237526"} Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.960583 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" event={"ID":"a60cb615-f335-45fa-86dd-ddf121e62737","Type":"ContainerStarted","Data":"e7834d592b7d5b99d9d3af1aabbecdddc83511528d6113dd87f34737e85acf66"} Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.961663 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.962266 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.969857 4973 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-b7jm8 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.969904 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" podUID="a60cb615-f335-45fa-86dd-ddf121e62737" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.13:8443/healthz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 20 13:24:45 crc kubenswrapper[4973]: I0320 13:24:45.970198 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx" event={"ID":"2dce229d-701a-4a70-9c44-5f99d4c6fe79","Type":"ContainerStarted","Data":"61f91f43411e85042c1aaa57e9dfe17cc067690c516a74b27b2d174fbebd36ee"} Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:45.991135 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79nf2\" (UniqueName: \"kubernetes.io/projected/bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4-kube-api-access-79nf2\") pod \"machine-config-controller-84d6567774-w8bf7\" (UID: \"bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8bf7" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:45.995639 4973 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/37071a22-82fc-4b04-bc09-6535395faae6-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-7qwpx\" (UID: \"37071a22-82fc-4b04-bc09-6535395faae6\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7qwpx" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:45.999472 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xmgd\" (UniqueName: \"kubernetes.io/projected/4d6d6c1c-878b-47a0-9475-ac3ec45c17b0-kube-api-access-9xmgd\") pod \"kube-storage-version-migrator-operator-b67b599dd-ljb4c\" (UID: \"4d6d6c1c-878b-47a0-9475-ac3ec45c17b0\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljb4c" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:45.999768 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.006896 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knqbb\" (UniqueName: \"kubernetes.io/projected/05c127a2-f6b5-4d71-8646-e29396ea7971-kube-api-access-knqbb\") pod \"package-server-manager-789f6589d5-bw9k5\" (UID: \"05c127a2-f6b5-4d71-8646-e29396ea7971\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bw9k5" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.017628 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:46 crc kubenswrapper[4973]: E0320 13:24:46.019557 4973 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:46.519537608 +0000 UTC m=+207.263207362 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.027016 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jt5vz" event={"ID":"0b6717fd-e636-4be6-8c04-c9b46924a3b2","Type":"ContainerStarted","Data":"9bf7de86fd0200ccb3669489551c3c58022ff29fb86cd0092998d24e29f76e33"} Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.027059 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jt5vz" event={"ID":"0b6717fd-e636-4be6-8c04-c9b46924a3b2","Type":"ContainerStarted","Data":"9f5e64e3a7701c388bd6f79854931a469f2f6175b2ff1aa629229427a7fc845d"} Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.027073 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-jt5vz" event={"ID":"0b6717fd-e636-4be6-8c04-c9b46924a3b2","Type":"ContainerStarted","Data":"67fd042ff022e5397fe1a6a78af0a8c0e7afc4de5c04270162ca475b43262a3e"} Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.037013 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xt4hf" 
event={"ID":"6e84a900-8f09-438e-a365-60d6b9fc835b","Type":"ContainerStarted","Data":"146d352fb4ee6f9fe516a03e7eb1e98f7badcce6140e8b0be93d55c21029bd7d"} Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.037054 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xt4hf" event={"ID":"6e84a900-8f09-438e-a365-60d6b9fc835b","Type":"ContainerStarted","Data":"95d08072df344b7455563a5001fffefc2ef9178b8ac7390234816bdf0c5ceac5"} Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.037900 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-xt4hf" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.038882 4973 patch_prober.go:28] interesting pod/downloads-7954f5f757-xt4hf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.038927 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xt4hf" podUID="6e84a900-8f09-438e-a365-60d6b9fc835b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.039620 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.042551 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh8zf\" (UniqueName: \"kubernetes.io/projected/b0cbdcce-514f-4b72-8c8c-17029b7217a8-kube-api-access-vh8zf\") pod \"marketplace-operator-79b997595-l7crv\" (UID: \"b0cbdcce-514f-4b72-8c8c-17029b7217a8\") " pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.048313 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6grs" event={"ID":"1337d1ae-1f3f-4d65-aa7d-1e1f97e26e90","Type":"ContainerStarted","Data":"3ecf27bb53e82297cc2fe23d9f528c299a4b2b5e858bd51a3b3702a68e31076d"} Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.048404 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6grs" event={"ID":"1337d1ae-1f3f-4d65-aa7d-1e1f97e26e90","Type":"ContainerStarted","Data":"0e373afc33188dac3df9f5550c7e79743d38a5db72aa66b59c9f608859e7e035"} Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.057244 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzvrj\" (UniqueName: \"kubernetes.io/projected/6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5-kube-api-access-mzvrj\") pod \"catalog-operator-68c6474976-4ddcl\" (UID: \"6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.063406 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7qt22" 
event={"ID":"d16677da-48c3-4fd4-9e59-b0013daa4825","Type":"ContainerStarted","Data":"51cffe4fc3c7ea663f1e7673a5c7d5f5e2c9028361edb4105400d69f9fbc4777"} Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.063445 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7qt22" event={"ID":"d16677da-48c3-4fd4-9e59-b0013daa4825","Type":"ContainerStarted","Data":"0cd83d74bc01c1f115f60f745c00ee6155a816d886cbc87a5d60b2df2fa511f5"} Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.076270 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/94e24fab-c452-4121-a617-9d5f02b8ba1b-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-pmxvj\" (UID: \"94e24fab-c452-4121-a617-9d5f02b8ba1b\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmxvj" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.081468 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.090141 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2x56\" (UniqueName: \"kubernetes.io/projected/6847acbc-29ec-4939-a6aa-4617b8e438e7-kube-api-access-g2x56\") pod \"ingress-canary-xdmgm\" (UID: \"6847acbc-29ec-4939-a6aa-4617b8e438e7\") " pod="openshift-ingress-canary/ingress-canary-xdmgm" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.090156 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7qwpx" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.098271 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljb4c" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.105968 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgmzv" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.119951 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:46 crc kubenswrapper[4973]: E0320 13:24:46.120937 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:46.620920954 +0000 UTC m=+207.364590698 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.121706 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kx9b\" (UniqueName: \"kubernetes.io/projected/5e47b273-3bc0-4f32-8bdb-aa283db4d8a1-kube-api-access-9kx9b\") pod \"olm-operator-6b444d44fb-kz2ff\" (UID: \"5e47b273-3bc0-4f32-8bdb-aa283db4d8a1\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.128742 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j55fz\" (UniqueName: \"kubernetes.io/projected/b22a43e3-90de-4609-bf64-006de1716ae3-kube-api-access-j55fz\") pod \"control-plane-machine-set-operator-78cbb6b69f-wkc4c\" (UID: \"b22a43e3-90de-4609-bf64-006de1716ae3\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wkc4c" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.131953 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmxvj" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.144562 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m92n\" (UniqueName: \"kubernetes.io/projected/cb200682-a8ee-406a-9c09-d881f40842e7-kube-api-access-7m92n\") pod \"dns-default-6v2ms\" (UID: \"cb200682-a8ee-406a-9c09-d881f40842e7\") " pod="openshift-dns/dns-default-6v2ms" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.146940 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.161241 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtkq9\" (UniqueName: \"kubernetes.io/projected/925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e-kube-api-access-vtkq9\") pod \"packageserver-d55dfcdfc-z2rch\" (UID: \"925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.177221 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566884-j8dqd" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.180874 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fbnq\" (UniqueName: \"kubernetes.io/projected/83d5e1a7-2827-4dfd-9feb-3b9630a62515-kube-api-access-2fbnq\") pod \"etcd-operator-b45778765-mhgwt\" (UID: \"83d5e1a7-2827-4dfd-9feb-3b9630a62515\") " pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.191299 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.205091 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xn5g\" (UniqueName: \"kubernetes.io/projected/b9d6f338-6a7a-4ba2-b1c9-12485bb30937-kube-api-access-4xn5g\") pod \"collect-profiles-29566875-2dssd\" (UID: \"b9d6f338-6a7a-4ba2-b1c9-12485bb30937\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.207161 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8bf7" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.220532 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.221327 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:46 crc kubenswrapper[4973]: E0320 13:24:46.221489 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:46.721474017 +0000 UTC m=+207.465143761 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:46 crc kubenswrapper[4973]: E0320 13:24:46.222215 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:46.722207769 +0000 UTC m=+207.465877513 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.221931 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.225012 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-5dqnp"] Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.242610 4973 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bw9k5" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.269446 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pn6xj\" (UniqueName: \"kubernetes.io/projected/ef043a74-5704-48d8-abc6-4a1afef82b9c-kube-api-access-pn6xj\") pod \"machine-config-server-xd95h\" (UID: \"ef043a74-5704-48d8-abc6-4a1afef82b9c\") " pod="openshift-machine-config-operator/machine-config-server-xd95h" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.269698 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xdmgm" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.269710 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb5tr\" (UniqueName: \"kubernetes.io/projected/596b91f0-06b9-4e89-9816-da8049dad9e3-kube-api-access-tb5tr\") pod \"machine-config-operator-74547568cd-hvdx7\" (UID: \"596b91f0-06b9-4e89-9816-da8049dad9e3\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.277965 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qw7n\" (UniqueName: \"kubernetes.io/projected/3c73e93c-a062-4361-bc15-95eb55598666-kube-api-access-9qw7n\") pod \"migrator-59844c95c7-zg96g\" (UID: \"3c73e93c-a062-4361-bc15-95eb55598666\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zg96g" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.287084 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-6v2ms" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.305626 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4"] Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.323977 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:46 crc kubenswrapper[4973]: E0320 13:24:46.324303 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:46.824287895 +0000 UTC m=+207.567957639 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.366657 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw"] Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.367380 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jl6jn"] Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.369324 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.375201 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.405079 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdnm6"] Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.412032 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wkc4c" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.421381 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zg96g" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.425864 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:46 crc kubenswrapper[4973]: E0320 13:24:46.426184 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:46.926169126 +0000 UTC m=+207.669838870 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.438493 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7" Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.472857 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-k9jzg"] Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.502110 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s"] Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.526462 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:46 crc kubenswrapper[4973]: E0320 13:24:46.526693 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:47.026670017 +0000 UTC m=+207.770339761 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.526992 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:46 crc kubenswrapper[4973]: E0320 13:24:46.527270 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:47.027258995 +0000 UTC m=+207.770928739 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.531848 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mtrbp"] Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.547909 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-xd95h" Mar 20 13:24:46 crc kubenswrapper[4973]: W0320 13:24:46.581002 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bf18c8f_c77f_4208_b464_19772b0221f4.slice/crio-f4fc35632b11ee99a6d8ae19eea7cf59de39a1245650871793ee5d9a75c65c18 WatchSource:0}: Error finding container f4fc35632b11ee99a6d8ae19eea7cf59de39a1245650871793ee5d9a75c65c18: Status 404 returned error can't find the container with id f4fc35632b11ee99a6d8ae19eea7cf59de39a1245650871793ee5d9a75c65c18 Mar 20 13:24:46 crc kubenswrapper[4973]: W0320 13:24:46.581281 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod330a4463_669f_4f2e_aa4b_614ab0654579.slice/crio-b51d0971c7d631af7cfa01035c23066985561bd4db33b0856a893df065300145 WatchSource:0}: Error finding container b51d0971c7d631af7cfa01035c23066985561bd4db33b0856a893df065300145: Status 404 returned error can't find the container with id b51d0971c7d631af7cfa01035c23066985561bd4db33b0856a893df065300145 Mar 20 13:24:46 crc kubenswrapper[4973]: W0320 13:24:46.581525 4973 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47aff328_d38c_426c_8462_12c6b98a82fd.slice/crio-b268c1a2071ae395510af2e4af60c8681ff4a5e8009c30b5b1dc00a897c99fd5 WatchSource:0}: Error finding container b268c1a2071ae395510af2e4af60c8681ff4a5e8009c30b5b1dc00a897c99fd5: Status 404 returned error can't find the container with id b268c1a2071ae395510af2e4af60c8681ff4a5e8009c30b5b1dc00a897c99fd5 Mar 20 13:24:46 crc kubenswrapper[4973]: W0320 13:24:46.606921 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef043a74_5704_48d8_abc6_4a1afef82b9c.slice/crio-9c80f8d3c2191ee2753c0d905a9a77fd0baaf65fb8cca0c3ba77634cfc565eb6 WatchSource:0}: Error finding container 9c80f8d3c2191ee2753c0d905a9a77fd0baaf65fb8cca0c3ba77634cfc565eb6: Status 404 returned error can't find the container with id 9c80f8d3c2191ee2753c0d905a9a77fd0baaf65fb8cca0c3ba77634cfc565eb6 Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.627387 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:46 crc kubenswrapper[4973]: E0320 13:24:46.627707 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:47.127693863 +0000 UTC m=+207.871363607 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.698289 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-8w6wp"] Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.729252 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:46 crc kubenswrapper[4973]: E0320 13:24:46.729597 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:47.229583985 +0000 UTC m=+207.973253729 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:46 crc kubenswrapper[4973]: W0320 13:24:46.797016 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda56cf4d4_faa0_469e_b856_d1c030dd19d9.slice/crio-6cbc1b41da2671a5f4521bde5e826f8db2d6b6e53ea3bab3fc48d3af842e587a WatchSource:0}: Error finding container 6cbc1b41da2671a5f4521bde5e826f8db2d6b6e53ea3bab3fc48d3af842e587a: Status 404 returned error can't find the container with id 6cbc1b41da2671a5f4521bde5e826f8db2d6b6e53ea3bab3fc48d3af842e587a Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.829731 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:46 crc kubenswrapper[4973]: E0320 13:24:46.829932 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:47.329912622 +0000 UTC m=+208.073582366 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.830058 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:46 crc kubenswrapper[4973]: E0320 13:24:46.830322 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:47.330312343 +0000 UTC m=+208.073982087 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.931161 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:46 crc kubenswrapper[4973]: E0320 13:24:46.931569 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:47.431535465 +0000 UTC m=+208.175205209 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:46 crc kubenswrapper[4973]: I0320 13:24:46.931743 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:46 crc kubenswrapper[4973]: E0320 13:24:46.932161 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:47.432148373 +0000 UTC m=+208.175818117 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.033033 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:47 crc kubenswrapper[4973]: E0320 13:24:47.033145 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:47.533119838 +0000 UTC m=+208.276789582 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.033579 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:47 crc kubenswrapper[4973]: E0320 13:24:47.033911 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:47.533892159 +0000 UTC m=+208.277561903 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.091158 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" event={"ID":"391ef260-a6ea-4cab-bca3-280435898381","Type":"ContainerStarted","Data":"087d8505f69a3113d305ef14c88f8c5a3742b8615ba5541e2c4c0d636a718a23"} Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.096972 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8w6wp" event={"ID":"a56cf4d4-faa0-469e-b856-d1c030dd19d9","Type":"ContainerStarted","Data":"6cbc1b41da2671a5f4521bde5e826f8db2d6b6e53ea3bab3fc48d3af842e587a"} Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.098361 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" event={"ID":"8521b3b0-4fb2-45b2-90b5-7080e766aafa","Type":"ContainerStarted","Data":"ba8a643536ada88fe35ad82fa1e371d009cde21292ce1a68f1e4a8f87a22a6a8"} Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.101433 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6grs" event={"ID":"1337d1ae-1f3f-4d65-aa7d-1e1f97e26e90","Type":"ContainerStarted","Data":"4f26dfdc411e066cd6e149926d5ed5290d35e1351738a807f330f7eb181b8900"} Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.103521 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw" 
event={"ID":"330a4463-669f-4f2e-aa4b-614ab0654579","Type":"ContainerStarted","Data":"b51d0971c7d631af7cfa01035c23066985561bd4db33b0856a893df065300145"} Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.105115 4973 generic.go:334] "Generic (PLEG): container finished" podID="2dce229d-701a-4a70-9c44-5f99d4c6fe79" containerID="b5d624361281ad1055f5bd80ef7f1c28edbe0703b5d0223b12b3c91dfe98bf04" exitCode=0 Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.105179 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx" event={"ID":"2dce229d-701a-4a70-9c44-5f99d4c6fe79","Type":"ContainerDied","Data":"b5d624361281ad1055f5bd80ef7f1c28edbe0703b5d0223b12b3c91dfe98bf04"} Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.111976 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5dqnp" event={"ID":"2aab7bd5-eca7-4faa-a4fc-d83be6cc6c3c","Type":"ContainerStarted","Data":"3159ae3fd9622e829222072fafcbaef6bd9a831d2f1acd252acfaa7548466a9b"} Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.112021 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5dqnp" event={"ID":"2aab7bd5-eca7-4faa-a4fc-d83be6cc6c3c","Type":"ContainerStarted","Data":"60110a60a5d713ed04b31aef56a1a92015edab633e9f1efc0df71aa83798cf51"} Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.116787 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-xd95h" event={"ID":"ef043a74-5704-48d8-abc6-4a1afef82b9c","Type":"ContainerStarted","Data":"9c80f8d3c2191ee2753c0d905a9a77fd0baaf65fb8cca0c3ba77634cfc565eb6"} Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.118881 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" 
event={"ID":"69bb010a-a13d-4458-8118-80c5aebb6e65","Type":"ContainerStarted","Data":"7ae0254fa7428197ccd0057afd3b5f91b4e2cf331ab84ba9017bedbb83c09b64"} Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.120276 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s" event={"ID":"95609887-af52-4179-88e3-7f3730642377","Type":"ContainerStarted","Data":"2785d144687899077aff727649fcafa7d4ca03f0d5259e6737b251117699650c"} Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.122268 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jl6jn" event={"ID":"47aff328-d38c-426c-8462-12c6b98a82fd","Type":"ContainerStarted","Data":"b268c1a2071ae395510af2e4af60c8681ff4a5e8009c30b5b1dc00a897c99fd5"} Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.128542 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wgtgl" event={"ID":"0e880f23-4bef-4e96-bf00-c94dc4551c5a","Type":"ContainerStarted","Data":"b824383e01c8df94aaa5809814495fba866ff51ce71903d2604814a774acd055"} Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.128580 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wgtgl" event={"ID":"0e880f23-4bef-4e96-bf00-c94dc4551c5a","Type":"ContainerStarted","Data":"1e3505d31f02b313b2dbf235b1883e586d604d52f11e075c6adb36c11cf77d1b"} Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.133786 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdnm6" event={"ID":"0bf18c8f-c77f-4208-b464-19772b0221f4","Type":"ContainerStarted","Data":"f4fc35632b11ee99a6d8ae19eea7cf59de39a1245650871793ee5d9a75c65c18"} Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.134231 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:47 crc kubenswrapper[4973]: E0320 13:24:47.134351 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:47.634312009 +0000 UTC m=+208.377981753 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.134458 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:47 crc kubenswrapper[4973]: E0320 13:24:47.134769 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:47.634758851 +0000 UTC m=+208.378428595 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.137523 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7qt22" event={"ID":"d16677da-48c3-4fd4-9e59-b0013daa4825","Type":"ContainerStarted","Data":"91ce49782df2fb3e6cdd72fa30a607ab670a32835284ef9d5e5413e06f644376"} Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.138177 4973 patch_prober.go:28] interesting pod/downloads-7954f5f757-xt4hf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.138250 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xt4hf" podUID="6e84a900-8f09-438e-a365-60d6b9fc835b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.151152 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.236839 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:47 crc kubenswrapper[4973]: E0320 13:24:47.239895 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:47.739869156 +0000 UTC m=+208.483539001 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.296971 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljb4c"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.341680 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:47 crc kubenswrapper[4973]: E0320 13:24:47.342316 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:24:47.842294593 +0000 UTC m=+208.585964337 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.416056 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-56g5d"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.439994 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tl6hw"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.442959 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:47 crc kubenswrapper[4973]: E0320 13:24:47.443322 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:47.943307449 +0000 UTC m=+208.686977193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.458687 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-4bk2w"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.492524 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-mpgxt"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.499332 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7qwpx"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.501718 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-c4gcj"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.522779 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-26mgc"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.545457 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:47 crc kubenswrapper[4973]: E0320 13:24:47.546109 4973 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:48.046091206 +0000 UTC m=+208.789760950 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.559925 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zg96g"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.567205 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.574731 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmxvj"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.587029 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.594851 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-mhgwt"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.613833 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgmzv"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.616218 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.647262 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:47 crc kubenswrapper[4973]: E0320 13:24:47.648184 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:48.148146692 +0000 UTC m=+208.891816436 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.721295 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xdmgm"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.726965 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-w8bf7"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.730696 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.732532 4973 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.735615 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566884-j8dqd"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.741673 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l7crv"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.749528 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:47 crc kubenswrapper[4973]: E0320 13:24:47.749871 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:48.249856129 +0000 UTC m=+208.993525873 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.757245 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wkc4c"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.759263 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-6v2ms"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.764858 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-wgtgl" podStartSLOduration=162.7648335 podStartE2EDuration="2m42.7648335s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:47.731251681 +0000 UTC m=+208.474921425" watchObservedRunningTime="2026-03-20 13:24:47.7648335 +0000 UTC m=+208.508503244" Mar 20 13:24:47 crc kubenswrapper[4973]: W0320 13:24:47.771014 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0cbdcce_514f_4b72_8c8c_17029b7217a8.slice/crio-1c9e41c5ed9170e422bc0a5dcef516c793a78b237e475c0c2f24ca2222b43c6b WatchSource:0}: Error finding container 1c9e41c5ed9170e422bc0a5dcef516c793a78b237e475c0c2f24ca2222b43c6b: Status 404 returned error can't find the container with id 1c9e41c5ed9170e422bc0a5dcef516c793a78b237e475c0c2f24ca2222b43c6b Mar 20 13:24:47 crc kubenswrapper[4973]: W0320 13:24:47.773998 4973 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9d6f338_6a7a_4ba2_b1c9_12485bb30937.slice/crio-f3e181a66cf44a5811e4649214668a9610bf0274da40991eba44a30f763d51cb WatchSource:0}: Error finding container f3e181a66cf44a5811e4649214668a9610bf0274da40991eba44a30f763d51cb: Status 404 returned error can't find the container with id f3e181a66cf44a5811e4649214668a9610bf0274da40991eba44a30f763d51cb Mar 20 13:24:47 crc kubenswrapper[4973]: W0320 13:24:47.784635 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod66c28cfe_8a0b_459a_bbab_59053fe226b8.slice/crio-76f4625175a4f1545ae789991f286fbe98ad6c9eeb5735a8c89adb5d7531e8c7 WatchSource:0}: Error finding container 76f4625175a4f1545ae789991f286fbe98ad6c9eeb5735a8c89adb5d7531e8c7: Status 404 returned error can't find the container with id 76f4625175a4f1545ae789991f286fbe98ad6c9eeb5735a8c89adb5d7531e8c7 Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.786529 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bw9k5"] Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.795307 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-h6grs" podStartSLOduration=163.7952894 podStartE2EDuration="2m43.7952894s" podCreationTimestamp="2026-03-20 13:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:47.795031582 +0000 UTC m=+208.538701356" watchObservedRunningTime="2026-03-20 13:24:47.7952894 +0000 UTC m=+208.538959154" Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.812826 4973 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:24:47 
crc kubenswrapper[4973]: I0320 13:24:47.841506 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-jt5vz" podStartSLOduration=162.841487253 podStartE2EDuration="2m42.841487253s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:47.840793264 +0000 UTC m=+208.584462998" watchObservedRunningTime="2026-03-20 13:24:47.841487253 +0000 UTC m=+208.585156997" Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.850165 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:47 crc kubenswrapper[4973]: E0320 13:24:47.850426 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:48.350399071 +0000 UTC m=+209.094068815 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.850514 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:47 crc kubenswrapper[4973]: E0320 13:24:47.851845 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:48.351822852 +0000 UTC m=+209.095492586 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.894555 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-wgtgl"
Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.899104 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 13:24:47 crc kubenswrapper[4973]: [-]has-synced failed: reason withheld
Mar 20 13:24:47 crc kubenswrapper[4973]: [+]process-running ok
Mar 20 13:24:47 crc kubenswrapper[4973]: healthz check failed
Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.899355 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.926595 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-xt4hf" podStartSLOduration=163.92657466 podStartE2EDuration="2m43.92657466s" podCreationTimestamp="2026-03-20 13:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:47.877769881 +0000 UTC m=+208.621439625" watchObservedRunningTime="2026-03-20 13:24:47.92657466 +0000 UTC m=+208.670244404"
Mar 20 13:24:47 crc kubenswrapper[4973]: I0320 13:24:47.952150 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:24:47 crc kubenswrapper[4973]: E0320 13:24:47.952500 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:48.452485047 +0000 UTC m=+209.196154791 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.044679 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-k7krj" podStartSLOduration=164.044663469 podStartE2EDuration="2m44.044663469s" podCreationTimestamp="2026-03-20 13:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:48.040443357 +0000 UTC m=+208.784113111" watchObservedRunningTime="2026-03-20 13:24:48.044663469 +0000 UTC m=+208.788333213"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.052505 4973 ???:1] "http: TLS handshake error from 192.168.126.11:37368: no serving certificate available for the kubelet"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.053419 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7"
Mar 20 13:24:48 crc kubenswrapper[4973]: E0320 13:24:48.053695 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:48.55368459 +0000 UTC m=+209.297354334 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.100895 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" podStartSLOduration=163.100882061 podStartE2EDuration="2m43.100882061s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:48.099687958 +0000 UTC m=+208.843357702" watchObservedRunningTime="2026-03-20 13:24:48.100882061 +0000 UTC m=+208.844551805"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.137280 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-7qt22" podStartSLOduration=164.137265122 podStartE2EDuration="2m44.137265122s" podCreationTimestamp="2026-03-20 13:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:48.136289844 +0000 UTC m=+208.879959588" watchObservedRunningTime="2026-03-20 13:24:48.137265122 +0000 UTC m=+208.880934866"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.155765 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:24:48 crc kubenswrapper[4973]: E0320 13:24:48.156021 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:48.655991043 +0000 UTC m=+209.399660787 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.156148 4973 ???:1] "http: TLS handshake error from 192.168.126.11:37378: no serving certificate available for the kubelet"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.156290 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7"
Mar 20 13:24:48 crc kubenswrapper[4973]: E0320 13:24:48.156626 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:48.6566133 +0000 UTC m=+209.400283044 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.200833 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7qwpx" event={"ID":"37071a22-82fc-4b04-bc09-6535395faae6","Type":"ContainerStarted","Data":"30454918e7e8305c2b8219838be5c0c737286d8fc622e75908b1330cce0fadbd"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.202772 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zg96g" event={"ID":"3c73e93c-a062-4361-bc15-95eb55598666","Type":"ContainerStarted","Data":"52d537aff2dd3aa373b9cacea4dd621b3ca09d99e2a696797528749af818085b"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.207627 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566884-j8dqd" event={"ID":"66c28cfe-8a0b-459a-bbab-59053fe226b8","Type":"ContainerStarted","Data":"76f4625175a4f1545ae789991f286fbe98ad6c9eeb5735a8c89adb5d7531e8c7"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.209551 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7" event={"ID":"596b91f0-06b9-4e89-9816-da8049dad9e3","Type":"ContainerStarted","Data":"31e1d8c76617d187cb45352026b3f80776c230c7327ca119409e10a8bc862bf3"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.209610 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7" event={"ID":"596b91f0-06b9-4e89-9816-da8049dad9e3","Type":"ContainerStarted","Data":"7776e7181a994e4b0ddd17699c7ac21b3f47b0ec05138dc83238a5d6651e0813"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.227662 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" event={"ID":"8521b3b0-4fb2-45b2-90b5-7080e766aafa","Type":"ContainerStarted","Data":"286fe45ab59e91258bf977cd12fd7bb5bb7a5aa840b5a7073a1d20bd5620ba71"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.228654 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.229734 4973 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-mtrbp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body=
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.229767 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" podUID="8521b3b0-4fb2-45b2-90b5-7080e766aafa" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.231837 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" event={"ID":"925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e","Type":"ContainerStarted","Data":"85410914e1c5d7a77879bb3c9e0816e53dbf22473f47f5dc599fb2d145541ace"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.234974 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx" event={"ID":"2dce229d-701a-4a70-9c44-5f99d4c6fe79","Type":"ContainerStarted","Data":"d0cde68c7556dd21c8e6f4f9a3af2739c4053e7a95dab10778ca7a0161a2a42c"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.235059 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.240757 4973 ???:1] "http: TLS handshake error from 192.168.126.11:37388: no serving certificate available for the kubelet"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.248020 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" podStartSLOduration=163.248003299 podStartE2EDuration="2m43.248003299s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:48.245063244 +0000 UTC m=+208.988732978" watchObservedRunningTime="2026-03-20 13:24:48.248003299 +0000 UTC m=+208.991673043"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.263807 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:24:48 crc kubenswrapper[4973]: E0320 13:24:48.264710 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:48.76469204 +0000 UTC m=+209.508361784 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.281155 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdnm6" event={"ID":"0bf18c8f-c77f-4208-b464-19772b0221f4","Type":"ContainerStarted","Data":"bceed7691715e4ed9a2492bf2952654d2d2e2d0cc4490c777ec8daf541f60aaa"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.285003 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx" podStartSLOduration=164.284983056 podStartE2EDuration="2m44.284983056s" podCreationTimestamp="2026-03-20 13:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:48.260083017 +0000 UTC m=+209.003752761" watchObservedRunningTime="2026-03-20 13:24:48.284983056 +0000 UTC m=+209.028652800"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.298057 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgmzv" event={"ID":"9aeba530-5722-4f41-9082-3f9316f06505","Type":"ContainerStarted","Data":"9b78ec6e50432987aafbc8afbee89d83b8e96cd6de73e8b8eac8c9d2e57d9b55"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.316643 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wkc4c" event={"ID":"b22a43e3-90de-4609-bf64-006de1716ae3","Type":"ContainerStarted","Data":"421f4eb09b88c8790dd375e5ae955baa63eaeef4d014b0b01b3a297550dba0c6"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.345195 4973 ???:1] "http: TLS handshake error from 192.168.126.11:37394: no serving certificate available for the kubelet"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.365891 4973 ???:1] "http: TLS handshake error from 192.168.126.11:37398: no serving certificate available for the kubelet"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.370416 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7"
Mar 20 13:24:48 crc kubenswrapper[4973]: E0320 13:24:48.371896 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:48.871881985 +0000 UTC m=+209.615551719 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.378628 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw" event={"ID":"330a4463-669f-4f2e-aa4b-614ab0654579","Type":"ContainerStarted","Data":"12136dd1aad2c4ac1c2e1db23e32f9ad0e18ee58641bf671b8415b899b6d7da9"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.378667 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw" event={"ID":"330a4463-669f-4f2e-aa4b-614ab0654579","Type":"ContainerStarted","Data":"d2d7ecbfaba19b75fdda724d205af3be8a54db13b9ba5597aade40b93678670c"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.401988 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-tdnm6" podStartSLOduration=164.401969674 podStartE2EDuration="2m44.401969674s" podCreationTimestamp="2026-03-20 13:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:48.296035136 +0000 UTC m=+209.039704880" watchObservedRunningTime="2026-03-20 13:24:48.401969674 +0000 UTC m=+209.145639418"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.402897 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-8w6wp" event={"ID":"a56cf4d4-faa0-469e-b856-d1c030dd19d9","Type":"ContainerStarted","Data":"40cf9231b8bf7f37fca238963f4a9477520e710843715bfc5ee3b9f2c642c752"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.413408 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-8hngw" podStartSLOduration=163.413391554 podStartE2EDuration="2m43.413391554s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:48.401089458 +0000 UTC m=+209.144759202" watchObservedRunningTime="2026-03-20 13:24:48.413391554 +0000 UTC m=+209.157061298"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.426679 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" event={"ID":"b0cbdcce-514f-4b72-8c8c-17029b7217a8","Type":"ContainerStarted","Data":"1c9e41c5ed9170e422bc0a5dcef516c793a78b237e475c0c2f24ca2222b43c6b"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.440452 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-8w6wp" podStartSLOduration=163.440433114 podStartE2EDuration="2m43.440433114s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:48.429404335 +0000 UTC m=+209.173074079" watchObservedRunningTime="2026-03-20 13:24:48.440433114 +0000 UTC m=+209.184102858"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.446406 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6v2ms" event={"ID":"cb200682-a8ee-406a-9c09-d881f40842e7","Type":"ContainerStarted","Data":"3b28c8d9d8c1a919dff763467976387806886326d0e8b7ab73337786d36971b3"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.449729 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" event={"ID":"322233fd-b71f-4ef5-931f-58e98326386a","Type":"ContainerStarted","Data":"471f216b5f013092d31311f3dcf572110c7e705c440b36bf513bf4accf7a4537"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.452943 4973 ???:1] "http: TLS handshake error from 192.168.126.11:37400: no serving certificate available for the kubelet"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.457633 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" event={"ID":"b1fe291e-3490-49c0-9443-e5b0f03db19c","Type":"ContainerStarted","Data":"74e0a04a31c0c8b76fe37fed06f1b575b045e1f31557417141c74f81a9d51547"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.460670 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.462257 4973 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-tl6hw container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused" start-of-body=
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.462310 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" podUID="b1fe291e-3490-49c0-9443-e5b0f03db19c" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.26:6443/healthz\": dial tcp 10.217.0.26:6443: connect: connection refused"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.463417 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd" event={"ID":"b9d6f338-6a7a-4ba2-b1c9-12485bb30937","Type":"ContainerStarted","Data":"f3e181a66cf44a5811e4649214668a9610bf0274da40991eba44a30f763d51cb"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.480126 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:24:48 crc kubenswrapper[4973]: E0320 13:24:48.480534 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:48.980522632 +0000 UTC m=+209.724192366 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.480429 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" podStartSLOduration=164.480413939 podStartE2EDuration="2m44.480413939s" podCreationTimestamp="2026-03-20 13:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:48.478685888 +0000 UTC m=+209.222355632" watchObservedRunningTime="2026-03-20 13:24:48.480413939 +0000 UTC m=+209.224083683"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.481084 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7"
Mar 20 13:24:48 crc kubenswrapper[4973]: E0320 13:24:48.481359 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:48.981351675 +0000 UTC m=+209.725021409 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.510525 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" event={"ID":"5f512921-f02c-464b-af06-d65fb95f0071","Type":"ContainerStarted","Data":"e5d858887afdba12902efe40069f3ebd2099bcf6aed35190aa7e3e0c6b60ddd7"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.511531 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-4bk2w"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.518941 4973 patch_prober.go:28] interesting pod/console-operator-58897d9998-4bk2w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body=
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.518990 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" podUID="5f512921-f02c-464b-af06-d65fb95f0071" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.535292 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" podStartSLOduration=164.535275312 podStartE2EDuration="2m44.535275312s" podCreationTimestamp="2026-03-20 13:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:48.499845099 +0000 UTC m=+209.243514843" watchObservedRunningTime="2026-03-20 13:24:48.535275312 +0000 UTC m=+209.278945056"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.546515 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xdmgm" event={"ID":"6847acbc-29ec-4939-a6aa-4617b8e438e7","Type":"ContainerStarted","Data":"0a2d0e20196ad3948f465a70ba22105d1689c99a05805a5751066302325e3b55"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.557808 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmxvj" event={"ID":"94e24fab-c452-4121-a617-9d5f02b8ba1b","Type":"ContainerStarted","Data":"60beb23e98adbf10488884c1d4b830890e4fff3dd1fefced468b1387b9dd0522"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.566956 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-56g5d" event={"ID":"94a86206-8505-4c7d-82aa-2b482c0eb08b","Type":"ContainerStarted","Data":"6dbf9d9dccb67910ecec5bc5b3df3cd5983c1f42d7d7b315fd3c9f2900a7d7db"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.567029 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-56g5d" event={"ID":"94a86206-8505-4c7d-82aa-2b482c0eb08b","Type":"ContainerStarted","Data":"b71a59b79c6779662596ef0efd9985e5a9562a585db6df1437ed9825587c77c6"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.583642 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 13:24:48 crc kubenswrapper[4973]: E0320 13:24:48.585071 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:49.085049818 +0000 UTC m=+209.828719572 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.585505 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" event={"ID":"83d5e1a7-2827-4dfd-9feb-3b9630a62515","Type":"ContainerStarted","Data":"b6534c94bb6439eb28ea50a69110b1a4b8c7e1df3766ea440a41e658fa292787"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.588741 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-56g5d" podStartSLOduration=163.588726605 podStartE2EDuration="2m43.588726605s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:48.588713825 +0000 UTC m=+209.332383569" watchObservedRunningTime="2026-03-20 13:24:48.588726605 +0000 UTC m=+209.332396349"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.591429 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bw9k5" event={"ID":"05c127a2-f6b5-4d71-8646-e29396ea7971","Type":"ContainerStarted","Data":"be22f00484b8cbb5a47d1a8c78db635ae2c3ab662abd22222f628712add5185c"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.595283 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-xd95h" event={"ID":"ef043a74-5704-48d8-abc6-4a1afef82b9c","Type":"ContainerStarted","Data":"bf98c8d3bdd5f22a02cce72b531616f3166ce8a1f19669cf3012553f5f8fb78a"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.598592 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" podStartSLOduration=164.598572619 podStartE2EDuration="2m44.598572619s" podCreationTimestamp="2026-03-20 13:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:48.537321871 +0000 UTC m=+209.280991625" watchObservedRunningTime="2026-03-20 13:24:48.598572619 +0000 UTC m=+209.342242373"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.600494 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s" event={"ID":"95609887-af52-4179-88e3-7f3730642377","Type":"ContainerStarted","Data":"c2da59898762639aae7a4aee9dc8ec3678d9783e2b720caef1efe429fd4d6722"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.605953 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljb4c" event={"ID":"4d6d6c1c-878b-47a0-9475-ac3ec45c17b0","Type":"ContainerStarted","Data":"1672180b671bd1db7d1ed231f6d73f0ee85d0cc3d918d849039847ee2ea73d75"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.606001 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljb4c" event={"ID":"4d6d6c1c-878b-47a0-9475-ac3ec45c17b0","Type":"ContainerStarted","Data":"ef0ab64baf949491bf374719af6fc43260df8e59cc2caae11de0fa087df328b2"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.628258 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-xd95h" podStartSLOduration=5.628234335 podStartE2EDuration="5.628234335s" podCreationTimestamp="2026-03-20 13:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:48.619802552 +0000 UTC m=+209.363472296" watchObservedRunningTime="2026-03-20 13:24:48.628234335 +0000 UTC m=+209.371904079"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.631505 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff" event={"ID":"5e47b273-3bc0-4f32-8bdb-aa283db4d8a1","Type":"ContainerStarted","Data":"4e981366a90d3ea50050acf1eabeead4e82b5eae7475c9bd481afb65fd265408"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.631866 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.633721 4973 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-kz2ff container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused" start-of-body=
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.633753 4973 generic.go:334] "Generic (PLEG): container finished" podID="391ef260-a6ea-4cab-bca3-280435898381" containerID="1985bd98e3b0185622700f9cd762233d7643ddd551db7d84674ebb0ce5cef1c0" exitCode=0
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.633756 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff" podUID="5e47b273-3bc0-4f32-8bdb-aa283db4d8a1" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": dial tcp 10.217.0.38:8443: connect: connection refused"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.633801 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" event={"ID":"391ef260-a6ea-4cab-bca3-280435898381","Type":"ContainerDied","Data":"1985bd98e3b0185622700f9cd762233d7643ddd551db7d84674ebb0ce5cef1c0"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.645527 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-46h7s" podStartSLOduration=163.645511164 podStartE2EDuration="2m43.645511164s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:48.645195905 +0000 UTC m=+209.388865659" watchObservedRunningTime="2026-03-20 13:24:48.645511164 +0000 UTC m=+209.389180908"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.652792 4973 ???:1] "http: TLS handshake error from 192.168.126.11:37414: no serving certificate available for the kubelet"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.661067 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-26mgc" event={"ID":"fca0553c-03be-49c4-ba2a-dced5bc62586","Type":"ContainerStarted","Data":"886bcc900190d6e74430ae7dd2d14c3d56e925a56cbe67a91b73a9b4b2196c62"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.669066 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" event={"ID":"6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5","Type":"ContainerStarted","Data":"e7e65017f20f55fc68cd363955408cb7aea12bb7bf0d263ac4a21a628af32ed1"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.669660 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.670689 4973 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-4ddcl container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body=
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.670733 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" podUID="6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.671442 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jl6jn" event={"ID":"47aff328-d38c-426c-8462-12c6b98a82fd","Type":"ContainerStarted","Data":"f24a4cbd3469439d6a570322cec0870d5b90af96915054d37ad5b521f421dd2e"}
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.679872 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-ljb4c" podStartSLOduration=163.679854766 podStartE2EDuration="2m43.679854766s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:48.674804839 +0000 UTC m=+209.418474583" watchObservedRunningTime="2026-03-20 13:24:48.679854766 +0000 UTC m=+209.423524510"
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.686554 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:48 crc kubenswrapper[4973]: E0320 13:24:48.693019 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:49.192999575 +0000 UTC m=+209.936669409 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.700740 4973 generic.go:334] "Generic (PLEG): container finished" podID="69bb010a-a13d-4458-8118-80c5aebb6e65" containerID="c244efa9e8343558cf97af4593ef3f2553e9953b6093b835e47063dba50a608c" exitCode=0 Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.700826 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" event={"ID":"69bb010a-a13d-4458-8118-80c5aebb6e65","Type":"ContainerDied","Data":"c244efa9e8343558cf97af4593ef3f2553e9953b6093b835e47063dba50a608c"} Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.714071 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff" podStartSLOduration=163.714052392 podStartE2EDuration="2m43.714052392s" 
podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:48.711237131 +0000 UTC m=+209.454906875" watchObservedRunningTime="2026-03-20 13:24:48.714052392 +0000 UTC m=+209.457722136" Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.740068 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" event={"ID":"17fc0166-9183-4bbd-a091-644b431349e1","Type":"ContainerStarted","Data":"45cb0e5b6e90e72e28ea8dd502019248eff66c6a5aab3559eda51425a1a599bb"} Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.779374 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8bf7" event={"ID":"bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4","Type":"ContainerStarted","Data":"8fcf21584a0e0146af85f5f8a94977e0639dfbcb5e762938792ed70e2894bafa"} Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.791545 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:48 crc kubenswrapper[4973]: E0320 13:24:48.791698 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:49.291676154 +0000 UTC m=+210.035345898 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.791976 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:48 crc kubenswrapper[4973]: E0320 13:24:48.793178 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:49.293156966 +0000 UTC m=+210.036826720 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.793577 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" podStartSLOduration=163.793557988 podStartE2EDuration="2m43.793557988s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:48.783689223 +0000 UTC m=+209.527358967" watchObservedRunningTime="2026-03-20 13:24:48.793557988 +0000 UTC m=+209.537227732" Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.803309 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-jl6jn" podStartSLOduration=164.803294329 podStartE2EDuration="2m44.803294329s" podCreationTimestamp="2026-03-20 13:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:48.800871809 +0000 UTC m=+209.544541553" watchObservedRunningTime="2026-03-20 13:24:48.803294329 +0000 UTC m=+209.546964073" Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.814806 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-5dqnp" event={"ID":"2aab7bd5-eca7-4faa-a4fc-d83be6cc6c3c","Type":"ContainerStarted","Data":"0307b10cd486b1a3838d256940655d419dcb891b5e426cc6e1453cbda2eb8eb9"} 
Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.848858 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-5dqnp" podStartSLOduration=163.848842264 podStartE2EDuration="2m43.848842264s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:48.847754323 +0000 UTC m=+209.591424067" watchObservedRunningTime="2026-03-20 13:24:48.848842264 +0000 UTC m=+209.592511998" Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.892817 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:48 crc kubenswrapper[4973]: E0320 13:24:48.894316 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:49.394294306 +0000 UTC m=+210.137964050 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.899548 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:24:48 crc kubenswrapper[4973]: [-]has-synced failed: reason withheld Mar 20 13:24:48 crc kubenswrapper[4973]: [+]process-running ok Mar 20 13:24:48 crc kubenswrapper[4973]: healthz check failed Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.899597 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.994668 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:48 crc kubenswrapper[4973]: E0320 13:24:48.995224 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:24:49.495210099 +0000 UTC m=+210.238879843 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:48 crc kubenswrapper[4973]: I0320 13:24:48.996042 4973 ???:1] "http: TLS handshake error from 192.168.126.11:37420: no serving certificate available for the kubelet" Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.096062 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:49 crc kubenswrapper[4973]: E0320 13:24:49.099508 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:49.599460859 +0000 UTC m=+210.343130713 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.198084 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:49 crc kubenswrapper[4973]: E0320 13:24:49.198376 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:49.698364894 +0000 UTC m=+210.442034638 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.299531 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:49 crc kubenswrapper[4973]: E0320 13:24:49.299738 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:49.79970619 +0000 UTC m=+210.543375934 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.300138 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:49 crc kubenswrapper[4973]: E0320 13:24:49.300431 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:49.80041967 +0000 UTC m=+210.544089414 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.401612 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:49 crc kubenswrapper[4973]: E0320 13:24:49.401787 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:49.901760776 +0000 UTC m=+210.645430520 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.401904 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:49 crc kubenswrapper[4973]: E0320 13:24:49.402220 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:49.902213029 +0000 UTC m=+210.645882773 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.502676 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:49 crc kubenswrapper[4973]: E0320 13:24:49.502985 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:50.002971027 +0000 UTC m=+210.746640771 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.604782 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:49 crc kubenswrapper[4973]: E0320 13:24:49.605045 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:50.105032904 +0000 UTC m=+210.848702648 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.704739 4973 ???:1] "http: TLS handshake error from 192.168.126.11:38196: no serving certificate available for the kubelet" Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.705786 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:49 crc kubenswrapper[4973]: E0320 13:24:49.706002 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:50.205988128 +0000 UTC m=+210.949657872 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.807225 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:49 crc kubenswrapper[4973]: E0320 13:24:49.807541 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:50.307528409 +0000 UTC m=+211.051198153 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.823082 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zg96g" event={"ID":"3c73e93c-a062-4361-bc15-95eb55598666","Type":"ContainerStarted","Data":"0ba141a8006dad2ed0739d5edf414095947f78014f1b6a4796308f840a355e3c"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.823149 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zg96g" event={"ID":"3c73e93c-a062-4361-bc15-95eb55598666","Type":"ContainerStarted","Data":"b3660a11364a17232ee9d7626d37275b88ce7d061ec2ca98bdacc9ab4a768406"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.834432 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmxvj" event={"ID":"94e24fab-c452-4121-a617-9d5f02b8ba1b","Type":"ContainerStarted","Data":"f4bf7863479d66155443b7e19011b699559b23cb2337f5c14d3177a09cd14fdb"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.842149 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" event={"ID":"17fc0166-9183-4bbd-a091-644b431349e1","Type":"ContainerStarted","Data":"7f577ebfaa15ec47e136b91bf75ff2139e3872b184eb2eb251a838af66d7edff"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.844926 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" event={"ID":"83d5e1a7-2827-4dfd-9feb-3b9630a62515","Type":"ContainerStarted","Data":"9e2dd8d865e05c33589bffd0d9215886d6893facf72d2c92c71420d04f054bc2"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.847725 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zg96g" podStartSLOduration=164.847703399 podStartE2EDuration="2m44.847703399s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:49.843443016 +0000 UTC m=+210.587112760" watchObservedRunningTime="2026-03-20 13:24:49.847703399 +0000 UTC m=+210.591373153" Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.854871 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7" event={"ID":"596b91f0-06b9-4e89-9816-da8049dad9e3","Type":"ContainerStarted","Data":"4bdf9807f157f0f051fbe476087cd544be23ba47b7914fd52128d6d03ae58d74"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.856370 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7qwpx" event={"ID":"37071a22-82fc-4b04-bc09-6535395faae6","Type":"ContainerStarted","Data":"8fac168d1806f22a2547639ef9a48beaf773704e28dd1281a7984cf4125f5fd4"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.860756 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" event={"ID":"b0cbdcce-514f-4b72-8c8c-17029b7217a8","Type":"ContainerStarted","Data":"30bd3c695e315318c275f785e37f16f20888a754072f832bf4604eabf6fbe2cf"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.861459 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.862705 4973 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-l7crv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.862742 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" podUID="b0cbdcce-514f-4b72-8c8c-17029b7217a8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.863157 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgmzv" event={"ID":"9aeba530-5722-4f41-9082-3f9316f06505","Type":"ContainerStarted","Data":"01462d91f852097c6fc03aaf5b30db92b68b4a8d34da4396b7ee130b57e86b85"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.864899 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" event={"ID":"6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5","Type":"ContainerStarted","Data":"4ff4c59fbeac37c81b035d2d85b6d634ff760f59b18cf7d0e41be0a142e4dc49"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.866211 4973 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-4ddcl container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.866310 4973 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" podUID="6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.878949 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6v2ms" event={"ID":"cb200682-a8ee-406a-9c09-d881f40842e7","Type":"ContainerStarted","Data":"b4776cb7d5cdc1c81014958f3f2ae92887d3ce6298e8aeba77f5a366c3fe11eb"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.878993 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-6v2ms" event={"ID":"cb200682-a8ee-406a-9c09-d881f40842e7","Type":"ContainerStarted","Data":"d7f2dac48700ee7b7801149e152994f8e11daf4d778ae021a87ca26c0194ff5d"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.879593 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-6v2ms" Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.885735 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" event={"ID":"5f512921-f02c-464b-af06-d65fb95f0071","Type":"ContainerStarted","Data":"ecebfd6bfdf0e1f243a5a5712948ef8dac51bc241b1c6d43c082d02fe786e5db"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.887217 4973 patch_prober.go:28] interesting pod/console-operator-58897d9998-4bk2w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.887257 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" podUID="5f512921-f02c-464b-af06-d65fb95f0071" 
containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.894393 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd" event={"ID":"b9d6f338-6a7a-4ba2-b1c9-12485bb30937","Type":"ContainerStarted","Data":"a8cf42a9d59625b7d17e2f95f60f398813c0fe69038ad0c286fec8890d8a13d7"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.899090 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:24:49 crc kubenswrapper[4973]: [-]has-synced failed: reason withheld Mar 20 13:24:49 crc kubenswrapper[4973]: [+]process-running ok Mar 20 13:24:49 crc kubenswrapper[4973]: healthz check failed Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.899136 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.903179 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xdmgm" event={"ID":"6847acbc-29ec-4939-a6aa-4617b8e438e7","Type":"ContainerStarted","Data":"eb8ef4808a81aecdf4741bb71557abe67b79e43e01066601cba89a7f795a199c"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.910825 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.911916 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" event={"ID":"69bb010a-a13d-4458-8118-80c5aebb6e65","Type":"ContainerStarted","Data":"d093cc2d745e09d5b7445cb6e91ad2c563523d3d17b1f97cca85d5042babeaef"} Mar 20 13:24:49 crc kubenswrapper[4973]: E0320 13:24:49.913070 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:50.413051036 +0000 UTC m=+211.156720820 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.924497 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" event={"ID":"b1fe291e-3490-49c0-9443-e5b0f03db19c","Type":"ContainerStarted","Data":"8fc3ae1a34dca9e6b8bce0b1ad1a59852acffafd2e607e0f20d3875dfcaa1051"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.943105 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-26mgc" event={"ID":"fca0553c-03be-49c4-ba2a-dced5bc62586","Type":"ContainerStarted","Data":"f16fe2b22b6559e1c0da5f4248ace96864bb9318b425b85593d83a07cad525a0"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.943148 
4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-26mgc" event={"ID":"fca0553c-03be-49c4-ba2a-dced5bc62586","Type":"ContainerStarted","Data":"c76dc94fabfa5e098267a2c9c72d825c22e7310c0166a9c9a1383b61c8f2fbd0"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.968125 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bw9k5" event={"ID":"05c127a2-f6b5-4d71-8646-e29396ea7971","Type":"ContainerStarted","Data":"3232f29f9f835edd11390404529ce2dfe373ae881c14e14324eafe77d56756fb"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.968481 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bw9k5" event={"ID":"05c127a2-f6b5-4d71-8646-e29396ea7971","Type":"ContainerStarted","Data":"fa5349a74d8f0a7b4ca08574b6eb1581a5ce40403c65f8fa056bb400e97595db"} Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.969032 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bw9k5" Mar 20 13:24:49 crc kubenswrapper[4973]: I0320 13:24:49.980654 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" event={"ID":"322233fd-b71f-4ef5-931f-58e98326386a","Type":"ContainerStarted","Data":"47d877966356f8da1c19f2c22048fc9154ca3074a2441f67f538a5b574ded9e6"} Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.005522 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wkc4c" event={"ID":"b22a43e3-90de-4609-bf64-006de1716ae3","Type":"ContainerStarted","Data":"0cfdb91d1e80c590475907fc786790d3004277ddacfea14ed771af6a131d3765"} Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.018693 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:50 crc kubenswrapper[4973]: E0320 13:24:50.021987 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:50.5219755 +0000 UTC m=+211.265645244 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.034577 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8bf7" event={"ID":"bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4","Type":"ContainerStarted","Data":"8a185838f99fc6a138cb16f916582c168373d1e5df8b494051142b1f1065d6dc"} Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.034621 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8bf7" event={"ID":"bbb5ef4b-b4d4-4ef5-aab3-b9b8f759c6b4","Type":"ContainerStarted","Data":"351d5524038597b9efada48ceb3c71ae4d2b4f02e10d59b93a19362e48d7f9ba"} Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.041228 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff" event={"ID":"5e47b273-3bc0-4f32-8bdb-aa283db4d8a1","Type":"ContainerStarted","Data":"7126217d94e1349a5461bfdcac13fe4db3f94149300431f031d157553759af7f"} Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.059873 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" event={"ID":"925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e","Type":"ContainerStarted","Data":"6526dc6a5249aa733a59e5eaf020bcb049f6ee208c7b5a098a4d4625af1109f0"} Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.060779 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.063940 4973 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-z2rch container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:5443/healthz\": dial tcp 10.217.0.16:5443: connect: connection refused" start-of-body= Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.064017 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" podUID="925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.16:5443/healthz\": dial tcp 10.217.0.16:5443: connect: connection refused" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.070838 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-pmxvj" podStartSLOduration=165.070817539 podStartE2EDuration="2m45.070817539s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 13:24:49.904160428 +0000 UTC m=+210.647830182" watchObservedRunningTime="2026-03-20 13:24:50.070817539 +0000 UTC m=+210.814487283" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.071596 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd" podStartSLOduration=166.071589432 podStartE2EDuration="2m46.071589432s" podCreationTimestamp="2026-03-20 13:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:50.060036689 +0000 UTC m=+210.803706433" watchObservedRunningTime="2026-03-20 13:24:50.071589432 +0000 UTC m=+210.815259166" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.072608 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.103699 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" event={"ID":"391ef260-a6ea-4cab-bca3-280435898381","Type":"ContainerStarted","Data":"df33446a3dbb1be46692d016d2e7b864e3eaf5799523b06261296aac70f2a48a"} Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.103780 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" event={"ID":"391ef260-a6ea-4cab-bca3-280435898381","Type":"ContainerStarted","Data":"ac178d705f317fbb1d6128a19529c99b1121da12dc85e218e4eaaf14dd3a01c0"} Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.120646 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.120870 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xdmgm" podStartSLOduration=7.120847084 podStartE2EDuration="7.120847084s" podCreationTimestamp="2026-03-20 13:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:50.118234448 +0000 UTC m=+210.861904192" watchObservedRunningTime="2026-03-20 13:24:50.120847084 +0000 UTC m=+210.864516828" Mar 20 13:24:50 crc kubenswrapper[4973]: E0320 13:24:50.121873 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:50.621853353 +0000 UTC m=+211.365523097 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.178741 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.201546 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.223825 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:50 crc kubenswrapper[4973]: E0320 13:24:50.226843 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:50.726821153 +0000 UTC m=+211.470490897 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.283240 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-mhgwt" podStartSLOduration=165.283208311 podStartE2EDuration="2m45.283208311s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:50.281986896 +0000 UTC m=+211.025656640" watchObservedRunningTime="2026-03-20 13:24:50.283208311 +0000 UTC m=+211.026878055" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.284668 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" podStartSLOduration=165.284661223 podStartE2EDuration="2m45.284661223s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:50.201247445 +0000 UTC m=+210.944917199" watchObservedRunningTime="2026-03-20 13:24:50.284661223 +0000 UTC m=+211.028330967" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.329716 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:50 crc kubenswrapper[4973]: E0320 13:24:50.329910 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:50.829879838 +0000 UTC m=+211.573549582 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.329997 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:50 crc kubenswrapper[4973]: E0320 13:24:50.331346 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:50.831313509 +0000 UTC m=+211.574983253 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.378817 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-7qwpx" podStartSLOduration=165.37880071 podStartE2EDuration="2m45.37880071s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:50.377104922 +0000 UTC m=+211.120774666" watchObservedRunningTime="2026-03-20 13:24:50.37880071 +0000 UTC m=+211.122470454" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.378900 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-hvdx7" podStartSLOduration=165.378895903 podStartE2EDuration="2m45.378895903s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:50.312224389 +0000 UTC m=+211.055894123" watchObservedRunningTime="2026-03-20 13:24:50.378895903 +0000 UTC m=+211.122565647" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.430837 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-wgmzv" podStartSLOduration=165.430819833 podStartE2EDuration="2m45.430819833s" 
podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:50.424821139 +0000 UTC m=+211.168490883" watchObservedRunningTime="2026-03-20 13:24:50.430819833 +0000 UTC m=+211.174489577" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.432253 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:50 crc kubenswrapper[4973]: E0320 13:24:50.432609 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:50.932590644 +0000 UTC m=+211.676260388 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.432687 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:50 crc kubenswrapper[4973]: E0320 13:24:50.433064 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:50.933051807 +0000 UTC m=+211.676721551 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.460683 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-6v2ms" podStartSLOduration=7.460665784 podStartE2EDuration="7.460665784s" podCreationTimestamp="2026-03-20 13:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:50.458858772 +0000 UTC m=+211.202528526" watchObservedRunningTime="2026-03-20 13:24:50.460665784 +0000 UTC m=+211.204335528" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.534136 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:50 crc kubenswrapper[4973]: E0320 13:24:50.534572 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:51.034523686 +0000 UTC m=+211.778193430 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.553397 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" podStartSLOduration=166.55338038 podStartE2EDuration="2m46.55338038s" podCreationTimestamp="2026-03-20 13:22:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:50.552781143 +0000 UTC m=+211.296450887" watchObservedRunningTime="2026-03-20 13:24:50.55338038 +0000 UTC m=+211.297050124" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.635728 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:50 crc kubenswrapper[4973]: E0320 13:24:50.636330 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:51.136312094 +0000 UTC m=+211.879981838 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.686136 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.716619 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-26mgc" podStartSLOduration=165.716592572 podStartE2EDuration="2m45.716592572s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:50.599679827 +0000 UTC m=+211.343349571" watchObservedRunningTime="2026-03-20 13:24:50.716592572 +0000 UTC m=+211.460262316" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.716997 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" podStartSLOduration=165.716987673 podStartE2EDuration="2m45.716987673s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:50.696015408 +0000 UTC m=+211.439685162" watchObservedRunningTime="2026-03-20 13:24:50.716987673 +0000 UTC m=+211.460657427" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.735189 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.735256 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.737123 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:50 crc kubenswrapper[4973]: E0320 13:24:50.737502 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:51.237484286 +0000 UTC m=+211.981154030 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.785297 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.785770 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.789584 4973 patch_prober.go:28] interesting pod/apiserver-76f77b778f-k9jzg container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.789636 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" podUID="391ef260-a6ea-4cab-bca3-280435898381" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.810092 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-w8bf7" podStartSLOduration=165.81007637 podStartE2EDuration="2m45.81007637s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 13:24:50.760305914 +0000 UTC m=+211.503975658" watchObservedRunningTime="2026-03-20 13:24:50.81007637 +0000 UTC m=+211.553746114" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.838618 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:50 crc kubenswrapper[4973]: E0320 13:24:50.839002 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:51.338990175 +0000 UTC m=+212.082659919 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.862819 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-wkc4c" podStartSLOduration=165.862803123 podStartE2EDuration="2m45.862803123s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:50.810603596 +0000 UTC m=+211.554273340" watchObservedRunningTime="2026-03-20 
13:24:50.862803123 +0000 UTC m=+211.606472867" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.894233 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" podStartSLOduration=165.89421766 podStartE2EDuration="2m45.89421766s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:50.891863712 +0000 UTC m=+211.635533456" watchObservedRunningTime="2026-03-20 13:24:50.89421766 +0000 UTC m=+211.637887404" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.930808 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bw9k5" podStartSLOduration=165.930792116 podStartE2EDuration="2m45.930792116s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:50.930467506 +0000 UTC m=+211.674137250" watchObservedRunningTime="2026-03-20 13:24:50.930792116 +0000 UTC m=+211.674461850" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.943109 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.943538 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.943586 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.943661 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:50 crc kubenswrapper[4973]: E0320 13:24:50.946454 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:51.446417877 +0000 UTC m=+212.190087681 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.954576 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:50 crc kubenswrapper[4973]: I0320 13:24:50.977071 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.053057 4973 ???:1] "http: TLS handshake error from 192.168.126.11:38208: no serving certificate available for the kubelet" Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.054982 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs\") pod \"network-metrics-daemon-7kszd\" (UID: \"93c5ad90-87bf-4668-9d87-34e676b15783\") " pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.055057 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.055109 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:51 crc kubenswrapper[4973]: E0320 13:24:51.055484 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:51.555467445 +0000 UTC m=+212.299137189 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.072542 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/93c5ad90-87bf-4668-9d87-34e676b15783-metrics-certs\") pod \"network-metrics-daemon-7kszd\" (UID: \"93c5ad90-87bf-4668-9d87-34e676b15783\") " pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.072620 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.078060 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.081737 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7kszd" Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.087657 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.142410 4973 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-l7crv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.142462 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" podUID="b0cbdcce-514f-4b72-8c8c-17029b7217a8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.155666 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.156636 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:51 crc kubenswrapper[4973]: E0320 13:24:51.157076 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:51.657056607 +0000 UTC m=+212.400726351 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.260605 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:51 crc kubenswrapper[4973]: E0320 13:24:51.266603 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:51.766586129 +0000 UTC m=+212.510255873 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.297586 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.301549 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:24:51 crc kubenswrapper[4973]: [-]has-synced failed: reason withheld Mar 20 13:24:51 crc kubenswrapper[4973]: [+]process-running ok Mar 20 13:24:51 crc kubenswrapper[4973]: healthz check failed Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.301665 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.366607 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.367221 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:51 crc kubenswrapper[4973]: E0320 13:24:51.367527 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:51.867513713 +0000 UTC m=+212.611183457 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.469035 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:51 crc kubenswrapper[4973]: E0320 13:24:51.470035 4973 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:51.970018542 +0000 UTC m=+212.713688286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.511088 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.572009 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:51 crc kubenswrapper[4973]: E0320 13:24:51.572379 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:52.072360927 +0000 UTC m=+212.816030671 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.677375 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:51 crc kubenswrapper[4973]: E0320 13:24:51.677760 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:52.177748799 +0000 UTC m=+212.921418543 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.783526 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:51 crc kubenswrapper[4973]: E0320 13:24:51.787160 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:52.287137206 +0000 UTC m=+213.030806950 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.888424 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:51 crc kubenswrapper[4973]: E0320 13:24:51.888778 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:52.38876391 +0000 UTC m=+213.132433654 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.905602 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:24:51 crc kubenswrapper[4973]: [-]has-synced failed: reason withheld Mar 20 13:24:51 crc kubenswrapper[4973]: [+]process-running ok Mar 20 13:24:51 crc kubenswrapper[4973]: healthz check failed Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.905656 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:24:51 crc kubenswrapper[4973]: I0320 13:24:51.976561 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.001298 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:52 crc kubenswrapper[4973]: E0320 13:24:52.001814 4973 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:52.501789424 +0000 UTC m=+213.245459168 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.104308 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:52 crc kubenswrapper[4973]: E0320 13:24:52.105055 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:52.604948271 +0000 UTC m=+213.348618015 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.110158 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.184579 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"31e327201dc368af484bd694c714b09a34d9501bc5b4d97fa1fd8bf41255d482"} Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.196857 4973 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-l7crv container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" start-of-body= Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.196917 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" podUID="b0cbdcce-514f-4b72-8c8c-17029b7217a8" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.36:8080/healthz\": dial tcp 10.217.0.36:8080: connect: connection refused" Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.206541 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:52 crc kubenswrapper[4973]: E0320 13:24:52.207873 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:52.707811331 +0000 UTC m=+213.451481075 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.226068 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7kszd"] Mar 20 13:24:52 crc kubenswrapper[4973]: W0320 13:24:52.260490 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod93c5ad90_87bf_4668_9d87_34e676b15783.slice/crio-5fbfcf1cdf787bf808e6e8d6b4f5fc2a2e88f8be325c833b4980e0bd72e03d59 WatchSource:0}: Error finding container 5fbfcf1cdf787bf808e6e8d6b4f5fc2a2e88f8be325c833b4980e0bd72e03d59: Status 404 returned error can't find the container with id 5fbfcf1cdf787bf808e6e8d6b4f5fc2a2e88f8be325c833b4980e0bd72e03d59 Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.260554 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-f8vm4" Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 
13:24:52.313003 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:52 crc kubenswrapper[4973]: E0320 13:24:52.315297 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:52.815286213 +0000 UTC m=+213.558955947 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.414108 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:52 crc kubenswrapper[4973]: E0320 13:24:52.414621 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:24:52.91459912 +0000 UTC m=+213.658268864 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.523006 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:52 crc kubenswrapper[4973]: E0320 13:24:52.523370 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:53.02335766 +0000 UTC m=+213.767027404 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.526177 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mtrbp"] Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.588762 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8"] Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.589002 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" podUID="a60cb615-f335-45fa-86dd-ddf121e62737" containerName="route-controller-manager" containerID="cri-o://5b4441d53cf53f861345b1c4b9573b80e9b281d00ad7ac3728c5446304237526" gracePeriod=30 Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.623992 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:52 crc kubenswrapper[4973]: E0320 13:24:52.624380 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:24:53.124312594 +0000 UTC m=+213.867982338 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.724962 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:52 crc kubenswrapper[4973]: E0320 13:24:52.725396 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:53.225382731 +0000 UTC m=+213.969052475 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.826151 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:52 crc kubenswrapper[4973]: E0320 13:24:52.826496 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:53.326479531 +0000 UTC m=+214.070149275 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.910481 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-5jzd6"] Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.911693 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5jzd6" Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.916169 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:24:52 crc kubenswrapper[4973]: [-]has-synced failed: reason withheld Mar 20 13:24:52 crc kubenswrapper[4973]: [+]process-running ok Mar 20 13:24:52 crc kubenswrapper[4973]: healthz check failed Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.916208 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.923317 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.928789 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5jzd6"] Mar 20 13:24:52 crc kubenswrapper[4973]: I0320 13:24:52.929670 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:52 crc kubenswrapper[4973]: E0320 13:24:52.930050 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:24:53.43003789 +0000 UTC m=+214.173707634 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.031095 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.031431 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2115631d-0f02-4cb4-bfee-e18dd87a0462-utilities\") pod \"certified-operators-5jzd6\" (UID: \"2115631d-0f02-4cb4-bfee-e18dd87a0462\") " pod="openshift-marketplace/certified-operators-5jzd6" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.031475 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glj8x\" (UniqueName: \"kubernetes.io/projected/2115631d-0f02-4cb4-bfee-e18dd87a0462-kube-api-access-glj8x\") pod \"certified-operators-5jzd6\" (UID: \"2115631d-0f02-4cb4-bfee-e18dd87a0462\") " pod="openshift-marketplace/certified-operators-5jzd6" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.031536 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2115631d-0f02-4cb4-bfee-e18dd87a0462-catalog-content\") pod \"certified-operators-5jzd6\" (UID: \"2115631d-0f02-4cb4-bfee-e18dd87a0462\") " pod="openshift-marketplace/certified-operators-5jzd6" Mar 20 13:24:53 crc kubenswrapper[4973]: E0320 13:24:53.031650 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:53.531632143 +0000 UTC m=+214.275301887 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.067215 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-plc2f"] Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.069309 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-plc2f" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.072035 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.096844 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-plc2f"] Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.145225 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2115631d-0f02-4cb4-bfee-e18dd87a0462-catalog-content\") pod \"certified-operators-5jzd6\" (UID: \"2115631d-0f02-4cb4-bfee-e18dd87a0462\") " pod="openshift-marketplace/certified-operators-5jzd6" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.145314 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.145358 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2115631d-0f02-4cb4-bfee-e18dd87a0462-utilities\") pod \"certified-operators-5jzd6\" (UID: \"2115631d-0f02-4cb4-bfee-e18dd87a0462\") " pod="openshift-marketplace/certified-operators-5jzd6" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.145387 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f429634-2787-4daa-a443-e4ab84f2e6b7-utilities\") pod \"community-operators-plc2f\" (UID: 
\"8f429634-2787-4daa-a443-e4ab84f2e6b7\") " pod="openshift-marketplace/community-operators-plc2f" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.145413 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f429634-2787-4daa-a443-e4ab84f2e6b7-catalog-content\") pod \"community-operators-plc2f\" (UID: \"8f429634-2787-4daa-a443-e4ab84f2e6b7\") " pod="openshift-marketplace/community-operators-plc2f" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.145444 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glj8x\" (UniqueName: \"kubernetes.io/projected/2115631d-0f02-4cb4-bfee-e18dd87a0462-kube-api-access-glj8x\") pod \"certified-operators-5jzd6\" (UID: \"2115631d-0f02-4cb4-bfee-e18dd87a0462\") " pod="openshift-marketplace/certified-operators-5jzd6" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.145490 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8sms\" (UniqueName: \"kubernetes.io/projected/8f429634-2787-4daa-a443-e4ab84f2e6b7-kube-api-access-f8sms\") pod \"community-operators-plc2f\" (UID: \"8f429634-2787-4daa-a443-e4ab84f2e6b7\") " pod="openshift-marketplace/community-operators-plc2f" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.145938 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2115631d-0f02-4cb4-bfee-e18dd87a0462-catalog-content\") pod \"certified-operators-5jzd6\" (UID: \"2115631d-0f02-4cb4-bfee-e18dd87a0462\") " pod="openshift-marketplace/certified-operators-5jzd6" Mar 20 13:24:53 crc kubenswrapper[4973]: E0320 13:24:53.146215 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:24:53.64620187 +0000 UTC m=+214.389871614 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.146648 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2115631d-0f02-4cb4-bfee-e18dd87a0462-utilities\") pod \"certified-operators-5jzd6\" (UID: \"2115631d-0f02-4cb4-bfee-e18dd87a0462\") " pod="openshift-marketplace/certified-operators-5jzd6" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.168745 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glj8x\" (UniqueName: \"kubernetes.io/projected/2115631d-0f02-4cb4-bfee-e18dd87a0462-kube-api-access-glj8x\") pod \"certified-operators-5jzd6\" (UID: \"2115631d-0f02-4cb4-bfee-e18dd87a0462\") " pod="openshift-marketplace/certified-operators-5jzd6" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.198739 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c3281b9b87b3650b063b1d60b9a218c455e4401c319704ae6466bac8c05957d2"} Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.198801 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"37e429c601347aa33531d7914b1f7535586c96d2ae0dcd36c7e9a77a0fc57c0d"} Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.208391 4973 generic.go:334] "Generic (PLEG): container finished" podID="a60cb615-f335-45fa-86dd-ddf121e62737" containerID="5b4441d53cf53f861345b1c4b9573b80e9b281d00ad7ac3728c5446304237526" exitCode=0 Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.208475 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" event={"ID":"a60cb615-f335-45fa-86dd-ddf121e62737","Type":"ContainerDied","Data":"5b4441d53cf53f861345b1c4b9573b80e9b281d00ad7ac3728c5446304237526"} Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.226648 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7kszd" event={"ID":"93c5ad90-87bf-4668-9d87-34e676b15783","Type":"ContainerStarted","Data":"200aed1a5108e311968062e529f8662a39c89aceb8a717be1a20377306c97fea"} Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.226720 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7kszd" event={"ID":"93c5ad90-87bf-4668-9d87-34e676b15783","Type":"ContainerStarted","Data":"5fbfcf1cdf787bf808e6e8d6b4f5fc2a2e88f8be325c833b4980e0bd72e03d59"} Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.245284 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ggbs9"] Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.246494 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ggbs9" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.247656 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-5jzd6" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.248593 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.248946 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8sms\" (UniqueName: \"kubernetes.io/projected/8f429634-2787-4daa-a443-e4ab84f2e6b7-kube-api-access-f8sms\") pod \"community-operators-plc2f\" (UID: \"8f429634-2787-4daa-a443-e4ab84f2e6b7\") " pod="openshift-marketplace/community-operators-plc2f" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.249007 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f429634-2787-4daa-a443-e4ab84f2e6b7-utilities\") pod \"community-operators-plc2f\" (UID: \"8f429634-2787-4daa-a443-e4ab84f2e6b7\") " pod="openshift-marketplace/community-operators-plc2f" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.249071 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f429634-2787-4daa-a443-e4ab84f2e6b7-catalog-content\") pod \"community-operators-plc2f\" (UID: \"8f429634-2787-4daa-a443-e4ab84f2e6b7\") " pod="openshift-marketplace/community-operators-plc2f" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.249686 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f429634-2787-4daa-a443-e4ab84f2e6b7-catalog-content\") pod \"community-operators-plc2f\" (UID: \"8f429634-2787-4daa-a443-e4ab84f2e6b7\") 
" pod="openshift-marketplace/community-operators-plc2f" Mar 20 13:24:53 crc kubenswrapper[4973]: E0320 13:24:53.249837 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:53.749792771 +0000 UTC m=+214.493462515 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.250375 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f429634-2787-4daa-a443-e4ab84f2e6b7-utilities\") pod \"community-operators-plc2f\" (UID: \"8f429634-2787-4daa-a443-e4ab84f2e6b7\") " pod="openshift-marketplace/community-operators-plc2f" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.251005 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"51561b939b19c1182f825125548578dcf5f6caae28770281fbc2d8c1fedca470"} Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.251039 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b1461296a16a785805c176a0bca7dc7445dcf3d97bc1cafa3742a26a488d174f"} Mar 20 13:24:53 crc 
kubenswrapper[4973]: I0320 13:24:53.265402 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"1a3ae366bf1eeaa63817a0093ddc72c7bddbf9a64ff47bc7fb887ef12cfe6a5e"} Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.265445 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.309420 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ggbs9"] Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.330187 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8sms\" (UniqueName: \"kubernetes.io/projected/8f429634-2787-4daa-a443-e4ab84f2e6b7-kube-api-access-f8sms\") pod \"community-operators-plc2f\" (UID: \"8f429634-2787-4daa-a443-e4ab84f2e6b7\") " pod="openshift-marketplace/community-operators-plc2f" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.359795 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" podUID="8521b3b0-4fb2-45b2-90b5-7080e766aafa" containerName="controller-manager" containerID="cri-o://286fe45ab59e91258bf977cd12fd7bb5bb7a5aa840b5a7073a1d20bd5620ba71" gracePeriod=30 Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.360164 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" event={"ID":"17fc0166-9183-4bbd-a091-644b431349e1","Type":"ContainerStarted","Data":"42e0eba4cb9cee55cccc95f9116fafd37ad1063d248ee4bc672fd7ddf0993cc4"} Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.368203 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/ce12ac02-5b58-44a3-a311-8cdd000ce41b-catalog-content\") pod \"certified-operators-ggbs9\" (UID: \"ce12ac02-5b58-44a3-a311-8cdd000ce41b\") " pod="openshift-marketplace/certified-operators-ggbs9" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.368302 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce12ac02-5b58-44a3-a311-8cdd000ce41b-utilities\") pod \"certified-operators-ggbs9\" (UID: \"ce12ac02-5b58-44a3-a311-8cdd000ce41b\") " pod="openshift-marketplace/certified-operators-ggbs9" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.368397 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.368478 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q72rj\" (UniqueName: \"kubernetes.io/projected/ce12ac02-5b58-44a3-a311-8cdd000ce41b-kube-api-access-q72rj\") pod \"certified-operators-ggbs9\" (UID: \"ce12ac02-5b58-44a3-a311-8cdd000ce41b\") " pod="openshift-marketplace/certified-operators-ggbs9" Mar 20 13:24:53 crc kubenswrapper[4973]: E0320 13:24:53.377273 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:53.87725402 +0000 UTC m=+214.620923764 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.402768 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-plc2f" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.457420 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tsxk4"] Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.461229 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tsxk4" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.470864 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:53 crc kubenswrapper[4973]: E0320 13:24:53.470986 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:53.970961865 +0000 UTC m=+214.714631609 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.471130 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce12ac02-5b58-44a3-a311-8cdd000ce41b-utilities\") pod \"certified-operators-ggbs9\" (UID: \"ce12ac02-5b58-44a3-a311-8cdd000ce41b\") " pod="openshift-marketplace/certified-operators-ggbs9" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.471167 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.471207 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q72rj\" (UniqueName: \"kubernetes.io/projected/ce12ac02-5b58-44a3-a311-8cdd000ce41b-kube-api-access-q72rj\") pod \"certified-operators-ggbs9\" (UID: \"ce12ac02-5b58-44a3-a311-8cdd000ce41b\") " pod="openshift-marketplace/certified-operators-ggbs9" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.471256 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce12ac02-5b58-44a3-a311-8cdd000ce41b-catalog-content\") pod \"certified-operators-ggbs9\" (UID: 
\"ce12ac02-5b58-44a3-a311-8cdd000ce41b\") " pod="openshift-marketplace/certified-operators-ggbs9" Mar 20 13:24:53 crc kubenswrapper[4973]: E0320 13:24:53.475425 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:53.975403483 +0000 UTC m=+214.719073297 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.475976 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce12ac02-5b58-44a3-a311-8cdd000ce41b-catalog-content\") pod \"certified-operators-ggbs9\" (UID: \"ce12ac02-5b58-44a3-a311-8cdd000ce41b\") " pod="openshift-marketplace/certified-operators-ggbs9" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.476393 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce12ac02-5b58-44a3-a311-8cdd000ce41b-utilities\") pod \"certified-operators-ggbs9\" (UID: \"ce12ac02-5b58-44a3-a311-8cdd000ce41b\") " pod="openshift-marketplace/certified-operators-ggbs9" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.479724 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tsxk4"] Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.505874 4973 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired 
state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.508181 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q72rj\" (UniqueName: \"kubernetes.io/projected/ce12ac02-5b58-44a3-a311-8cdd000ce41b-kube-api-access-q72rj\") pod \"certified-operators-ggbs9\" (UID: \"ce12ac02-5b58-44a3-a311-8cdd000ce41b\") " pod="openshift-marketplace/certified-operators-ggbs9" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.571881 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:53 crc kubenswrapper[4973]: E0320 13:24:53.572027 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:54.072006362 +0000 UTC m=+214.815676116 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.572493 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6848e4e5-1d17-4bf7-8dee-bbbeddedd07d-catalog-content\") pod \"community-operators-tsxk4\" (UID: \"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d\") " pod="openshift-marketplace/community-operators-tsxk4" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.572531 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6848e4e5-1d17-4bf7-8dee-bbbeddedd07d-utilities\") pod \"community-operators-tsxk4\" (UID: \"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d\") " pod="openshift-marketplace/community-operators-tsxk4" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.572576 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.572601 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cqsf\" (UniqueName: 
\"kubernetes.io/projected/6848e4e5-1d17-4bf7-8dee-bbbeddedd07d-kube-api-access-5cqsf\") pod \"community-operators-tsxk4\" (UID: \"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d\") " pod="openshift-marketplace/community-operators-tsxk4" Mar 20 13:24:53 crc kubenswrapper[4973]: E0320 13:24:53.572902 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:54.072894088 +0000 UTC m=+214.816563832 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.644278 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.657060 4973 ???:1] "http: TLS handshake error from 192.168.126.11:38224: no serving certificate available for the kubelet" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.676060 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.676254 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cqsf\" (UniqueName: \"kubernetes.io/projected/6848e4e5-1d17-4bf7-8dee-bbbeddedd07d-kube-api-access-5cqsf\") pod \"community-operators-tsxk4\" (UID: \"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d\") " pod="openshift-marketplace/community-operators-tsxk4" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.676303 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6848e4e5-1d17-4bf7-8dee-bbbeddedd07d-catalog-content\") pod \"community-operators-tsxk4\" (UID: \"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d\") " pod="openshift-marketplace/community-operators-tsxk4" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.676331 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6848e4e5-1d17-4bf7-8dee-bbbeddedd07d-utilities\") pod \"community-operators-tsxk4\" (UID: \"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d\") " pod="openshift-marketplace/community-operators-tsxk4" Mar 20 13:24:53 crc kubenswrapper[4973]: E0320 13:24:53.676478 4973 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:54.176448078 +0000 UTC m=+214.920117832 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.676777 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6848e4e5-1d17-4bf7-8dee-bbbeddedd07d-catalog-content\") pod \"community-operators-tsxk4\" (UID: \"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d\") " pod="openshift-marketplace/community-operators-tsxk4" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.680565 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6848e4e5-1d17-4bf7-8dee-bbbeddedd07d-utilities\") pod \"community-operators-tsxk4\" (UID: \"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d\") " pod="openshift-marketplace/community-operators-tsxk4" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.681068 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ggbs9" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.705511 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cqsf\" (UniqueName: \"kubernetes.io/projected/6848e4e5-1d17-4bf7-8dee-bbbeddedd07d-kube-api-access-5cqsf\") pod \"community-operators-tsxk4\" (UID: \"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d\") " pod="openshift-marketplace/community-operators-tsxk4" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.777101 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a60cb615-f335-45fa-86dd-ddf121e62737-config\") pod \"a60cb615-f335-45fa-86dd-ddf121e62737\" (UID: \"a60cb615-f335-45fa-86dd-ddf121e62737\") " Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.777173 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a60cb615-f335-45fa-86dd-ddf121e62737-client-ca\") pod \"a60cb615-f335-45fa-86dd-ddf121e62737\" (UID: \"a60cb615-f335-45fa-86dd-ddf121e62737\") " Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.777201 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60cb615-f335-45fa-86dd-ddf121e62737-serving-cert\") pod \"a60cb615-f335-45fa-86dd-ddf121e62737\" (UID: \"a60cb615-f335-45fa-86dd-ddf121e62737\") " Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.777328 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vmbj\" (UniqueName: \"kubernetes.io/projected/a60cb615-f335-45fa-86dd-ddf121e62737-kube-api-access-6vmbj\") pod \"a60cb615-f335-45fa-86dd-ddf121e62737\" (UID: \"a60cb615-f335-45fa-86dd-ddf121e62737\") " Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.777632 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:53 crc kubenswrapper[4973]: E0320 13:24:53.778022 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:54.278006519 +0000 UTC m=+215.021676263 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.779303 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a60cb615-f335-45fa-86dd-ddf121e62737-client-ca" (OuterVolumeSpecName: "client-ca") pod "a60cb615-f335-45fa-86dd-ddf121e62737" (UID: "a60cb615-f335-45fa-86dd-ddf121e62737"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.784796 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a60cb615-f335-45fa-86dd-ddf121e62737-config" (OuterVolumeSpecName: "config") pod "a60cb615-f335-45fa-86dd-ddf121e62737" (UID: "a60cb615-f335-45fa-86dd-ddf121e62737"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.786246 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-5jzd6"] Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.789129 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a60cb615-f335-45fa-86dd-ddf121e62737-kube-api-access-6vmbj" (OuterVolumeSpecName: "kube-api-access-6vmbj") pod "a60cb615-f335-45fa-86dd-ddf121e62737" (UID: "a60cb615-f335-45fa-86dd-ddf121e62737"). InnerVolumeSpecName "kube-api-access-6vmbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.791799 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a60cb615-f335-45fa-86dd-ddf121e62737-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a60cb615-f335-45fa-86dd-ddf121e62737" (UID: "a60cb615-f335-45fa-86dd-ddf121e62737"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:53 crc kubenswrapper[4973]: W0320 13:24:53.810675 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2115631d_0f02_4cb4_bfee_e18dd87a0462.slice/crio-703def46dbd3bf420b463ea2358eb7effd3cea44dbd17f56cb4a1cbcf1e74e6a WatchSource:0}: Error finding container 703def46dbd3bf420b463ea2358eb7effd3cea44dbd17f56cb4a1cbcf1e74e6a: Status 404 returned error can't find the container with id 703def46dbd3bf420b463ea2358eb7effd3cea44dbd17f56cb4a1cbcf1e74e6a Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.839877 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-plc2f"] Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.869377 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tsxk4" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.882952 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:53 crc kubenswrapper[4973]: E0320 13:24:53.884159 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:54.384139933 +0000 UTC m=+215.127809677 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.884372 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vmbj\" (UniqueName: \"kubernetes.io/projected/a60cb615-f335-45fa-86dd-ddf121e62737-kube-api-access-6vmbj\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.884386 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a60cb615-f335-45fa-86dd-ddf121e62737-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.884395 4973 reconciler_common.go:293] "Volume detached for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/a60cb615-f335-45fa-86dd-ddf121e62737-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.884404 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a60cb615-f335-45fa-86dd-ddf121e62737-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.899204 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:24:53 crc kubenswrapper[4973]: [-]has-synced failed: reason withheld Mar 20 13:24:53 crc kubenswrapper[4973]: [+]process-running ok Mar 20 13:24:53 crc kubenswrapper[4973]: healthz check failed Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.899255 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:24:53 crc kubenswrapper[4973]: I0320 13:24:53.986132 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:53 crc kubenswrapper[4973]: E0320 13:24:53.986453 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 13:24:54.486440136 +0000 UTC m=+215.230109880 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.070496 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.087930 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:54 crc kubenswrapper[4973]: E0320 13:24:54.088257 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 13:24:54.588240455 +0000 UTC m=+215.331910199 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.156463 4973 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T13:24:53.505902834Z","Handler":null,"Name":""} Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.189277 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8521b3b0-4fb2-45b2-90b5-7080e766aafa-serving-cert\") pod \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.189744 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8521b3b0-4fb2-45b2-90b5-7080e766aafa-client-ca\") pod \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.189771 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8521b3b0-4fb2-45b2-90b5-7080e766aafa-config\") pod \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.189880 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx7wc\" (UniqueName: 
\"kubernetes.io/projected/8521b3b0-4fb2-45b2-90b5-7080e766aafa-kube-api-access-vx7wc\") pod \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.189936 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8521b3b0-4fb2-45b2-90b5-7080e766aafa-proxy-ca-bundles\") pod \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\" (UID: \"8521b3b0-4fb2-45b2-90b5-7080e766aafa\") " Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.190071 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:54 crc kubenswrapper[4973]: E0320 13:24:54.190398 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 13:24:54.690383314 +0000 UTC m=+215.434053058 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-r57q7" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.191101 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8521b3b0-4fb2-45b2-90b5-7080e766aafa-config" (OuterVolumeSpecName: "config") pod "8521b3b0-4fb2-45b2-90b5-7080e766aafa" (UID: "8521b3b0-4fb2-45b2-90b5-7080e766aafa"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.192913 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8521b3b0-4fb2-45b2-90b5-7080e766aafa-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8521b3b0-4fb2-45b2-90b5-7080e766aafa" (UID: "8521b3b0-4fb2-45b2-90b5-7080e766aafa"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.193063 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8521b3b0-4fb2-45b2-90b5-7080e766aafa-client-ca" (OuterVolumeSpecName: "client-ca") pod "8521b3b0-4fb2-45b2-90b5-7080e766aafa" (UID: "8521b3b0-4fb2-45b2-90b5-7080e766aafa"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.199118 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8521b3b0-4fb2-45b2-90b5-7080e766aafa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8521b3b0-4fb2-45b2-90b5-7080e766aafa" (UID: "8521b3b0-4fb2-45b2-90b5-7080e766aafa"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.206395 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8521b3b0-4fb2-45b2-90b5-7080e766aafa-kube-api-access-vx7wc" (OuterVolumeSpecName: "kube-api-access-vx7wc") pod "8521b3b0-4fb2-45b2-90b5-7080e766aafa" (UID: "8521b3b0-4fb2-45b2-90b5-7080e766aafa"). InnerVolumeSpecName "kube-api-access-vx7wc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.208043 4973 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.208275 4973 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.285204 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tsxk4"] Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.291861 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.292184 4973 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8521b3b0-4fb2-45b2-90b5-7080e766aafa-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.292203 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8521b3b0-4fb2-45b2-90b5-7080e766aafa-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.292214 4973 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8521b3b0-4fb2-45b2-90b5-7080e766aafa-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.292228 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8521b3b0-4fb2-45b2-90b5-7080e766aafa-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.292240 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx7wc\" (UniqueName: \"kubernetes.io/projected/8521b3b0-4fb2-45b2-90b5-7080e766aafa-kube-api-access-vx7wc\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.295638 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.380066 4973 generic.go:334] "Generic (PLEG): container finished" podID="8521b3b0-4fb2-45b2-90b5-7080e766aafa" containerID="286fe45ab59e91258bf977cd12fd7bb5bb7a5aa840b5a7073a1d20bd5620ba71" exitCode=0 Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.380223 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.382249 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" event={"ID":"8521b3b0-4fb2-45b2-90b5-7080e766aafa","Type":"ContainerDied","Data":"286fe45ab59e91258bf977cd12fd7bb5bb7a5aa840b5a7073a1d20bd5620ba71"} Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.382324 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-mtrbp" event={"ID":"8521b3b0-4fb2-45b2-90b5-7080e766aafa","Type":"ContainerDied","Data":"ba8a643536ada88fe35ad82fa1e371d009cde21292ce1a68f1e4a8f87a22a6a8"} Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.382469 4973 scope.go:117] "RemoveContainer" containerID="286fe45ab59e91258bf977cd12fd7bb5bb7a5aa840b5a7073a1d20bd5620ba71" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.393797 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.403675 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. 
Skipping MountDevice... Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.403978 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.405745 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg"] Mar 20 13:24:54 crc kubenswrapper[4973]: E0320 13:24:54.406050 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8521b3b0-4fb2-45b2-90b5-7080e766aafa" containerName="controller-manager" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.406990 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="8521b3b0-4fb2-45b2-90b5-7080e766aafa" containerName="controller-manager" Mar 20 13:24:54 crc kubenswrapper[4973]: E0320 13:24:54.407086 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a60cb615-f335-45fa-86dd-ddf121e62737" containerName="route-controller-manager" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.407151 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a60cb615-f335-45fa-86dd-ddf121e62737" containerName="route-controller-manager" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.407303 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="8521b3b0-4fb2-45b2-90b5-7080e766aafa" containerName="controller-manager" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.407884 4973 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a60cb615-f335-45fa-86dd-ddf121e62737" containerName="route-controller-manager" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.408305 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.409661 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5"] Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.410173 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.412160 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.412749 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.422913 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg"] Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.426209 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" event={"ID":"17fc0166-9183-4bbd-a091-644b431349e1","Type":"ContainerStarted","Data":"662045aba854d16581bf5fbbe68e1e2948ca1ac92c0cc393dcc51569b60ef053"} Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.426250 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" event={"ID":"17fc0166-9183-4bbd-a091-644b431349e1","Type":"ContainerStarted","Data":"861aab57792732a11acd8ab57c0865311bd9e9814e2dbdc9b169890e84dc9a50"} Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.426897 4973 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.427035 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.427645 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.427922 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.443109 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.460522 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plc2f" event={"ID":"8f429634-2787-4daa-a443-e4ab84f2e6b7","Type":"ContainerDied","Data":"f9fb4a8af7ca9f72f596503f7883fe894d2fee99d6a7d459cea017d98b13cf70"} Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.460547 4973 generic.go:334] "Generic (PLEG): container finished" podID="8f429634-2787-4daa-a443-e4ab84f2e6b7" containerID="f9fb4a8af7ca9f72f596503f7883fe894d2fee99d6a7d459cea017d98b13cf70" exitCode=0 Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.467931 4973 scope.go:117] "RemoveContainer" containerID="286fe45ab59e91258bf977cd12fd7bb5bb7a5aa840b5a7073a1d20bd5620ba71" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.470751 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plc2f" event={"ID":"8f429634-2787-4daa-a443-e4ab84f2e6b7","Type":"ContainerStarted","Data":"2bfbadf6b0db245fff8160e3566710c30f4a4f50fcd5dcdf5a01db3dda0a749b"} Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.476580 
4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-r57q7\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.476744 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ggbs9"] Mar 20 13:24:54 crc kubenswrapper[4973]: E0320 13:24:54.478147 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"286fe45ab59e91258bf977cd12fd7bb5bb7a5aa840b5a7073a1d20bd5620ba71\": container with ID starting with 286fe45ab59e91258bf977cd12fd7bb5bb7a5aa840b5a7073a1d20bd5620ba71 not found: ID does not exist" containerID="286fe45ab59e91258bf977cd12fd7bb5bb7a5aa840b5a7073a1d20bd5620ba71" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.478216 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"286fe45ab59e91258bf977cd12fd7bb5bb7a5aa840b5a7073a1d20bd5620ba71"} err="failed to get container status \"286fe45ab59e91258bf977cd12fd7bb5bb7a5aa840b5a7073a1d20bd5620ba71\": rpc error: code = NotFound desc = could not find container \"286fe45ab59e91258bf977cd12fd7bb5bb7a5aa840b5a7073a1d20bd5620ba71\": container with ID starting with 286fe45ab59e91258bf977cd12fd7bb5bb7a5aa840b5a7073a1d20bd5620ba71 not found: ID does not exist" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.491555 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5"] Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.492228 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" event={"ID":"a60cb615-f335-45fa-86dd-ddf121e62737","Type":"ContainerDied","Data":"e7834d592b7d5b99d9d3af1aabbecdddc83511528d6113dd87f34737e85acf66"} Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.492267 4973 scope.go:117] "RemoveContainer" containerID="5b4441d53cf53f861345b1c4b9573b80e9b281d00ad7ac3728c5446304237526" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.492292 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.494934 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf54d214-858c-4f73-ab36-0fd4ffcaf949-config\") pod \"controller-manager-64dbcdb7b8-h7kd5\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.495893 4973 generic.go:334] "Generic (PLEG): container finished" podID="2115631d-0f02-4cb4-bfee-e18dd87a0462" containerID="0acdc2d07d6b35d8a2476747b85cd85c583270203eb80a7f9d7a3c7f78fa44e5" exitCode=0 Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.495944 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jzd6" event={"ID":"2115631d-0f02-4cb4-bfee-e18dd87a0462","Type":"ContainerDied","Data":"0acdc2d07d6b35d8a2476747b85cd85c583270203eb80a7f9d7a3c7f78fa44e5"} Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.495967 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jzd6" event={"ID":"2115631d-0f02-4cb4-bfee-e18dd87a0462","Type":"ContainerStarted","Data":"703def46dbd3bf420b463ea2358eb7effd3cea44dbd17f56cb4a1cbcf1e74e6a"} Mar 20 13:24:54 crc 
kubenswrapper[4973]: I0320 13:24:54.497026 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.498020 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.500610 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf54d214-858c-4f73-ab36-0fd4ffcaf949-client-ca\") pod \"controller-manager-64dbcdb7b8-h7kd5\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.500647 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e46b0744-2ae6-40e9-a4eb-352c338343f8-config\") pod \"route-controller-manager-5bdb56df7b-r8xhg\" (UID: \"e46b0744-2ae6-40e9-a4eb-352c338343f8\") " pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.500675 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjfrw\" (UniqueName: \"kubernetes.io/projected/cf54d214-858c-4f73-ab36-0fd4ffcaf949-kube-api-access-pjfrw\") pod \"controller-manager-64dbcdb7b8-h7kd5\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.500714 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e46b0744-2ae6-40e9-a4eb-352c338343f8-serving-cert\") pod 
\"route-controller-manager-5bdb56df7b-r8xhg\" (UID: \"e46b0744-2ae6-40e9-a4eb-352c338343f8\") " pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.500735 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e46b0744-2ae6-40e9-a4eb-352c338343f8-client-ca\") pod \"route-controller-manager-5bdb56df7b-r8xhg\" (UID: \"e46b0744-2ae6-40e9-a4eb-352c338343f8\") " pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.500760 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf54d214-858c-4f73-ab36-0fd4ffcaf949-proxy-ca-bundles\") pod \"controller-manager-64dbcdb7b8-h7kd5\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.500783 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gh94h\" (UniqueName: \"kubernetes.io/projected/e46b0744-2ae6-40e9-a4eb-352c338343f8-kube-api-access-gh94h\") pod \"route-controller-manager-5bdb56df7b-r8xhg\" (UID: \"e46b0744-2ae6-40e9-a4eb-352c338343f8\") " pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.500800 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf54d214-858c-4f73-ab36-0fd4ffcaf949-serving-cert\") pod \"controller-manager-64dbcdb7b8-h7kd5\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:24:54 crc 
kubenswrapper[4973]: I0320 13:24:54.501399 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7kszd" event={"ID":"93c5ad90-87bf-4668-9d87-34e676b15783","Type":"ContainerStarted","Data":"2f7b1699693d61b27a4a1e30c7d2ed2f1a655693c6e34c38e5b9c5f9aaf4b300"} Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.501523 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.501975 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.509172 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.512369 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-mpgxt" podStartSLOduration=11.512308497 podStartE2EDuration="11.512308497s" podCreationTimestamp="2026-03-20 13:24:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:54.460843251 +0000 UTC m=+215.204512995" watchObservedRunningTime="2026-03-20 13:24:54.512308497 +0000 UTC m=+215.255978241" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.536947 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mtrbp"] Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.549285 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-mtrbp"] Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.555288 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8"] Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.559903 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-b7jm8"] Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.597089 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7kszd" podStartSLOduration=169.597070774 podStartE2EDuration="2m49.597070774s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:54.59693715 +0000 UTC m=+215.340606894" watchObservedRunningTime="2026-03-20 13:24:54.597070774 +0000 UTC m=+215.340740518" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.603128 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e46b0744-2ae6-40e9-a4eb-352c338343f8-config\") pod \"route-controller-manager-5bdb56df7b-r8xhg\" (UID: \"e46b0744-2ae6-40e9-a4eb-352c338343f8\") " pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.603172 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a07c04c1-a574-445e-9aa2-18f383a84c81-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a07c04c1-a574-445e-9aa2-18f383a84c81\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.603195 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a07c04c1-a574-445e-9aa2-18f383a84c81-kubelet-dir\") pod 
\"revision-pruner-9-crc\" (UID: \"a07c04c1-a574-445e-9aa2-18f383a84c81\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.603218 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjfrw\" (UniqueName: \"kubernetes.io/projected/cf54d214-858c-4f73-ab36-0fd4ffcaf949-kube-api-access-pjfrw\") pod \"controller-manager-64dbcdb7b8-h7kd5\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.603261 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e46b0744-2ae6-40e9-a4eb-352c338343f8-serving-cert\") pod \"route-controller-manager-5bdb56df7b-r8xhg\" (UID: \"e46b0744-2ae6-40e9-a4eb-352c338343f8\") " pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.603280 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e46b0744-2ae6-40e9-a4eb-352c338343f8-client-ca\") pod \"route-controller-manager-5bdb56df7b-r8xhg\" (UID: \"e46b0744-2ae6-40e9-a4eb-352c338343f8\") " pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.603310 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf54d214-858c-4f73-ab36-0fd4ffcaf949-proxy-ca-bundles\") pod \"controller-manager-64dbcdb7b8-h7kd5\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.603332 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-gh94h\" (UniqueName: \"kubernetes.io/projected/e46b0744-2ae6-40e9-a4eb-352c338343f8-kube-api-access-gh94h\") pod \"route-controller-manager-5bdb56df7b-r8xhg\" (UID: \"e46b0744-2ae6-40e9-a4eb-352c338343f8\") " pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.603363 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf54d214-858c-4f73-ab36-0fd4ffcaf949-serving-cert\") pod \"controller-manager-64dbcdb7b8-h7kd5\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.603471 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf54d214-858c-4f73-ab36-0fd4ffcaf949-config\") pod \"controller-manager-64dbcdb7b8-h7kd5\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.603510 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf54d214-858c-4f73-ab36-0fd4ffcaf949-client-ca\") pod \"controller-manager-64dbcdb7b8-h7kd5\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.604272 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e46b0744-2ae6-40e9-a4eb-352c338343f8-config\") pod \"route-controller-manager-5bdb56df7b-r8xhg\" (UID: \"e46b0744-2ae6-40e9-a4eb-352c338343f8\") " pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" Mar 20 13:24:54 crc 
kubenswrapper[4973]: I0320 13:24:54.605549 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf54d214-858c-4f73-ab36-0fd4ffcaf949-config\") pod \"controller-manager-64dbcdb7b8-h7kd5\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.606664 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e46b0744-2ae6-40e9-a4eb-352c338343f8-client-ca\") pod \"route-controller-manager-5bdb56df7b-r8xhg\" (UID: \"e46b0744-2ae6-40e9-a4eb-352c338343f8\") " pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.606897 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf54d214-858c-4f73-ab36-0fd4ffcaf949-proxy-ca-bundles\") pod \"controller-manager-64dbcdb7b8-h7kd5\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.608460 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf54d214-858c-4f73-ab36-0fd4ffcaf949-client-ca\") pod \"controller-manager-64dbcdb7b8-h7kd5\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.615620 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf54d214-858c-4f73-ab36-0fd4ffcaf949-serving-cert\") pod \"controller-manager-64dbcdb7b8-h7kd5\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " 
pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.618248 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e46b0744-2ae6-40e9-a4eb-352c338343f8-serving-cert\") pod \"route-controller-manager-5bdb56df7b-r8xhg\" (UID: \"e46b0744-2ae6-40e9-a4eb-352c338343f8\") " pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.628751 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gh94h\" (UniqueName: \"kubernetes.io/projected/e46b0744-2ae6-40e9-a4eb-352c338343f8-kube-api-access-gh94h\") pod \"route-controller-manager-5bdb56df7b-r8xhg\" (UID: \"e46b0744-2ae6-40e9-a4eb-352c338343f8\") " pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.629364 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjfrw\" (UniqueName: \"kubernetes.io/projected/cf54d214-858c-4f73-ab36-0fd4ffcaf949-kube-api-access-pjfrw\") pod \"controller-manager-64dbcdb7b8-h7kd5\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.705058 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a07c04c1-a574-445e-9aa2-18f383a84c81-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a07c04c1-a574-445e-9aa2-18f383a84c81\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.705577 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/a07c04c1-a574-445e-9aa2-18f383a84c81-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a07c04c1-a574-445e-9aa2-18f383a84c81\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.705859 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a07c04c1-a574-445e-9aa2-18f383a84c81-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"a07c04c1-a574-445e-9aa2-18f383a84c81\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.722583 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a07c04c1-a574-445e-9aa2-18f383a84c81-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"a07c04c1-a574-445e-9aa2-18f383a84c81\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.747369 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.756811 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.765470 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.843946 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.897699 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:24:54 crc kubenswrapper[4973]: [-]has-synced failed: reason withheld Mar 20 13:24:54 crc kubenswrapper[4973]: [+]process-running ok Mar 20 13:24:54 crc kubenswrapper[4973]: healthz check failed Mar 20 13:24:54 crc kubenswrapper[4973]: I0320 13:24:54.898273 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.033215 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c7jw4"] Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.034718 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c7jw4" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.044665 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.056333 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c7jw4"] Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.064086 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.065473 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.076586 4973 patch_prober.go:28] interesting pod/console-f9d7485db-k7krj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.076664 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-k7krj" podUID="de8d912e-7616-42ee-a688-b43d5b85dc44" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.112120 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzghc\" (UniqueName: \"kubernetes.io/projected/0993b0a3-f604-4447-bce2-01636b061230-kube-api-access-mzghc\") pod \"redhat-marketplace-c7jw4\" (UID: \"0993b0a3-f604-4447-bce2-01636b061230\") " pod="openshift-marketplace/redhat-marketplace-c7jw4" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.112180 4973 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0993b0a3-f604-4447-bce2-01636b061230-utilities\") pod \"redhat-marketplace-c7jw4\" (UID: \"0993b0a3-f604-4447-bce2-01636b061230\") " pod="openshift-marketplace/redhat-marketplace-c7jw4" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.112205 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0993b0a3-f604-4447-bce2-01636b061230-catalog-content\") pod \"redhat-marketplace-c7jw4\" (UID: \"0993b0a3-f604-4447-bce2-01636b061230\") " pod="openshift-marketplace/redhat-marketplace-c7jw4" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.112909 4973 patch_prober.go:28] interesting pod/downloads-7954f5f757-xt4hf container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.112945 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xt4hf" podUID="6e84a900-8f09-438e-a365-60d6b9fc835b" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.113247 4973 patch_prober.go:28] interesting pod/downloads-7954f5f757-xt4hf container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" start-of-body= Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.113291 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-xt4hf" podUID="6e84a900-8f09-438e-a365-60d6b9fc835b" 
containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.12:8080/\": dial tcp 10.217.0.12:8080: connect: connection refused" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.212983 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzghc\" (UniqueName: \"kubernetes.io/projected/0993b0a3-f604-4447-bce2-01636b061230-kube-api-access-mzghc\") pod \"redhat-marketplace-c7jw4\" (UID: \"0993b0a3-f604-4447-bce2-01636b061230\") " pod="openshift-marketplace/redhat-marketplace-c7jw4" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.213056 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0993b0a3-f604-4447-bce2-01636b061230-utilities\") pod \"redhat-marketplace-c7jw4\" (UID: \"0993b0a3-f604-4447-bce2-01636b061230\") " pod="openshift-marketplace/redhat-marketplace-c7jw4" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.213083 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0993b0a3-f604-4447-bce2-01636b061230-catalog-content\") pod \"redhat-marketplace-c7jw4\" (UID: \"0993b0a3-f604-4447-bce2-01636b061230\") " pod="openshift-marketplace/redhat-marketplace-c7jw4" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.214191 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0993b0a3-f604-4447-bce2-01636b061230-utilities\") pod \"redhat-marketplace-c7jw4\" (UID: \"0993b0a3-f604-4447-bce2-01636b061230\") " pod="openshift-marketplace/redhat-marketplace-c7jw4" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.219195 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r57q7"] Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.220482 4973 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0993b0a3-f604-4447-bce2-01636b061230-catalog-content\") pod \"redhat-marketplace-c7jw4\" (UID: \"0993b0a3-f604-4447-bce2-01636b061230\") " pod="openshift-marketplace/redhat-marketplace-c7jw4" Mar 20 13:24:55 crc kubenswrapper[4973]: W0320 13:24:55.247936 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4e363de_fd5c_4f76_8943_ae3c56f3765b.slice/crio-65ec8861ba4d159f50d1cd266216b5e762a531b244f8930afaa5447c5954c07a WatchSource:0}: Error finding container 65ec8861ba4d159f50d1cd266216b5e762a531b244f8930afaa5447c5954c07a: Status 404 returned error can't find the container with id 65ec8861ba4d159f50d1cd266216b5e762a531b244f8930afaa5447c5954c07a Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.248441 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.253236 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5"] Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.260166 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzghc\" (UniqueName: \"kubernetes.io/projected/0993b0a3-f604-4447-bce2-01636b061230-kube-api-access-mzghc\") pod \"redhat-marketplace-c7jw4\" (UID: \"0993b0a3-f604-4447-bce2-01636b061230\") " pod="openshift-marketplace/redhat-marketplace-c7jw4" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.264982 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg"] Mar 20 13:24:55 crc kubenswrapper[4973]: W0320 13:24:55.268466 4973 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode46b0744_2ae6_40e9_a4eb_352c338343f8.slice/crio-6d898fb12c2a5416c35563969e48f6a7dfb9ddae4ded0db135929a021aa421da WatchSource:0}: Error finding container 6d898fb12c2a5416c35563969e48f6a7dfb9ddae4ded0db135929a021aa421da: Status 404 returned error can't find the container with id 6d898fb12c2a5416c35563969e48f6a7dfb9ddae4ded0db135929a021aa421da Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.367746 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c7jw4" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.417597 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v6w64"] Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.429934 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6w64" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.454921 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6w64"] Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.513909 4973 generic.go:334] "Generic (PLEG): container finished" podID="ce12ac02-5b58-44a3-a311-8cdd000ce41b" containerID="74110cbff82c888eabcbc295f453d16c580c1ff1fa2fcc580bff9ff44af29e18" exitCode=0 Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.514014 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ggbs9" event={"ID":"ce12ac02-5b58-44a3-a311-8cdd000ce41b","Type":"ContainerDied","Data":"74110cbff82c888eabcbc295f453d16c580c1ff1fa2fcc580bff9ff44af29e18"} Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.514045 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ggbs9" 
event={"ID":"ce12ac02-5b58-44a3-a311-8cdd000ce41b","Type":"ContainerStarted","Data":"83e2a1c677ebcc029e14180a3cd929bc6ce758f2315c324712c208527a996146"} Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.516611 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf96ec01-20ab-4529-ad42-d839540c3d8e-catalog-content\") pod \"redhat-marketplace-v6w64\" (UID: \"cf96ec01-20ab-4529-ad42-d839540c3d8e\") " pod="openshift-marketplace/redhat-marketplace-v6w64" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.516835 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf96ec01-20ab-4529-ad42-d839540c3d8e-utilities\") pod \"redhat-marketplace-v6w64\" (UID: \"cf96ec01-20ab-4529-ad42-d839540c3d8e\") " pod="openshift-marketplace/redhat-marketplace-v6w64" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.517243 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw6t4\" (UniqueName: \"kubernetes.io/projected/cf96ec01-20ab-4529-ad42-d839540c3d8e-kube-api-access-kw6t4\") pod \"redhat-marketplace-v6w64\" (UID: \"cf96ec01-20ab-4529-ad42-d839540c3d8e\") " pod="openshift-marketplace/redhat-marketplace-v6w64" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.517773 4973 generic.go:334] "Generic (PLEG): container finished" podID="b9d6f338-6a7a-4ba2-b1c9-12485bb30937" containerID="a8cf42a9d59625b7d17e2f95f60f398813c0fe69038ad0c286fec8890d8a13d7" exitCode=0 Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.517868 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd" event={"ID":"b9d6f338-6a7a-4ba2-b1c9-12485bb30937","Type":"ContainerDied","Data":"a8cf42a9d59625b7d17e2f95f60f398813c0fe69038ad0c286fec8890d8a13d7"} Mar 20 
13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.525219 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a07c04c1-a574-445e-9aa2-18f383a84c81","Type":"ContainerStarted","Data":"10e29d1f8fcf0f07be6895f6be11c863df6029f2323e4c2dd5000130e7f96af3"} Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.528861 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" event={"ID":"a4e363de-fd5c-4f76-8943-ae3c56f3765b","Type":"ContainerStarted","Data":"65ec8861ba4d159f50d1cd266216b5e762a531b244f8930afaa5447c5954c07a"} Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.530791 4973 generic.go:334] "Generic (PLEG): container finished" podID="6848e4e5-1d17-4bf7-8dee-bbbeddedd07d" containerID="76d8537cdfc81c36d6474778ed43d692326ce6099b1f4fbc1a0adc5c195746d3" exitCode=0 Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.530923 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tsxk4" event={"ID":"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d","Type":"ContainerDied","Data":"76d8537cdfc81c36d6474778ed43d692326ce6099b1f4fbc1a0adc5c195746d3"} Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.530975 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tsxk4" event={"ID":"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d","Type":"ContainerStarted","Data":"61393645fbc553ded40796977f6a6b61c7061c60374e54e0a4a7cf8712191dad"} Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.539234 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" event={"ID":"cf54d214-858c-4f73-ab36-0fd4ffcaf949","Type":"ContainerStarted","Data":"baa574e39055a403234e9494c1cc9e7d7fc370c4dd7493ebf73fb299b4549c07"} Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.542478 4973 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" event={"ID":"e46b0744-2ae6-40e9-a4eb-352c338343f8","Type":"ContainerStarted","Data":"6d898fb12c2a5416c35563969e48f6a7dfb9ddae4ded0db135929a021aa421da"} Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.542506 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.551063 4973 patch_prober.go:28] interesting pod/route-controller-manager-5bdb56df7b-r8xhg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" start-of-body= Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.551121 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" podUID="e46b0744-2ae6-40e9-a4eb-352c338343f8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.607250 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" podStartSLOduration=2.607230676 podStartE2EDuration="2.607230676s" podCreationTimestamp="2026-03-20 13:24:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:55.606373591 +0000 UTC m=+216.350043335" watchObservedRunningTime="2026-03-20 13:24:55.607230676 +0000 UTC m=+216.350900420" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.619325 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf96ec01-20ab-4529-ad42-d839540c3d8e-utilities\") pod \"redhat-marketplace-v6w64\" (UID: \"cf96ec01-20ab-4529-ad42-d839540c3d8e\") " pod="openshift-marketplace/redhat-marketplace-v6w64" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.619485 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw6t4\" (UniqueName: \"kubernetes.io/projected/cf96ec01-20ab-4529-ad42-d839540c3d8e-kube-api-access-kw6t4\") pod \"redhat-marketplace-v6w64\" (UID: \"cf96ec01-20ab-4529-ad42-d839540c3d8e\") " pod="openshift-marketplace/redhat-marketplace-v6w64" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.619581 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf96ec01-20ab-4529-ad42-d839540c3d8e-catalog-content\") pod \"redhat-marketplace-v6w64\" (UID: \"cf96ec01-20ab-4529-ad42-d839540c3d8e\") " pod="openshift-marketplace/redhat-marketplace-v6w64" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.622297 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf96ec01-20ab-4529-ad42-d839540c3d8e-catalog-content\") pod \"redhat-marketplace-v6w64\" (UID: \"cf96ec01-20ab-4529-ad42-d839540c3d8e\") " pod="openshift-marketplace/redhat-marketplace-v6w64" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.622762 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf96ec01-20ab-4529-ad42-d839540c3d8e-utilities\") pod \"redhat-marketplace-v6w64\" (UID: \"cf96ec01-20ab-4529-ad42-d839540c3d8e\") " pod="openshift-marketplace/redhat-marketplace-v6w64" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.661578 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-kw6t4\" (UniqueName: \"kubernetes.io/projected/cf96ec01-20ab-4529-ad42-d839540c3d8e-kube-api-access-kw6t4\") pod \"redhat-marketplace-v6w64\" (UID: \"cf96ec01-20ab-4529-ad42-d839540c3d8e\") " pod="openshift-marketplace/redhat-marketplace-v6w64" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.742356 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c7jw4"] Mar 20 13:24:55 crc kubenswrapper[4973]: W0320 13:24:55.750239 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0993b0a3_f604_4447_bce2_01636b061230.slice/crio-b9cf01bd82dffd5ae33b8f9ca5ce597ada7b90d54b3d4a62d3126a51db83792a WatchSource:0}: Error finding container b9cf01bd82dffd5ae33b8f9ca5ce597ada7b90d54b3d4a62d3126a51db83792a: Status 404 returned error can't find the container with id b9cf01bd82dffd5ae33b8f9ca5ce597ada7b90d54b3d4a62d3126a51db83792a Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.763628 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6w64" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.827947 4973 patch_prober.go:28] interesting pod/apiserver-76f77b778f-k9jzg container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 13:24:55 crc kubenswrapper[4973]: [+]log ok Mar 20 13:24:55 crc kubenswrapper[4973]: [+]etcd ok Mar 20 13:24:55 crc kubenswrapper[4973]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 13:24:55 crc kubenswrapper[4973]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 13:24:55 crc kubenswrapper[4973]: [+]poststarthook/max-in-flight-filter ok Mar 20 13:24:55 crc kubenswrapper[4973]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 13:24:55 crc kubenswrapper[4973]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 20 13:24:55 crc kubenswrapper[4973]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 20 13:24:55 crc kubenswrapper[4973]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 20 13:24:55 crc kubenswrapper[4973]: [+]poststarthook/project.openshift.io-projectcache ok Mar 20 13:24:55 crc kubenswrapper[4973]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 20 13:24:55 crc kubenswrapper[4973]: [+]poststarthook/openshift.io-startinformers ok Mar 20 13:24:55 crc kubenswrapper[4973]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 20 13:24:55 crc kubenswrapper[4973]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 13:24:55 crc kubenswrapper[4973]: livez check failed Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.828054 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" podUID="391ef260-a6ea-4cab-bca3-280435898381" containerName="openshift-apiserver" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.895163 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.903202 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:24:55 crc kubenswrapper[4973]: [-]has-synced failed: reason withheld Mar 20 13:24:55 crc kubenswrapper[4973]: [+]process-running ok Mar 20 13:24:55 crc kubenswrapper[4973]: healthz check failed Mar 20 13:24:55 crc kubenswrapper[4973]: I0320 13:24:55.903295 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.016537 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8521b3b0-4fb2-45b2-90b5-7080e766aafa" path="/var/lib/kubelet/pods/8521b3b0-4fb2-45b2-90b5-7080e766aafa/volumes" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.018723 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.019439 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a60cb615-f335-45fa-86dd-ddf121e62737" path="/var/lib/kubelet/pods/a60cb615-f335-45fa-86dd-ddf121e62737/volumes" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.068817 4973 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-6p25c"] Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.069992 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6p25c" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.076488 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.088271 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6p25c"] Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.133382 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/097e9042-52e2-4a7e-b567-5b97f34242d6-utilities\") pod \"redhat-operators-6p25c\" (UID: \"097e9042-52e2-4a7e-b567-5b97f34242d6\") " pod="openshift-marketplace/redhat-operators-6p25c" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.133426 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/097e9042-52e2-4a7e-b567-5b97f34242d6-catalog-content\") pod \"redhat-operators-6p25c\" (UID: \"097e9042-52e2-4a7e-b567-5b97f34242d6\") " pod="openshift-marketplace/redhat-operators-6p25c" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.133488 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64hqq\" (UniqueName: \"kubernetes.io/projected/097e9042-52e2-4a7e-b567-5b97f34242d6-kube-api-access-64hqq\") pod \"redhat-operators-6p25c\" (UID: \"097e9042-52e2-4a7e-b567-5b97f34242d6\") " pod="openshift-marketplace/redhat-operators-6p25c" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.186964 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-v6w64"] Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.198622 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.238828 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64hqq\" (UniqueName: \"kubernetes.io/projected/097e9042-52e2-4a7e-b567-5b97f34242d6-kube-api-access-64hqq\") pod \"redhat-operators-6p25c\" (UID: \"097e9042-52e2-4a7e-b567-5b97f34242d6\") " pod="openshift-marketplace/redhat-operators-6p25c" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.238884 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/097e9042-52e2-4a7e-b567-5b97f34242d6-utilities\") pod \"redhat-operators-6p25c\" (UID: \"097e9042-52e2-4a7e-b567-5b97f34242d6\") " pod="openshift-marketplace/redhat-operators-6p25c" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.238905 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/097e9042-52e2-4a7e-b567-5b97f34242d6-catalog-content\") pod \"redhat-operators-6p25c\" (UID: \"097e9042-52e2-4a7e-b567-5b97f34242d6\") " pod="openshift-marketplace/redhat-operators-6p25c" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.241240 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/097e9042-52e2-4a7e-b567-5b97f34242d6-utilities\") pod \"redhat-operators-6p25c\" (UID: \"097e9042-52e2-4a7e-b567-5b97f34242d6\") " pod="openshift-marketplace/redhat-operators-6p25c" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.245593 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/097e9042-52e2-4a7e-b567-5b97f34242d6-catalog-content\") pod \"redhat-operators-6p25c\" (UID: \"097e9042-52e2-4a7e-b567-5b97f34242d6\") " pod="openshift-marketplace/redhat-operators-6p25c" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.270497 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64hqq\" (UniqueName: \"kubernetes.io/projected/097e9042-52e2-4a7e-b567-5b97f34242d6-kube-api-access-64hqq\") pod \"redhat-operators-6p25c\" (UID: \"097e9042-52e2-4a7e-b567-5b97f34242d6\") " pod="openshift-marketplace/redhat-operators-6p25c" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.418616 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qw9hm"] Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.441988 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qw9hm" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.419625 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6p25c" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.452129 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qw9hm"] Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.546777 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af380bf0-7c0d-4790-8ae4-19697763a37a-utilities\") pod \"redhat-operators-qw9hm\" (UID: \"af380bf0-7c0d-4790-8ae4-19697763a37a\") " pod="openshift-marketplace/redhat-operators-qw9hm" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.546831 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwtgk\" (UniqueName: \"kubernetes.io/projected/af380bf0-7c0d-4790-8ae4-19697763a37a-kube-api-access-pwtgk\") pod \"redhat-operators-qw9hm\" (UID: \"af380bf0-7c0d-4790-8ae4-19697763a37a\") " pod="openshift-marketplace/redhat-operators-qw9hm" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.546875 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af380bf0-7c0d-4790-8ae4-19697763a37a-catalog-content\") pod \"redhat-operators-qw9hm\" (UID: \"af380bf0-7c0d-4790-8ae4-19697763a37a\") " pod="openshift-marketplace/redhat-operators-qw9hm" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.582615 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" event={"ID":"e46b0744-2ae6-40e9-a4eb-352c338343f8","Type":"ContainerStarted","Data":"cdc75fb73c7ff5a7ccdb622bbe0680814d724e6117544d6c861a7b67966e607d"} Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.590375 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.610873 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" event={"ID":"a4e363de-fd5c-4f76-8943-ae3c56f3765b","Type":"ContainerStarted","Data":"8c0b5b254693bc98c893506550a9f2a364c6013e1f0d8c980544958121375ac9"} Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.611688 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.643995 4973 generic.go:334] "Generic (PLEG): container finished" podID="0993b0a3-f604-4447-bce2-01636b061230" containerID="40ff8a0068a1f16d6b0aadcc93ee6ca4ff61cbe773fe9cca174fa5284a1f3c41" exitCode=0 Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.644119 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c7jw4" event={"ID":"0993b0a3-f604-4447-bce2-01636b061230","Type":"ContainerDied","Data":"40ff8a0068a1f16d6b0aadcc93ee6ca4ff61cbe773fe9cca174fa5284a1f3c41"} Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.644190 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c7jw4" event={"ID":"0993b0a3-f604-4447-bce2-01636b061230","Type":"ContainerStarted","Data":"b9cf01bd82dffd5ae33b8f9ca5ce597ada7b90d54b3d4a62d3126a51db83792a"} Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.645855 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" podStartSLOduration=171.645836967 podStartE2EDuration="2m51.645836967s" podCreationTimestamp="2026-03-20 13:22:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:56.642559753 +0000 UTC 
m=+217.386229527" watchObservedRunningTime="2026-03-20 13:24:56.645836967 +0000 UTC m=+217.389506711" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.649858 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af380bf0-7c0d-4790-8ae4-19697763a37a-catalog-content\") pod \"redhat-operators-qw9hm\" (UID: \"af380bf0-7c0d-4790-8ae4-19697763a37a\") " pod="openshift-marketplace/redhat-operators-qw9hm" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.649928 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af380bf0-7c0d-4790-8ae4-19697763a37a-utilities\") pod \"redhat-operators-qw9hm\" (UID: \"af380bf0-7c0d-4790-8ae4-19697763a37a\") " pod="openshift-marketplace/redhat-operators-qw9hm" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.649981 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwtgk\" (UniqueName: \"kubernetes.io/projected/af380bf0-7c0d-4790-8ae4-19697763a37a-kube-api-access-pwtgk\") pod \"redhat-operators-qw9hm\" (UID: \"af380bf0-7c0d-4790-8ae4-19697763a37a\") " pod="openshift-marketplace/redhat-operators-qw9hm" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.651561 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af380bf0-7c0d-4790-8ae4-19697763a37a-catalog-content\") pod \"redhat-operators-qw9hm\" (UID: \"af380bf0-7c0d-4790-8ae4-19697763a37a\") " pod="openshift-marketplace/redhat-operators-qw9hm" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.651866 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af380bf0-7c0d-4790-8ae4-19697763a37a-utilities\") pod \"redhat-operators-qw9hm\" (UID: \"af380bf0-7c0d-4790-8ae4-19697763a37a\") " 
pod="openshift-marketplace/redhat-operators-qw9hm" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.653387 4973 generic.go:334] "Generic (PLEG): container finished" podID="cf96ec01-20ab-4529-ad42-d839540c3d8e" containerID="b9838194169be997935f92587ef678e9ede3335115a9337b6d2b5b4dfa205c98" exitCode=0 Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.653459 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6w64" event={"ID":"cf96ec01-20ab-4529-ad42-d839540c3d8e","Type":"ContainerDied","Data":"b9838194169be997935f92587ef678e9ede3335115a9337b6d2b5b4dfa205c98"} Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.653489 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6w64" event={"ID":"cf96ec01-20ab-4529-ad42-d839540c3d8e","Type":"ContainerStarted","Data":"cc7012e124f77a10942cbdaaa120915159b84bc9fb54d8844d8874e4f19347b9"} Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.661935 4973 generic.go:334] "Generic (PLEG): container finished" podID="a07c04c1-a574-445e-9aa2-18f383a84c81" containerID="40f423f5ffb6084517f282ae136b3582c81c1923b5cff0e726ffc6a8ddb32086" exitCode=0 Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.662014 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a07c04c1-a574-445e-9aa2-18f383a84c81","Type":"ContainerDied","Data":"40f423f5ffb6084517f282ae136b3582c81c1923b5cff0e726ffc6a8ddb32086"} Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.673255 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwtgk\" (UniqueName: \"kubernetes.io/projected/af380bf0-7c0d-4790-8ae4-19697763a37a-kube-api-access-pwtgk\") pod \"redhat-operators-qw9hm\" (UID: \"af380bf0-7c0d-4790-8ae4-19697763a37a\") " pod="openshift-marketplace/redhat-operators-qw9hm" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.683574 4973 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" event={"ID":"cf54d214-858c-4f73-ab36-0fd4ffcaf949","Type":"ContainerStarted","Data":"eee1369c96ac6e8afaf23c91cfbb594037296badc2c9d22b37816a04b2fb70fb"} Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.683668 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.691772 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.849118 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qw9hm" Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.905273 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:24:56 crc kubenswrapper[4973]: [-]has-synced failed: reason withheld Mar 20 13:24:56 crc kubenswrapper[4973]: [+]process-running ok Mar 20 13:24:56 crc kubenswrapper[4973]: healthz check failed Mar 20 13:24:56 crc kubenswrapper[4973]: I0320 13:24:56.905422 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.002322 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" podStartSLOduration=4.002305008 podStartE2EDuration="4.002305008s" 
podCreationTimestamp="2026-03-20 13:24:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:56.751706874 +0000 UTC m=+217.495376638" watchObservedRunningTime="2026-03-20 13:24:57.002305008 +0000 UTC m=+217.745974752" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.006278 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.008373 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.010402 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.010552 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.021649 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.058287 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e9a4008-f747-46b4-9dd2-72def7c504a9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0e9a4008-f747-46b4-9dd2-72def7c504a9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.058371 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e9a4008-f747-46b4-9dd2-72def7c504a9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0e9a4008-f747-46b4-9dd2-72def7c504a9\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.097905 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.162305 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9d6f338-6a7a-4ba2-b1c9-12485bb30937-config-volume\") pod \"b9d6f338-6a7a-4ba2-b1c9-12485bb30937\" (UID: \"b9d6f338-6a7a-4ba2-b1c9-12485bb30937\") " Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.162420 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9d6f338-6a7a-4ba2-b1c9-12485bb30937-secret-volume\") pod \"b9d6f338-6a7a-4ba2-b1c9-12485bb30937\" (UID: \"b9d6f338-6a7a-4ba2-b1c9-12485bb30937\") " Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.162489 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xn5g\" (UniqueName: \"kubernetes.io/projected/b9d6f338-6a7a-4ba2-b1c9-12485bb30937-kube-api-access-4xn5g\") pod \"b9d6f338-6a7a-4ba2-b1c9-12485bb30937\" (UID: \"b9d6f338-6a7a-4ba2-b1c9-12485bb30937\") " Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.162980 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e9a4008-f747-46b4-9dd2-72def7c504a9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0e9a4008-f747-46b4-9dd2-72def7c504a9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.163063 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e9a4008-f747-46b4-9dd2-72def7c504a9-kube-api-access\") pod 
\"revision-pruner-8-crc\" (UID: \"0e9a4008-f747-46b4-9dd2-72def7c504a9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.163079 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e9a4008-f747-46b4-9dd2-72def7c504a9-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"0e9a4008-f747-46b4-9dd2-72def7c504a9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.163535 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9d6f338-6a7a-4ba2-b1c9-12485bb30937-config-volume" (OuterVolumeSpecName: "config-volume") pod "b9d6f338-6a7a-4ba2-b1c9-12485bb30937" (UID: "b9d6f338-6a7a-4ba2-b1c9-12485bb30937"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.171832 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9d6f338-6a7a-4ba2-b1c9-12485bb30937-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b9d6f338-6a7a-4ba2-b1c9-12485bb30937" (UID: "b9d6f338-6a7a-4ba2-b1c9-12485bb30937"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.185437 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e9a4008-f747-46b4-9dd2-72def7c504a9-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"0e9a4008-f747-46b4-9dd2-72def7c504a9\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.191051 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9d6f338-6a7a-4ba2-b1c9-12485bb30937-kube-api-access-4xn5g" (OuterVolumeSpecName: "kube-api-access-4xn5g") pod "b9d6f338-6a7a-4ba2-b1c9-12485bb30937" (UID: "b9d6f338-6a7a-4ba2-b1c9-12485bb30937"). InnerVolumeSpecName "kube-api-access-4xn5g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.248894 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qw9hm"] Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.264192 4973 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b9d6f338-6a7a-4ba2-b1c9-12485bb30937-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.264225 4973 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b9d6f338-6a7a-4ba2-b1c9-12485bb30937-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.264235 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xn5g\" (UniqueName: \"kubernetes.io/projected/b9d6f338-6a7a-4ba2-b1c9-12485bb30937-kube-api-access-4xn5g\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.325149 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/redhat-operators-6p25c"] Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.386452 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.776905 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd" event={"ID":"b9d6f338-6a7a-4ba2-b1c9-12485bb30937","Type":"ContainerDied","Data":"f3e181a66cf44a5811e4649214668a9610bf0274da40991eba44a30f763d51cb"} Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.777496 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3e181a66cf44a5811e4649214668a9610bf0274da40991eba44a30f763d51cb" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.777482 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.802053 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw9hm" event={"ID":"af380bf0-7c0d-4790-8ae4-19697763a37a","Type":"ContainerStarted","Data":"3d684791e3e11579b746a59f772541ac3a689d302adef5db3402cb85cd0b432c"} Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.819596 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p25c" event={"ID":"097e9042-52e2-4a7e-b567-5b97f34242d6","Type":"ContainerStarted","Data":"f75e40b7ab9e44acb7ae2421d86c232f4943ea4b43a0a500023ca0abec0dcf24"} Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.904147 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:24:57 
crc kubenswrapper[4973]: [-]has-synced failed: reason withheld Mar 20 13:24:57 crc kubenswrapper[4973]: [+]process-running ok Mar 20 13:24:57 crc kubenswrapper[4973]: healthz check failed Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.904231 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:24:57 crc kubenswrapper[4973]: I0320 13:24:57.923688 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 13:24:57 crc kubenswrapper[4973]: W0320 13:24:57.953176 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0e9a4008_f747_46b4_9dd2_72def7c504a9.slice/crio-ecd5a97e81212e068557480400f4c4bdd97e110de9f9d633f671ed4985941701 WatchSource:0}: Error finding container ecd5a97e81212e068557480400f4c4bdd97e110de9f9d633f671ed4985941701: Status 404 returned error can't find the container with id ecd5a97e81212e068557480400f4c4bdd97e110de9f9d633f671ed4985941701 Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.186274 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.282465 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a07c04c1-a574-445e-9aa2-18f383a84c81-kube-api-access\") pod \"a07c04c1-a574-445e-9aa2-18f383a84c81\" (UID: \"a07c04c1-a574-445e-9aa2-18f383a84c81\") " Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.282520 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a07c04c1-a574-445e-9aa2-18f383a84c81-kubelet-dir\") pod \"a07c04c1-a574-445e-9aa2-18f383a84c81\" (UID: \"a07c04c1-a574-445e-9aa2-18f383a84c81\") " Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.282707 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a07c04c1-a574-445e-9aa2-18f383a84c81-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a07c04c1-a574-445e-9aa2-18f383a84c81" (UID: "a07c04c1-a574-445e-9aa2-18f383a84c81"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.282935 4973 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a07c04c1-a574-445e-9aa2-18f383a84c81-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.294910 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a07c04c1-a574-445e-9aa2-18f383a84c81-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a07c04c1-a574-445e-9aa2-18f383a84c81" (UID: "a07c04c1-a574-445e-9aa2-18f383a84c81"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.383991 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a07c04c1-a574-445e-9aa2-18f383a84c81-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.815932 4973 ???:1] "http: TLS handshake error from 192.168.126.11:38238: no serving certificate available for the kubelet" Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.858669 4973 generic.go:334] "Generic (PLEG): container finished" podID="097e9042-52e2-4a7e-b567-5b97f34242d6" containerID="4b456b688e7f33f27332351c95d84b4a52455670e07e3318d97c7b157d2e50b2" exitCode=0 Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.858785 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p25c" event={"ID":"097e9042-52e2-4a7e-b567-5b97f34242d6","Type":"ContainerDied","Data":"4b456b688e7f33f27332351c95d84b4a52455670e07e3318d97c7b157d2e50b2"} Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.866473 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"a07c04c1-a574-445e-9aa2-18f383a84c81","Type":"ContainerDied","Data":"10e29d1f8fcf0f07be6895f6be11c863df6029f2323e4c2dd5000130e7f96af3"} Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.866511 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10e29d1f8fcf0f07be6895f6be11c863df6029f2323e4c2dd5000130e7f96af3" Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.866584 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.872814 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0e9a4008-f747-46b4-9dd2-72def7c504a9","Type":"ContainerStarted","Data":"f2e7d8dc42780bd6533af26adfae458ecbcdf4d7437eb3b2ca62e04093cb37fb"} Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.872878 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0e9a4008-f747-46b4-9dd2-72def7c504a9","Type":"ContainerStarted","Data":"ecd5a97e81212e068557480400f4c4bdd97e110de9f9d633f671ed4985941701"} Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.879224 4973 generic.go:334] "Generic (PLEG): container finished" podID="af380bf0-7c0d-4790-8ae4-19697763a37a" containerID="5570c9afa0e5138df9257fd7b5615cfdab730c2e7fc1587ab05b1901e37f61b3" exitCode=0 Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.880271 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw9hm" event={"ID":"af380bf0-7c0d-4790-8ae4-19697763a37a","Type":"ContainerDied","Data":"5570c9afa0e5138df9257fd7b5615cfdab730c2e7fc1587ab05b1901e37f61b3"} Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.898635 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.898616211 podStartE2EDuration="2.898616211s" podCreationTimestamp="2026-03-20 13:24:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:24:58.897859989 +0000 UTC m=+219.641529753" watchObservedRunningTime="2026-03-20 13:24:58.898616211 +0000 UTC m=+219.642285965" Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.902169 4973 patch_prober.go:28] interesting 
pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:24:58 crc kubenswrapper[4973]: [-]has-synced failed: reason withheld Mar 20 13:24:58 crc kubenswrapper[4973]: [+]process-running ok Mar 20 13:24:58 crc kubenswrapper[4973]: healthz check failed Mar 20 13:24:58 crc kubenswrapper[4973]: I0320 13:24:58.902263 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:24:59 crc kubenswrapper[4973]: I0320 13:24:59.896991 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:24:59 crc kubenswrapper[4973]: [-]has-synced failed: reason withheld Mar 20 13:24:59 crc kubenswrapper[4973]: [+]process-running ok Mar 20 13:24:59 crc kubenswrapper[4973]: healthz check failed Mar 20 13:24:59 crc kubenswrapper[4973]: I0320 13:24:59.897050 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:24:59 crc kubenswrapper[4973]: I0320 13:24:59.922207 4973 generic.go:334] "Generic (PLEG): container finished" podID="0e9a4008-f747-46b4-9dd2-72def7c504a9" containerID="f2e7d8dc42780bd6533af26adfae458ecbcdf4d7437eb3b2ca62e04093cb37fb" exitCode=0 Mar 20 13:24:59 crc kubenswrapper[4973]: I0320 13:24:59.922253 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" 
event={"ID":"0e9a4008-f747-46b4-9dd2-72def7c504a9","Type":"ContainerDied","Data":"f2e7d8dc42780bd6533af26adfae458ecbcdf4d7437eb3b2ca62e04093cb37fb"} Mar 20 13:25:00 crc kubenswrapper[4973]: I0320 13:25:00.793724 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:25:00 crc kubenswrapper[4973]: I0320 13:25:00.803644 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-k9jzg" Mar 20 13:25:00 crc kubenswrapper[4973]: I0320 13:25:00.900204 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:25:00 crc kubenswrapper[4973]: [-]has-synced failed: reason withheld Mar 20 13:25:00 crc kubenswrapper[4973]: [+]process-running ok Mar 20 13:25:00 crc kubenswrapper[4973]: healthz check failed Mar 20 13:25:00 crc kubenswrapper[4973]: I0320 13:25:00.900266 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:25:01 crc kubenswrapper[4973]: I0320 13:25:01.303986 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-6v2ms" Mar 20 13:25:01 crc kubenswrapper[4973]: I0320 13:25:01.398863 4973 ???:1] "http: TLS handshake error from 192.168.126.11:38166: no serving certificate available for the kubelet" Mar 20 13:25:01 crc kubenswrapper[4973]: I0320 13:25:01.895975 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" 
start-of-body=[-]backend-http failed: reason withheld Mar 20 13:25:01 crc kubenswrapper[4973]: [-]has-synced failed: reason withheld Mar 20 13:25:01 crc kubenswrapper[4973]: [+]process-running ok Mar 20 13:25:01 crc kubenswrapper[4973]: healthz check failed Mar 20 13:25:01 crc kubenswrapper[4973]: I0320 13:25:01.896051 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:25:02 crc kubenswrapper[4973]: I0320 13:25:02.895509 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:25:02 crc kubenswrapper[4973]: [-]has-synced failed: reason withheld Mar 20 13:25:02 crc kubenswrapper[4973]: [+]process-running ok Mar 20 13:25:02 crc kubenswrapper[4973]: healthz check failed Mar 20 13:25:02 crc kubenswrapper[4973]: I0320 13:25:02.895907 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:25:03 crc kubenswrapper[4973]: I0320 13:25:03.895785 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 13:25:03 crc kubenswrapper[4973]: [-]has-synced failed: reason withheld Mar 20 13:25:03 crc kubenswrapper[4973]: [+]process-running ok Mar 20 13:25:03 crc kubenswrapper[4973]: healthz check failed Mar 20 13:25:03 crc kubenswrapper[4973]: I0320 13:25:03.895857 4973 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 13:25:04 crc kubenswrapper[4973]: I0320 13:25:04.897048 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 13:25:04 crc kubenswrapper[4973]: I0320 13:25:04.898934 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 13:25:05 crc kubenswrapper[4973]: I0320 13:25:05.065036 4973 patch_prober.go:28] interesting pod/console-f9d7485db-k7krj container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 20 13:25:05 crc kubenswrapper[4973]: I0320 13:25:05.065426 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-k7krj" podUID="de8d912e-7616-42ee-a688-b43d5b85dc44" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 20 13:25:05 crc kubenswrapper[4973]: I0320 13:25:05.132443 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-xt4hf" Mar 20 13:25:09 crc kubenswrapper[4973]: I0320 13:25:09.082841 4973 ???:1] "http: TLS handshake error from 192.168.126.11:38176: no serving certificate available for the kubelet" Mar 20 13:25:11 crc kubenswrapper[4973]: I0320 13:25:11.912841 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg"] Mar 20 13:25:11 crc kubenswrapper[4973]: I0320 13:25:11.913394 4973 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" podUID="e46b0744-2ae6-40e9-a4eb-352c338343f8" containerName="route-controller-manager" containerID="cri-o://cdc75fb73c7ff5a7ccdb622bbe0680814d724e6117544d6c861a7b67966e607d" gracePeriod=30 Mar 20 13:25:11 crc kubenswrapper[4973]: I0320 13:25:11.925027 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5"] Mar 20 13:25:11 crc kubenswrapper[4973]: I0320 13:25:11.925278 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" podUID="cf54d214-858c-4f73-ab36-0fd4ffcaf949" containerName="controller-manager" containerID="cri-o://eee1369c96ac6e8afaf23c91cfbb594037296badc2c9d22b37816a04b2fb70fb" gracePeriod=30 Mar 20 13:25:13 crc kubenswrapper[4973]: I0320 13:25:13.130539 4973 generic.go:334] "Generic (PLEG): container finished" podID="e46b0744-2ae6-40e9-a4eb-352c338343f8" containerID="cdc75fb73c7ff5a7ccdb622bbe0680814d724e6117544d6c861a7b67966e607d" exitCode=0 Mar 20 13:25:13 crc kubenswrapper[4973]: I0320 13:25:13.130618 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" event={"ID":"e46b0744-2ae6-40e9-a4eb-352c338343f8","Type":"ContainerDied","Data":"cdc75fb73c7ff5a7ccdb622bbe0680814d724e6117544d6c861a7b67966e607d"} Mar 20 13:25:13 crc kubenswrapper[4973]: I0320 13:25:13.136712 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" event={"ID":"cf54d214-858c-4f73-ab36-0fd4ffcaf949","Type":"ContainerDied","Data":"eee1369c96ac6e8afaf23c91cfbb594037296badc2c9d22b37816a04b2fb70fb"} Mar 20 13:25:13 crc kubenswrapper[4973]: I0320 13:25:13.136642 4973 generic.go:334] "Generic (PLEG): container finished" podID="cf54d214-858c-4f73-ab36-0fd4ffcaf949" 
containerID="eee1369c96ac6e8afaf23c91cfbb594037296badc2c9d22b37816a04b2fb70fb" exitCode=0 Mar 20 13:25:13 crc kubenswrapper[4973]: I0320 13:25:13.321285 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:25:13 crc kubenswrapper[4973]: I0320 13:25:13.321453 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:25:14 crc kubenswrapper[4973]: I0320 13:25:14.748626 4973 patch_prober.go:28] interesting pod/route-controller-manager-5bdb56df7b-r8xhg container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" start-of-body= Mar 20 13:25:14 crc kubenswrapper[4973]: I0320 13:25:14.748683 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" podUID="e46b0744-2ae6-40e9-a4eb-352c338343f8" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.49:8443/healthz\": dial tcp 10.217.0.49:8443: connect: connection refused" Mar 20 13:25:14 crc kubenswrapper[4973]: I0320 13:25:14.762667 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:25:14 crc kubenswrapper[4973]: I0320 13:25:14.766208 4973 patch_prober.go:28] interesting pod/controller-manager-64dbcdb7b8-h7kd5 
container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" start-of-body= Mar 20 13:25:14 crc kubenswrapper[4973]: I0320 13:25:14.766246 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" podUID="cf54d214-858c-4f73-ab36-0fd4ffcaf949" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": dial tcp 10.217.0.50:8443: connect: connection refused" Mar 20 13:25:15 crc kubenswrapper[4973]: I0320 13:25:15.069327 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:25:15 crc kubenswrapper[4973]: I0320 13:25:15.073221 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:25:20 crc kubenswrapper[4973]: I0320 13:25:20.763969 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:25:20 crc kubenswrapper[4973]: I0320 13:25:20.921007 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e9a4008-f747-46b4-9dd2-72def7c504a9-kubelet-dir\") pod \"0e9a4008-f747-46b4-9dd2-72def7c504a9\" (UID: \"0e9a4008-f747-46b4-9dd2-72def7c504a9\") " Mar 20 13:25:20 crc kubenswrapper[4973]: I0320 13:25:20.921107 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e9a4008-f747-46b4-9dd2-72def7c504a9-kube-api-access\") pod \"0e9a4008-f747-46b4-9dd2-72def7c504a9\" (UID: \"0e9a4008-f747-46b4-9dd2-72def7c504a9\") " Mar 20 13:25:20 crc kubenswrapper[4973]: I0320 13:25:20.921200 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e9a4008-f747-46b4-9dd2-72def7c504a9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0e9a4008-f747-46b4-9dd2-72def7c504a9" (UID: "0e9a4008-f747-46b4-9dd2-72def7c504a9"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:25:20 crc kubenswrapper[4973]: I0320 13:25:20.921447 4973 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e9a4008-f747-46b4-9dd2-72def7c504a9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:20 crc kubenswrapper[4973]: I0320 13:25:20.929512 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e9a4008-f747-46b4-9dd2-72def7c504a9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0e9a4008-f747-46b4-9dd2-72def7c504a9" (UID: "0e9a4008-f747-46b4-9dd2-72def7c504a9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:21 crc kubenswrapper[4973]: I0320 13:25:21.022874 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e9a4008-f747-46b4-9dd2-72def7c504a9-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:21 crc kubenswrapper[4973]: I0320 13:25:21.189322 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"0e9a4008-f747-46b4-9dd2-72def7c504a9","Type":"ContainerDied","Data":"ecd5a97e81212e068557480400f4c4bdd97e110de9f9d633f671ed4985941701"} Mar 20 13:25:21 crc kubenswrapper[4973]: I0320 13:25:21.189612 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ecd5a97e81212e068557480400f4c4bdd97e110de9f9d633f671ed4985941701" Mar 20 13:25:21 crc kubenswrapper[4973]: I0320 13:25:21.189512 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 13:25:22 crc kubenswrapper[4973]: E0320 13:25:22.668221 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 13:25:22 crc kubenswrapper[4973]: E0320 13:25:22.668465 4973 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 13:25:22 crc kubenswrapper[4973]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 13:25:22 crc kubenswrapper[4973]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jct5c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29566884-j8dqd_openshift-infra(66c28cfe-8a0b-459a-bbab-59053fe226b8): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 13:25:22 crc kubenswrapper[4973]: > logger="UnhandledError" Mar 20 13:25:22 crc kubenswrapper[4973]: E0320 13:25:22.669697 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29566884-j8dqd" podUID="66c28cfe-8a0b-459a-bbab-59053fe226b8" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.950908 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.956655 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.959719 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjfrw\" (UniqueName: \"kubernetes.io/projected/cf54d214-858c-4f73-ab36-0fd4ffcaf949-kube-api-access-pjfrw\") pod \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.959815 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e46b0744-2ae6-40e9-a4eb-352c338343f8-client-ca\") pod \"e46b0744-2ae6-40e9-a4eb-352c338343f8\" (UID: \"e46b0744-2ae6-40e9-a4eb-352c338343f8\") " Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.959896 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf54d214-858c-4f73-ab36-0fd4ffcaf949-proxy-ca-bundles\") pod \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.959945 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gh94h\" (UniqueName: \"kubernetes.io/projected/e46b0744-2ae6-40e9-a4eb-352c338343f8-kube-api-access-gh94h\") pod \"e46b0744-2ae6-40e9-a4eb-352c338343f8\" (UID: \"e46b0744-2ae6-40e9-a4eb-352c338343f8\") " Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.959988 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e46b0744-2ae6-40e9-a4eb-352c338343f8-config\") pod \"e46b0744-2ae6-40e9-a4eb-352c338343f8\" (UID: \"e46b0744-2ae6-40e9-a4eb-352c338343f8\") " Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.960050 4973 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf54d214-858c-4f73-ab36-0fd4ffcaf949-client-ca\") pod \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.960076 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf54d214-858c-4f73-ab36-0fd4ffcaf949-config\") pod \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.960115 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e46b0744-2ae6-40e9-a4eb-352c338343f8-serving-cert\") pod \"e46b0744-2ae6-40e9-a4eb-352c338343f8\" (UID: \"e46b0744-2ae6-40e9-a4eb-352c338343f8\") " Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.960167 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf54d214-858c-4f73-ab36-0fd4ffcaf949-serving-cert\") pod \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\" (UID: \"cf54d214-858c-4f73-ab36-0fd4ffcaf949\") " Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.960927 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e46b0744-2ae6-40e9-a4eb-352c338343f8-client-ca" (OuterVolumeSpecName: "client-ca") pod "e46b0744-2ae6-40e9-a4eb-352c338343f8" (UID: "e46b0744-2ae6-40e9-a4eb-352c338343f8"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.961315 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e46b0744-2ae6-40e9-a4eb-352c338343f8-config" (OuterVolumeSpecName: "config") pod "e46b0744-2ae6-40e9-a4eb-352c338343f8" (UID: "e46b0744-2ae6-40e9-a4eb-352c338343f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.964085 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf54d214-858c-4f73-ab36-0fd4ffcaf949-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cf54d214-858c-4f73-ab36-0fd4ffcaf949" (UID: "cf54d214-858c-4f73-ab36-0fd4ffcaf949"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.964094 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf54d214-858c-4f73-ab36-0fd4ffcaf949-client-ca" (OuterVolumeSpecName: "client-ca") pod "cf54d214-858c-4f73-ab36-0fd4ffcaf949" (UID: "cf54d214-858c-4f73-ab36-0fd4ffcaf949"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.967199 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf54d214-858c-4f73-ab36-0fd4ffcaf949-config" (OuterVolumeSpecName: "config") pod "cf54d214-858c-4f73-ab36-0fd4ffcaf949" (UID: "cf54d214-858c-4f73-ab36-0fd4ffcaf949"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.979644 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-58b6b647d5-bbqr8"] Mar 20 13:25:22 crc kubenswrapper[4973]: E0320 13:25:22.979885 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e9a4008-f747-46b4-9dd2-72def7c504a9" containerName="pruner" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.979899 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e9a4008-f747-46b4-9dd2-72def7c504a9" containerName="pruner" Mar 20 13:25:22 crc kubenswrapper[4973]: E0320 13:25:22.979913 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e46b0744-2ae6-40e9-a4eb-352c338343f8" containerName="route-controller-manager" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.979920 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e46b0744-2ae6-40e9-a4eb-352c338343f8" containerName="route-controller-manager" Mar 20 13:25:22 crc kubenswrapper[4973]: E0320 13:25:22.979930 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a07c04c1-a574-445e-9aa2-18f383a84c81" containerName="pruner" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.979937 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a07c04c1-a574-445e-9aa2-18f383a84c81" containerName="pruner" Mar 20 13:25:22 crc kubenswrapper[4973]: E0320 13:25:22.979950 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9d6f338-6a7a-4ba2-b1c9-12485bb30937" containerName="collect-profiles" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.979957 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9d6f338-6a7a-4ba2-b1c9-12485bb30937" containerName="collect-profiles" Mar 20 13:25:22 crc kubenswrapper[4973]: E0320 13:25:22.979966 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf54d214-858c-4f73-ab36-0fd4ffcaf949" 
containerName="controller-manager" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.979973 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf54d214-858c-4f73-ab36-0fd4ffcaf949" containerName="controller-manager" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.980117 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e9a4008-f747-46b4-9dd2-72def7c504a9" containerName="pruner" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.980171 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9d6f338-6a7a-4ba2-b1c9-12485bb30937" containerName="collect-profiles" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.980184 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="a07c04c1-a574-445e-9aa2-18f383a84c81" containerName="pruner" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.980195 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="e46b0744-2ae6-40e9-a4eb-352c338343f8" containerName="route-controller-manager" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.980209 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf54d214-858c-4f73-ab36-0fd4ffcaf949" containerName="controller-manager" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.980584 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" Mar 20 13:25:22 crc kubenswrapper[4973]: I0320 13:25:22.988048 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-58b6b647d5-bbqr8"] Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.020709 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e46b0744-2ae6-40e9-a4eb-352c338343f8-kube-api-access-gh94h" (OuterVolumeSpecName: "kube-api-access-gh94h") pod "e46b0744-2ae6-40e9-a4eb-352c338343f8" (UID: "e46b0744-2ae6-40e9-a4eb-352c338343f8"). InnerVolumeSpecName "kube-api-access-gh94h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.022036 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf54d214-858c-4f73-ab36-0fd4ffcaf949-kube-api-access-pjfrw" (OuterVolumeSpecName: "kube-api-access-pjfrw") pod "cf54d214-858c-4f73-ab36-0fd4ffcaf949" (UID: "cf54d214-858c-4f73-ab36-0fd4ffcaf949"). InnerVolumeSpecName "kube-api-access-pjfrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.023532 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e46b0744-2ae6-40e9-a4eb-352c338343f8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e46b0744-2ae6-40e9-a4eb-352c338343f8" (UID: "e46b0744-2ae6-40e9-a4eb-352c338343f8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.023611 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf54d214-858c-4f73-ab36-0fd4ffcaf949-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cf54d214-858c-4f73-ab36-0fd4ffcaf949" (UID: "cf54d214-858c-4f73-ab36-0fd4ffcaf949"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.061785 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50e38616-afae-4d9d-94da-7eb34be421ba-config\") pod \"controller-manager-58b6b647d5-bbqr8\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.061865 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50e38616-afae-4d9d-94da-7eb34be421ba-serving-cert\") pod \"controller-manager-58b6b647d5-bbqr8\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.061937 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50e38616-afae-4d9d-94da-7eb34be421ba-proxy-ca-bundles\") pod \"controller-manager-58b6b647d5-bbqr8\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.061964 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50e38616-afae-4d9d-94da-7eb34be421ba-client-ca\") pod \"controller-manager-58b6b647d5-bbqr8\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.062017 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-h2dqj\" (UniqueName: \"kubernetes.io/projected/50e38616-afae-4d9d-94da-7eb34be421ba-kube-api-access-h2dqj\") pod \"controller-manager-58b6b647d5-bbqr8\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.062214 4973 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cf54d214-858c-4f73-ab36-0fd4ffcaf949-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.062235 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf54d214-858c-4f73-ab36-0fd4ffcaf949-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.062246 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e46b0744-2ae6-40e9-a4eb-352c338343f8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.062256 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cf54d214-858c-4f73-ab36-0fd4ffcaf949-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.062310 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjfrw\" (UniqueName: \"kubernetes.io/projected/cf54d214-858c-4f73-ab36-0fd4ffcaf949-kube-api-access-pjfrw\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.062322 4973 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e46b0744-2ae6-40e9-a4eb-352c338343f8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.062333 4973 reconciler_common.go:293] "Volume detached for volume 
\"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cf54d214-858c-4f73-ab36-0fd4ffcaf949-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.062391 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gh94h\" (UniqueName: \"kubernetes.io/projected/e46b0744-2ae6-40e9-a4eb-352c338343f8-kube-api-access-gh94h\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.062404 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e46b0744-2ae6-40e9-a4eb-352c338343f8-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.162799 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50e38616-afae-4d9d-94da-7eb34be421ba-config\") pod \"controller-manager-58b6b647d5-bbqr8\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.163077 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50e38616-afae-4d9d-94da-7eb34be421ba-serving-cert\") pod \"controller-manager-58b6b647d5-bbqr8\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.163163 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50e38616-afae-4d9d-94da-7eb34be421ba-proxy-ca-bundles\") pod \"controller-manager-58b6b647d5-bbqr8\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.163234 4973 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50e38616-afae-4d9d-94da-7eb34be421ba-client-ca\") pod \"controller-manager-58b6b647d5-bbqr8\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.163357 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2dqj\" (UniqueName: \"kubernetes.io/projected/50e38616-afae-4d9d-94da-7eb34be421ba-kube-api-access-h2dqj\") pod \"controller-manager-58b6b647d5-bbqr8\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.164552 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50e38616-afae-4d9d-94da-7eb34be421ba-client-ca\") pod \"controller-manager-58b6b647d5-bbqr8\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.165033 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50e38616-afae-4d9d-94da-7eb34be421ba-proxy-ca-bundles\") pod \"controller-manager-58b6b647d5-bbqr8\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.165079 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50e38616-afae-4d9d-94da-7eb34be421ba-config\") pod \"controller-manager-58b6b647d5-bbqr8\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" 
Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.168401 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50e38616-afae-4d9d-94da-7eb34be421ba-serving-cert\") pod \"controller-manager-58b6b647d5-bbqr8\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.179122 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2dqj\" (UniqueName: \"kubernetes.io/projected/50e38616-afae-4d9d-94da-7eb34be421ba-kube-api-access-h2dqj\") pod \"controller-manager-58b6b647d5-bbqr8\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.204183 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.204731 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5" event={"ID":"cf54d214-858c-4f73-ab36-0fd4ffcaf949","Type":"ContainerDied","Data":"baa574e39055a403234e9494c1cc9e7d7fc370c4dd7493ebf73fb299b4549c07"} Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.204889 4973 scope.go:117] "RemoveContainer" containerID="eee1369c96ac6e8afaf23c91cfbb594037296badc2c9d22b37816a04b2fb70fb" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.207646 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" event={"ID":"e46b0744-2ae6-40e9-a4eb-352c338343f8","Type":"ContainerDied","Data":"6d898fb12c2a5416c35563969e48f6a7dfb9ddae4ded0db135929a021aa421da"} Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.207736 4973 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg" Mar 20 13:25:23 crc kubenswrapper[4973]: E0320 13:25:23.209307 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29566884-j8dqd" podUID="66c28cfe-8a0b-459a-bbab-59053fe226b8" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.244126 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5"] Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.247385 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-64dbcdb7b8-h7kd5"] Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.249825 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg"] Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.263081 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5bdb56df7b-r8xhg"] Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.338110 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.957137 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf54d214-858c-4f73-ab36-0fd4ffcaf949" path="/var/lib/kubelet/pods/cf54d214-858c-4f73-ab36-0fd4ffcaf949/volumes" Mar 20 13:25:23 crc kubenswrapper[4973]: I0320 13:25:23.957684 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e46b0744-2ae6-40e9-a4eb-352c338343f8" path="/var/lib/kubelet/pods/e46b0744-2ae6-40e9-a4eb-352c338343f8/volumes" Mar 20 13:25:26 crc kubenswrapper[4973]: I0320 13:25:26.250878 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bw9k5" Mar 20 13:25:27 crc kubenswrapper[4973]: E0320 13:25:27.311707 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 13:25:27 crc kubenswrapper[4973]: E0320 13:25:27.312280 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kw6t4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-v6w64_openshift-marketplace(cf96ec01-20ab-4529-ad42-d839540c3d8e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:25:27 crc kubenswrapper[4973]: E0320 13:25:27.313486 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-v6w64" podUID="cf96ec01-20ab-4529-ad42-d839540c3d8e" Mar 20 13:25:27 crc 
kubenswrapper[4973]: I0320 13:25:27.409271 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25"] Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.419622 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.421030 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94136700-6c7a-4496-97c3-ff8f7514a890-serving-cert\") pod \"route-controller-manager-c578596cd-c2t25\" (UID: \"94136700-6c7a-4496-97c3-ff8f7514a890\") " pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.421086 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94136700-6c7a-4496-97c3-ff8f7514a890-config\") pod \"route-controller-manager-c578596cd-c2t25\" (UID: \"94136700-6c7a-4496-97c3-ff8f7514a890\") " pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.421112 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94136700-6c7a-4496-97c3-ff8f7514a890-client-ca\") pod \"route-controller-manager-c578596cd-c2t25\" (UID: \"94136700-6c7a-4496-97c3-ff8f7514a890\") " pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.421143 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vcx2w\" (UniqueName: 
\"kubernetes.io/projected/94136700-6c7a-4496-97c3-ff8f7514a890-kube-api-access-vcx2w\") pod \"route-controller-manager-c578596cd-c2t25\" (UID: \"94136700-6c7a-4496-97c3-ff8f7514a890\") " pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.423262 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.423564 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.423692 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.423793 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.423915 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.424076 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.430538 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25"] Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.522721 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94136700-6c7a-4496-97c3-ff8f7514a890-serving-cert\") pod \"route-controller-manager-c578596cd-c2t25\" (UID: \"94136700-6c7a-4496-97c3-ff8f7514a890\") " 
pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.522772 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94136700-6c7a-4496-97c3-ff8f7514a890-config\") pod \"route-controller-manager-c578596cd-c2t25\" (UID: \"94136700-6c7a-4496-97c3-ff8f7514a890\") " pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.522788 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94136700-6c7a-4496-97c3-ff8f7514a890-client-ca\") pod \"route-controller-manager-c578596cd-c2t25\" (UID: \"94136700-6c7a-4496-97c3-ff8f7514a890\") " pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.522811 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vcx2w\" (UniqueName: \"kubernetes.io/projected/94136700-6c7a-4496-97c3-ff8f7514a890-kube-api-access-vcx2w\") pod \"route-controller-manager-c578596cd-c2t25\" (UID: \"94136700-6c7a-4496-97c3-ff8f7514a890\") " pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.523940 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94136700-6c7a-4496-97c3-ff8f7514a890-client-ca\") pod \"route-controller-manager-c578596cd-c2t25\" (UID: \"94136700-6c7a-4496-97c3-ff8f7514a890\") " pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.524086 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/94136700-6c7a-4496-97c3-ff8f7514a890-config\") pod \"route-controller-manager-c578596cd-c2t25\" (UID: \"94136700-6c7a-4496-97c3-ff8f7514a890\") " pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.530469 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94136700-6c7a-4496-97c3-ff8f7514a890-serving-cert\") pod \"route-controller-manager-c578596cd-c2t25\" (UID: \"94136700-6c7a-4496-97c3-ff8f7514a890\") " pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.545468 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vcx2w\" (UniqueName: \"kubernetes.io/projected/94136700-6c7a-4496-97c3-ff8f7514a890-kube-api-access-vcx2w\") pod \"route-controller-manager-c578596cd-c2t25\" (UID: \"94136700-6c7a-4496-97c3-ff8f7514a890\") " pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" Mar 20 13:25:27 crc kubenswrapper[4973]: I0320 13:25:27.773043 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" Mar 20 13:25:29 crc kubenswrapper[4973]: I0320 13:25:29.816191 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 13:25:29 crc kubenswrapper[4973]: I0320 13:25:29.817528 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:25:29 crc kubenswrapper[4973]: I0320 13:25:29.818979 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 13:25:29 crc kubenswrapper[4973]: I0320 13:25:29.856822 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 13:25:29 crc kubenswrapper[4973]: I0320 13:25:29.857120 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 13:25:29 crc kubenswrapper[4973]: I0320 13:25:29.865816 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/857bcbbb-a323-4483-a06e-4c93b97dda55-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"857bcbbb-a323-4483-a06e-4c93b97dda55\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:25:29 crc kubenswrapper[4973]: I0320 13:25:29.865879 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/857bcbbb-a323-4483-a06e-4c93b97dda55-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"857bcbbb-a323-4483-a06e-4c93b97dda55\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:25:29 crc kubenswrapper[4973]: I0320 13:25:29.969393 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/857bcbbb-a323-4483-a06e-4c93b97dda55-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"857bcbbb-a323-4483-a06e-4c93b97dda55\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:25:29 crc kubenswrapper[4973]: I0320 13:25:29.969483 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/857bcbbb-a323-4483-a06e-4c93b97dda55-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"857bcbbb-a323-4483-a06e-4c93b97dda55\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:25:29 crc kubenswrapper[4973]: I0320 13:25:29.969587 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/857bcbbb-a323-4483-a06e-4c93b97dda55-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"857bcbbb-a323-4483-a06e-4c93b97dda55\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:25:30 crc kubenswrapper[4973]: I0320 13:25:30.003091 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/857bcbbb-a323-4483-a06e-4c93b97dda55-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"857bcbbb-a323-4483-a06e-4c93b97dda55\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:25:30 crc kubenswrapper[4973]: I0320 13:25:30.186878 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:25:31 crc kubenswrapper[4973]: I0320 13:25:31.082809 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 13:25:31 crc kubenswrapper[4973]: E0320 13:25:31.525474 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-v6w64" podUID="cf96ec01-20ab-4529-ad42-d839540c3d8e" Mar 20 13:25:31 crc kubenswrapper[4973]: E0320 13:25:31.597216 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 13:25:31 crc kubenswrapper[4973]: E0320 13:25:31.597884 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-64hqq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-6p25c_openshift-marketplace(097e9042-52e2-4a7e-b567-5b97f34242d6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:25:31 crc kubenswrapper[4973]: E0320 13:25:31.599037 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-6p25c" podUID="097e9042-52e2-4a7e-b567-5b97f34242d6" Mar 20 13:25:31 crc 
kubenswrapper[4973]: I0320 13:25:31.895826 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-58b6b647d5-bbqr8"] Mar 20 13:25:31 crc kubenswrapper[4973]: I0320 13:25:31.992081 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25"] Mar 20 13:25:33 crc kubenswrapper[4973]: E0320 13:25:33.435696 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-6p25c" podUID="097e9042-52e2-4a7e-b567-5b97f34242d6" Mar 20 13:25:33 crc kubenswrapper[4973]: E0320 13:25:33.502465 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 13:25:33 crc kubenswrapper[4973]: E0320 13:25:33.503622 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-f8sms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-plc2f_openshift-marketplace(8f429634-2787-4daa-a443-e4ab84f2e6b7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:25:33 crc kubenswrapper[4973]: E0320 13:25:33.504976 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-plc2f" podUID="8f429634-2787-4daa-a443-e4ab84f2e6b7" Mar 20 13:25:33 crc 
kubenswrapper[4973]: E0320 13:25:33.531296 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 13:25:33 crc kubenswrapper[4973]: E0320 13:25:33.531827 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pwtgk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-qw9hm_openshift-marketplace(af380bf0-7c0d-4790-8ae4-19697763a37a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:25:33 crc kubenswrapper[4973]: E0320 13:25:33.533207 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-qw9hm" podUID="af380bf0-7c0d-4790-8ae4-19697763a37a" Mar 20 13:25:34 crc kubenswrapper[4973]: I0320 13:25:34.399460 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 13:25:34 crc kubenswrapper[4973]: I0320 13:25:34.400658 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:25:34 crc kubenswrapper[4973]: I0320 13:25:34.412095 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 13:25:34 crc kubenswrapper[4973]: I0320 13:25:34.438853 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48634fee-db9f-4111-abac-74b05132eaa9-kube-api-access\") pod \"installer-9-crc\" (UID: \"48634fee-db9f-4111-abac-74b05132eaa9\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:25:34 crc kubenswrapper[4973]: I0320 13:25:34.438945 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48634fee-db9f-4111-abac-74b05132eaa9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"48634fee-db9f-4111-abac-74b05132eaa9\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:25:34 crc kubenswrapper[4973]: I0320 
13:25:34.438989 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48634fee-db9f-4111-abac-74b05132eaa9-var-lock\") pod \"installer-9-crc\" (UID: \"48634fee-db9f-4111-abac-74b05132eaa9\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:25:34 crc kubenswrapper[4973]: I0320 13:25:34.539701 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48634fee-db9f-4111-abac-74b05132eaa9-var-lock\") pod \"installer-9-crc\" (UID: \"48634fee-db9f-4111-abac-74b05132eaa9\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:25:34 crc kubenswrapper[4973]: I0320 13:25:34.539743 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48634fee-db9f-4111-abac-74b05132eaa9-kube-api-access\") pod \"installer-9-crc\" (UID: \"48634fee-db9f-4111-abac-74b05132eaa9\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:25:34 crc kubenswrapper[4973]: I0320 13:25:34.539800 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48634fee-db9f-4111-abac-74b05132eaa9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"48634fee-db9f-4111-abac-74b05132eaa9\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:25:34 crc kubenswrapper[4973]: I0320 13:25:34.539866 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48634fee-db9f-4111-abac-74b05132eaa9-kubelet-dir\") pod \"installer-9-crc\" (UID: \"48634fee-db9f-4111-abac-74b05132eaa9\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:25:34 crc kubenswrapper[4973]: I0320 13:25:34.539903 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/48634fee-db9f-4111-abac-74b05132eaa9-var-lock\") pod \"installer-9-crc\" (UID: \"48634fee-db9f-4111-abac-74b05132eaa9\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:25:34 crc kubenswrapper[4973]: I0320 13:25:34.558637 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48634fee-db9f-4111-abac-74b05132eaa9-kube-api-access\") pod \"installer-9-crc\" (UID: \"48634fee-db9f-4111-abac-74b05132eaa9\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:25:34 crc kubenswrapper[4973]: I0320 13:25:34.737621 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:25:35 crc kubenswrapper[4973]: E0320 13:25:35.229044 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-plc2f" podUID="8f429634-2787-4daa-a443-e4ab84f2e6b7" Mar 20 13:25:35 crc kubenswrapper[4973]: E0320 13:25:35.229063 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-qw9hm" podUID="af380bf0-7c0d-4790-8ae4-19697763a37a" Mar 20 13:25:35 crc kubenswrapper[4973]: I0320 13:25:35.248075 4973 scope.go:117] "RemoveContainer" containerID="cdc75fb73c7ff5a7ccdb622bbe0680814d724e6117544d6c861a7b67966e607d" Mar 20 13:25:35 crc kubenswrapper[4973]: E0320 13:25:35.327126 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" 
image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 13:25:35 crc kubenswrapper[4973]: E0320 13:25:35.327291 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q72rj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-ggbs9_openshift-marketplace(ce12ac02-5b58-44a3-a311-8cdd000ce41b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 
13:25:35 crc kubenswrapper[4973]: E0320 13:25:35.328606 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-ggbs9" podUID="ce12ac02-5b58-44a3-a311-8cdd000ce41b" Mar 20 13:25:35 crc kubenswrapper[4973]: E0320 13:25:35.329662 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 13:25:35 crc kubenswrapper[4973]: E0320 13:25:35.329779 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5cqsf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-tsxk4_openshift-marketplace(6848e4e5-1d17-4bf7-8dee-bbbeddedd07d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:25:35 crc kubenswrapper[4973]: E0320 13:25:35.332508 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-tsxk4" podUID="6848e4e5-1d17-4bf7-8dee-bbbeddedd07d" Mar 20 13:25:35 crc 
kubenswrapper[4973]: E0320 13:25:35.341160 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 13:25:35 crc kubenswrapper[4973]: E0320 13:25:35.341434 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mzghc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-c7jw4_openshift-marketplace(0993b0a3-f604-4447-bce2-01636b061230): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:25:35 crc kubenswrapper[4973]: E0320 13:25:35.342612 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-c7jw4" podUID="0993b0a3-f604-4447-bce2-01636b061230" Mar 20 13:25:35 crc kubenswrapper[4973]: E0320 13:25:35.376259 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 13:25:35 crc kubenswrapper[4973]: E0320 13:25:35.376740 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-glj8x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-5jzd6_openshift-marketplace(2115631d-0f02-4cb4-bfee-e18dd87a0462): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 13:25:35 crc kubenswrapper[4973]: E0320 13:25:35.377976 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-5jzd6" podUID="2115631d-0f02-4cb4-bfee-e18dd87a0462" Mar 20 13:25:35 crc 
kubenswrapper[4973]: I0320 13:25:35.633203 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 13:25:35 crc kubenswrapper[4973]: W0320 13:25:35.635569 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod48634fee_db9f_4111_abac_74b05132eaa9.slice/crio-f31f0d371f7831c07cef575f03946fb60b6adb68740c2029873ec3d640369f6a WatchSource:0}: Error finding container f31f0d371f7831c07cef575f03946fb60b6adb68740c2029873ec3d640369f6a: Status 404 returned error can't find the container with id f31f0d371f7831c07cef575f03946fb60b6adb68740c2029873ec3d640369f6a Mar 20 13:25:35 crc kubenswrapper[4973]: I0320 13:25:35.720636 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25"] Mar 20 13:25:35 crc kubenswrapper[4973]: W0320 13:25:35.728219 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94136700_6c7a_4496_97c3_ff8f7514a890.slice/crio-a00cfd0b5377761664deff55cd7cfc9f95aa0f82d2134bac06cc75880c5a76d0 WatchSource:0}: Error finding container a00cfd0b5377761664deff55cd7cfc9f95aa0f82d2134bac06cc75880c5a76d0: Status 404 returned error can't find the container with id a00cfd0b5377761664deff55cd7cfc9f95aa0f82d2134bac06cc75880c5a76d0 Mar 20 13:25:35 crc kubenswrapper[4973]: I0320 13:25:35.767820 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 13:25:35 crc kubenswrapper[4973]: I0320 13:25:35.778322 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-58b6b647d5-bbqr8"] Mar 20 13:25:35 crc kubenswrapper[4973]: W0320 13:25:35.780742 4973 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-pod857bcbbb_a323_4483_a06e_4c93b97dda55.slice/crio-f98165f2215d0d6bb45ce0195ce53ceb79afc8fe86e1861ab83a3ddf0298630a WatchSource:0}: Error finding container f98165f2215d0d6bb45ce0195ce53ceb79afc8fe86e1861ab83a3ddf0298630a: Status 404 returned error can't find the container with id f98165f2215d0d6bb45ce0195ce53ceb79afc8fe86e1861ab83a3ddf0298630a Mar 20 13:25:35 crc kubenswrapper[4973]: W0320 13:25:35.784047 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod50e38616_afae_4d9d_94da_7eb34be421ba.slice/crio-d1af5c9f503ac5b526eb05ce40f592a946fea299276090896c68a560cd5918e1 WatchSource:0}: Error finding container d1af5c9f503ac5b526eb05ce40f592a946fea299276090896c68a560cd5918e1: Status 404 returned error can't find the container with id d1af5c9f503ac5b526eb05ce40f592a946fea299276090896c68a560cd5918e1 Mar 20 13:25:36 crc kubenswrapper[4973]: I0320 13:25:36.297934 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" event={"ID":"50e38616-afae-4d9d-94da-7eb34be421ba","Type":"ContainerStarted","Data":"e81d33d68f916745f20ec5645cd9b977548aa3d2e65e10f2de8cb48d77cb1765"} Mar 20 13:25:36 crc kubenswrapper[4973]: I0320 13:25:36.298383 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" event={"ID":"50e38616-afae-4d9d-94da-7eb34be421ba","Type":"ContainerStarted","Data":"d1af5c9f503ac5b526eb05ce40f592a946fea299276090896c68a560cd5918e1"} Mar 20 13:25:36 crc kubenswrapper[4973]: I0320 13:25:36.298403 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" Mar 20 13:25:36 crc kubenswrapper[4973]: I0320 13:25:36.298100 4973 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" podUID="50e38616-afae-4d9d-94da-7eb34be421ba" containerName="controller-manager" containerID="cri-o://e81d33d68f916745f20ec5645cd9b977548aa3d2e65e10f2de8cb48d77cb1765" gracePeriod=30 Mar 20 13:25:36 crc kubenswrapper[4973]: I0320 13:25:36.310237 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"857bcbbb-a323-4483-a06e-4c93b97dda55","Type":"ContainerStarted","Data":"7e0fdffc8b1bd85a0dce664e994121f587ab4ab29cc921a171536377b95c36be"} Mar 20 13:25:36 crc kubenswrapper[4973]: I0320 13:25:36.310313 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"857bcbbb-a323-4483-a06e-4c93b97dda55","Type":"ContainerStarted","Data":"f98165f2215d0d6bb45ce0195ce53ceb79afc8fe86e1861ab83a3ddf0298630a"} Mar 20 13:25:36 crc kubenswrapper[4973]: I0320 13:25:36.312692 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" event={"ID":"94136700-6c7a-4496-97c3-ff8f7514a890","Type":"ContainerStarted","Data":"ea59a1a2db0fc49adf8afd98df978a1bb5a16973a23a49eeb15c81d979142b22"} Mar 20 13:25:36 crc kubenswrapper[4973]: I0320 13:25:36.312749 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" event={"ID":"94136700-6c7a-4496-97c3-ff8f7514a890","Type":"ContainerStarted","Data":"a00cfd0b5377761664deff55cd7cfc9f95aa0f82d2134bac06cc75880c5a76d0"} Mar 20 13:25:36 crc kubenswrapper[4973]: I0320 13:25:36.312743 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" podUID="94136700-6c7a-4496-97c3-ff8f7514a890" containerName="route-controller-manager" containerID="cri-o://ea59a1a2db0fc49adf8afd98df978a1bb5a16973a23a49eeb15c81d979142b22" 
gracePeriod=30 Mar 20 13:25:36 crc kubenswrapper[4973]: I0320 13:25:36.313099 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" Mar 20 13:25:36 crc kubenswrapper[4973]: I0320 13:25:36.315294 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"48634fee-db9f-4111-abac-74b05132eaa9","Type":"ContainerStarted","Data":"4b272b7dc2a5fe12dbeca86072cdef8eb91b88b9fa5d8ea961223e81c1498768"} Mar 20 13:25:36 crc kubenswrapper[4973]: I0320 13:25:36.315390 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"48634fee-db9f-4111-abac-74b05132eaa9","Type":"ContainerStarted","Data":"f31f0d371f7831c07cef575f03946fb60b6adb68740c2029873ec3d640369f6a"} Mar 20 13:25:36 crc kubenswrapper[4973]: E0320 13:25:36.318285 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-5jzd6" podUID="2115631d-0f02-4cb4-bfee-e18dd87a0462" Mar 20 13:25:36 crc kubenswrapper[4973]: E0320 13:25:36.318363 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-tsxk4" podUID="6848e4e5-1d17-4bf7-8dee-bbbeddedd07d" Mar 20 13:25:36 crc kubenswrapper[4973]: E0320 13:25:36.318380 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-ggbs9" 
podUID="ce12ac02-5b58-44a3-a311-8cdd000ce41b" Mar 20 13:25:36 crc kubenswrapper[4973]: E0320 13:25:36.318432 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-c7jw4" podUID="0993b0a3-f604-4447-bce2-01636b061230" Mar 20 13:25:36 crc kubenswrapper[4973]: I0320 13:25:36.319591 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" Mar 20 13:25:36 crc kubenswrapper[4973]: I0320 13:25:36.322586 4973 patch_prober.go:28] interesting pod/controller-manager-58b6b647d5-bbqr8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": EOF" start-of-body= Mar 20 13:25:36 crc kubenswrapper[4973]: I0320 13:25:36.322649 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" podUID="50e38616-afae-4d9d-94da-7eb34be421ba" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": EOF" Mar 20 13:25:36 crc kubenswrapper[4973]: I0320 13:25:36.332155 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" podStartSLOduration=25.332124776 podStartE2EDuration="25.332124776s" podCreationTimestamp="2026-03-20 13:25:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:36.327893404 +0000 UTC m=+257.071563148" watchObservedRunningTime="2026-03-20 13:25:36.332124776 +0000 UTC m=+257.075794520" Mar 20 13:25:36 crc kubenswrapper[4973]: I0320 13:25:36.405556 4973 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=7.4055296649999995 podStartE2EDuration="7.405529665s" podCreationTimestamp="2026-03-20 13:25:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:36.380898844 +0000 UTC m=+257.124568588" watchObservedRunningTime="2026-03-20 13:25:36.405529665 +0000 UTC m=+257.149199399" Mar 20 13:25:36 crc kubenswrapper[4973]: I0320 13:25:36.428314 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" podStartSLOduration=25.428290892 podStartE2EDuration="25.428290892s" podCreationTimestamp="2026-03-20 13:25:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:36.423165254 +0000 UTC m=+257.166835018" watchObservedRunningTime="2026-03-20 13:25:36.428290892 +0000 UTC m=+257.171960636" Mar 20 13:25:36 crc kubenswrapper[4973]: I0320 13:25:36.451611 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.451594445 podStartE2EDuration="2.451594445s" podCreationTimestamp="2026-03-20 13:25:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:36.449957498 +0000 UTC m=+257.193627242" watchObservedRunningTime="2026-03-20 13:25:36.451594445 +0000 UTC m=+257.195264189" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.012604 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.042253 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct"] Mar 20 13:25:37 crc kubenswrapper[4973]: E0320 13:25:37.042561 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94136700-6c7a-4496-97c3-ff8f7514a890" containerName="route-controller-manager" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.042591 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="94136700-6c7a-4496-97c3-ff8f7514a890" containerName="route-controller-manager" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.042758 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="94136700-6c7a-4496-97c3-ff8f7514a890" containerName="route-controller-manager" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.043264 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.054122 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct"] Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.078062 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94136700-6c7a-4496-97c3-ff8f7514a890-config\") pod \"94136700-6c7a-4496-97c3-ff8f7514a890\" (UID: \"94136700-6c7a-4496-97c3-ff8f7514a890\") " Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.078114 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/94136700-6c7a-4496-97c3-ff8f7514a890-serving-cert\") pod \"94136700-6c7a-4496-97c3-ff8f7514a890\" (UID: \"94136700-6c7a-4496-97c3-ff8f7514a890\") " Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.078272 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94136700-6c7a-4496-97c3-ff8f7514a890-client-ca\") pod \"94136700-6c7a-4496-97c3-ff8f7514a890\" (UID: \"94136700-6c7a-4496-97c3-ff8f7514a890\") " Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.078365 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vcx2w\" (UniqueName: \"kubernetes.io/projected/94136700-6c7a-4496-97c3-ff8f7514a890-kube-api-access-vcx2w\") pod \"94136700-6c7a-4496-97c3-ff8f7514a890\" (UID: \"94136700-6c7a-4496-97c3-ff8f7514a890\") " Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.078564 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqbdh\" (UniqueName: \"kubernetes.io/projected/4f14d082-80d9-4d31-8863-b2580ab8b571-kube-api-access-lqbdh\") pod 
\"route-controller-manager-76f7f8d98-n79ct\" (UID: \"4f14d082-80d9-4d31-8863-b2580ab8b571\") " pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.078790 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f14d082-80d9-4d31-8863-b2580ab8b571-config\") pod \"route-controller-manager-76f7f8d98-n79ct\" (UID: \"4f14d082-80d9-4d31-8863-b2580ab8b571\") " pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.078831 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f14d082-80d9-4d31-8863-b2580ab8b571-serving-cert\") pod \"route-controller-manager-76f7f8d98-n79ct\" (UID: \"4f14d082-80d9-4d31-8863-b2580ab8b571\") " pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.078868 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f14d082-80d9-4d31-8863-b2580ab8b571-client-ca\") pod \"route-controller-manager-76f7f8d98-n79ct\" (UID: \"4f14d082-80d9-4d31-8863-b2580ab8b571\") " pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.080787 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94136700-6c7a-4496-97c3-ff8f7514a890-client-ca" (OuterVolumeSpecName: "client-ca") pod "94136700-6c7a-4496-97c3-ff8f7514a890" (UID: "94136700-6c7a-4496-97c3-ff8f7514a890"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.081059 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94136700-6c7a-4496-97c3-ff8f7514a890-config" (OuterVolumeSpecName: "config") pod "94136700-6c7a-4496-97c3-ff8f7514a890" (UID: "94136700-6c7a-4496-97c3-ff8f7514a890"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.086715 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/94136700-6c7a-4496-97c3-ff8f7514a890-kube-api-access-vcx2w" (OuterVolumeSpecName: "kube-api-access-vcx2w") pod "94136700-6c7a-4496-97c3-ff8f7514a890" (UID: "94136700-6c7a-4496-97c3-ff8f7514a890"). InnerVolumeSpecName "kube-api-access-vcx2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.089460 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/94136700-6c7a-4496-97c3-ff8f7514a890-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "94136700-6c7a-4496-97c3-ff8f7514a890" (UID: "94136700-6c7a-4496-97c3-ff8f7514a890"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.154605 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.180200 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50e38616-afae-4d9d-94da-7eb34be421ba-config\") pod \"50e38616-afae-4d9d-94da-7eb34be421ba\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.180267 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50e38616-afae-4d9d-94da-7eb34be421ba-serving-cert\") pod \"50e38616-afae-4d9d-94da-7eb34be421ba\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.180302 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50e38616-afae-4d9d-94da-7eb34be421ba-proxy-ca-bundles\") pod \"50e38616-afae-4d9d-94da-7eb34be421ba\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.180321 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50e38616-afae-4d9d-94da-7eb34be421ba-client-ca\") pod \"50e38616-afae-4d9d-94da-7eb34be421ba\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.180370 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2dqj\" (UniqueName: \"kubernetes.io/projected/50e38616-afae-4d9d-94da-7eb34be421ba-kube-api-access-h2dqj\") pod \"50e38616-afae-4d9d-94da-7eb34be421ba\" (UID: \"50e38616-afae-4d9d-94da-7eb34be421ba\") " Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.180535 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lqbdh\" (UniqueName: \"kubernetes.io/projected/4f14d082-80d9-4d31-8863-b2580ab8b571-kube-api-access-lqbdh\") pod \"route-controller-manager-76f7f8d98-n79ct\" (UID: \"4f14d082-80d9-4d31-8863-b2580ab8b571\") " pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.180575 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f14d082-80d9-4d31-8863-b2580ab8b571-config\") pod \"route-controller-manager-76f7f8d98-n79ct\" (UID: \"4f14d082-80d9-4d31-8863-b2580ab8b571\") " pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.180607 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f14d082-80d9-4d31-8863-b2580ab8b571-serving-cert\") pod \"route-controller-manager-76f7f8d98-n79ct\" (UID: \"4f14d082-80d9-4d31-8863-b2580ab8b571\") " pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.180631 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f14d082-80d9-4d31-8863-b2580ab8b571-client-ca\") pod \"route-controller-manager-76f7f8d98-n79ct\" (UID: \"4f14d082-80d9-4d31-8863-b2580ab8b571\") " pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.180677 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94136700-6c7a-4496-97c3-ff8f7514a890-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.180687 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/94136700-6c7a-4496-97c3-ff8f7514a890-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.180696 4973 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/94136700-6c7a-4496-97c3-ff8f7514a890-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.180705 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vcx2w\" (UniqueName: \"kubernetes.io/projected/94136700-6c7a-4496-97c3-ff8f7514a890-kube-api-access-vcx2w\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.182060 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50e38616-afae-4d9d-94da-7eb34be421ba-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "50e38616-afae-4d9d-94da-7eb34be421ba" (UID: "50e38616-afae-4d9d-94da-7eb34be421ba"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.182099 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50e38616-afae-4d9d-94da-7eb34be421ba-config" (OuterVolumeSpecName: "config") pod "50e38616-afae-4d9d-94da-7eb34be421ba" (UID: "50e38616-afae-4d9d-94da-7eb34be421ba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.182577 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/50e38616-afae-4d9d-94da-7eb34be421ba-client-ca" (OuterVolumeSpecName: "client-ca") pod "50e38616-afae-4d9d-94da-7eb34be421ba" (UID: "50e38616-afae-4d9d-94da-7eb34be421ba"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.182643 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f14d082-80d9-4d31-8863-b2580ab8b571-config\") pod \"route-controller-manager-76f7f8d98-n79ct\" (UID: \"4f14d082-80d9-4d31-8863-b2580ab8b571\") " pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.182934 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f14d082-80d9-4d31-8863-b2580ab8b571-client-ca\") pod \"route-controller-manager-76f7f8d98-n79ct\" (UID: \"4f14d082-80d9-4d31-8863-b2580ab8b571\") " pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.184688 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/50e38616-afae-4d9d-94da-7eb34be421ba-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "50e38616-afae-4d9d-94da-7eb34be421ba" (UID: "50e38616-afae-4d9d-94da-7eb34be421ba"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.185077 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50e38616-afae-4d9d-94da-7eb34be421ba-kube-api-access-h2dqj" (OuterVolumeSpecName: "kube-api-access-h2dqj") pod "50e38616-afae-4d9d-94da-7eb34be421ba" (UID: "50e38616-afae-4d9d-94da-7eb34be421ba"). InnerVolumeSpecName "kube-api-access-h2dqj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.185938 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f14d082-80d9-4d31-8863-b2580ab8b571-serving-cert\") pod \"route-controller-manager-76f7f8d98-n79ct\" (UID: \"4f14d082-80d9-4d31-8863-b2580ab8b571\") " pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.196989 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqbdh\" (UniqueName: \"kubernetes.io/projected/4f14d082-80d9-4d31-8863-b2580ab8b571-kube-api-access-lqbdh\") pod \"route-controller-manager-76f7f8d98-n79ct\" (UID: \"4f14d082-80d9-4d31-8863-b2580ab8b571\") " pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.282821 4973 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/50e38616-afae-4d9d-94da-7eb34be421ba-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.282882 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2dqj\" (UniqueName: \"kubernetes.io/projected/50e38616-afae-4d9d-94da-7eb34be421ba-kube-api-access-h2dqj\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.282907 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/50e38616-afae-4d9d-94da-7eb34be421ba-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.282924 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/50e38616-afae-4d9d-94da-7eb34be421ba-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:37 crc 
kubenswrapper[4973]: I0320 13:25:37.282942 4973 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/50e38616-afae-4d9d-94da-7eb34be421ba-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.320992 4973 generic.go:334] "Generic (PLEG): container finished" podID="50e38616-afae-4d9d-94da-7eb34be421ba" containerID="e81d33d68f916745f20ec5645cd9b977548aa3d2e65e10f2de8cb48d77cb1765" exitCode=0 Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.321051 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" event={"ID":"50e38616-afae-4d9d-94da-7eb34be421ba","Type":"ContainerDied","Data":"e81d33d68f916745f20ec5645cd9b977548aa3d2e65e10f2de8cb48d77cb1765"} Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.321076 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" event={"ID":"50e38616-afae-4d9d-94da-7eb34be421ba","Type":"ContainerDied","Data":"d1af5c9f503ac5b526eb05ce40f592a946fea299276090896c68a560cd5918e1"} Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.321093 4973 scope.go:117] "RemoveContainer" containerID="e81d33d68f916745f20ec5645cd9b977548aa3d2e65e10f2de8cb48d77cb1765" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.321190 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-58b6b647d5-bbqr8" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.328646 4973 generic.go:334] "Generic (PLEG): container finished" podID="857bcbbb-a323-4483-a06e-4c93b97dda55" containerID="7e0fdffc8b1bd85a0dce664e994121f587ab4ab29cc921a171536377b95c36be" exitCode=0 Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.328728 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"857bcbbb-a323-4483-a06e-4c93b97dda55","Type":"ContainerDied","Data":"7e0fdffc8b1bd85a0dce664e994121f587ab4ab29cc921a171536377b95c36be"} Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.349537 4973 generic.go:334] "Generic (PLEG): container finished" podID="94136700-6c7a-4496-97c3-ff8f7514a890" containerID="ea59a1a2db0fc49adf8afd98df978a1bb5a16973a23a49eeb15c81d979142b22" exitCode=0 Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.349579 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.349611 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" event={"ID":"94136700-6c7a-4496-97c3-ff8f7514a890","Type":"ContainerDied","Data":"ea59a1a2db0fc49adf8afd98df978a1bb5a16973a23a49eeb15c81d979142b22"} Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.349691 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25" event={"ID":"94136700-6c7a-4496-97c3-ff8f7514a890","Type":"ContainerDied","Data":"a00cfd0b5377761664deff55cd7cfc9f95aa0f82d2134bac06cc75880c5a76d0"} Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.357562 4973 scope.go:117] "RemoveContainer" containerID="e81d33d68f916745f20ec5645cd9b977548aa3d2e65e10f2de8cb48d77cb1765" Mar 20 13:25:37 crc kubenswrapper[4973]: E0320 13:25:37.358374 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e81d33d68f916745f20ec5645cd9b977548aa3d2e65e10f2de8cb48d77cb1765\": container with ID starting with e81d33d68f916745f20ec5645cd9b977548aa3d2e65e10f2de8cb48d77cb1765 not found: ID does not exist" containerID="e81d33d68f916745f20ec5645cd9b977548aa3d2e65e10f2de8cb48d77cb1765" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.358420 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e81d33d68f916745f20ec5645cd9b977548aa3d2e65e10f2de8cb48d77cb1765"} err="failed to get container status \"e81d33d68f916745f20ec5645cd9b977548aa3d2e65e10f2de8cb48d77cb1765\": rpc error: code = NotFound desc = could not find container \"e81d33d68f916745f20ec5645cd9b977548aa3d2e65e10f2de8cb48d77cb1765\": container with ID starting with 
e81d33d68f916745f20ec5645cd9b977548aa3d2e65e10f2de8cb48d77cb1765 not found: ID does not exist" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.358450 4973 scope.go:117] "RemoveContainer" containerID="ea59a1a2db0fc49adf8afd98df978a1bb5a16973a23a49eeb15c81d979142b22" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.359994 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-58b6b647d5-bbqr8"] Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.361692 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.363865 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-58b6b647d5-bbqr8"] Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.398129 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25"] Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.402513 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c578596cd-c2t25"] Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.404361 4973 scope.go:117] "RemoveContainer" containerID="ea59a1a2db0fc49adf8afd98df978a1bb5a16973a23a49eeb15c81d979142b22" Mar 20 13:25:37 crc kubenswrapper[4973]: E0320 13:25:37.408505 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea59a1a2db0fc49adf8afd98df978a1bb5a16973a23a49eeb15c81d979142b22\": container with ID starting with ea59a1a2db0fc49adf8afd98df978a1bb5a16973a23a49eeb15c81d979142b22 not found: ID does not exist" containerID="ea59a1a2db0fc49adf8afd98df978a1bb5a16973a23a49eeb15c81d979142b22" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.408553 4973 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea59a1a2db0fc49adf8afd98df978a1bb5a16973a23a49eeb15c81d979142b22"} err="failed to get container status \"ea59a1a2db0fc49adf8afd98df978a1bb5a16973a23a49eeb15c81d979142b22\": rpc error: code = NotFound desc = could not find container \"ea59a1a2db0fc49adf8afd98df978a1bb5a16973a23a49eeb15c81d979142b22\": container with ID starting with ea59a1a2db0fc49adf8afd98df978a1bb5a16973a23a49eeb15c81d979142b22 not found: ID does not exist" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.572643 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct"] Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.956622 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50e38616-afae-4d9d-94da-7eb34be421ba" path="/var/lib/kubelet/pods/50e38616-afae-4d9d-94da-7eb34be421ba/volumes" Mar 20 13:25:37 crc kubenswrapper[4973]: I0320 13:25:37.957360 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94136700-6c7a-4496-97c3-ff8f7514a890" path="/var/lib/kubelet/pods/94136700-6c7a-4496-97c3-ff8f7514a890/volumes" Mar 20 13:25:38 crc kubenswrapper[4973]: I0320 13:25:38.357595 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" event={"ID":"4f14d082-80d9-4d31-8863-b2580ab8b571","Type":"ContainerStarted","Data":"f3cc0457bf3aef76fe2b150e87362e3d8fe955752a8c2bdcd8ab3c410d67ca61"} Mar 20 13:25:38 crc kubenswrapper[4973]: I0320 13:25:38.357640 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" event={"ID":"4f14d082-80d9-4d31-8863-b2580ab8b571","Type":"ContainerStarted","Data":"0b9ffebe2351713659e791dbd2a0a79b88043bc3b4fa48d89f3f259f5733948d"} Mar 20 13:25:38 crc kubenswrapper[4973]: I0320 13:25:38.357806 4973 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" Mar 20 13:25:38 crc kubenswrapper[4973]: I0320 13:25:38.363575 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" Mar 20 13:25:38 crc kubenswrapper[4973]: I0320 13:25:38.382418 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" podStartSLOduration=6.382398434 podStartE2EDuration="6.382398434s" podCreationTimestamp="2026-03-20 13:25:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:38.37912759 +0000 UTC m=+259.122797334" watchObservedRunningTime="2026-03-20 13:25:38.382398434 +0000 UTC m=+259.126068188" Mar 20 13:25:38 crc kubenswrapper[4973]: I0320 13:25:38.607829 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:25:38 crc kubenswrapper[4973]: I0320 13:25:38.707681 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/857bcbbb-a323-4483-a06e-4c93b97dda55-kube-api-access\") pod \"857bcbbb-a323-4483-a06e-4c93b97dda55\" (UID: \"857bcbbb-a323-4483-a06e-4c93b97dda55\") " Mar 20 13:25:38 crc kubenswrapper[4973]: I0320 13:25:38.707742 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/857bcbbb-a323-4483-a06e-4c93b97dda55-kubelet-dir\") pod \"857bcbbb-a323-4483-a06e-4c93b97dda55\" (UID: \"857bcbbb-a323-4483-a06e-4c93b97dda55\") " Mar 20 13:25:38 crc kubenswrapper[4973]: I0320 13:25:38.707960 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/857bcbbb-a323-4483-a06e-4c93b97dda55-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "857bcbbb-a323-4483-a06e-4c93b97dda55" (UID: "857bcbbb-a323-4483-a06e-4c93b97dda55"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:25:38 crc kubenswrapper[4973]: I0320 13:25:38.714346 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/857bcbbb-a323-4483-a06e-4c93b97dda55-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "857bcbbb-a323-4483-a06e-4c93b97dda55" (UID: "857bcbbb-a323-4483-a06e-4c93b97dda55"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:38 crc kubenswrapper[4973]: I0320 13:25:38.809496 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/857bcbbb-a323-4483-a06e-4c93b97dda55-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:38 crc kubenswrapper[4973]: I0320 13:25:38.809546 4973 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/857bcbbb-a323-4483-a06e-4c93b97dda55-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.370108 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.370094 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"857bcbbb-a323-4483-a06e-4c93b97dda55","Type":"ContainerDied","Data":"f98165f2215d0d6bb45ce0195ce53ceb79afc8fe86e1861ab83a3ddf0298630a"} Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.370796 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f98165f2215d0d6bb45ce0195ce53ceb79afc8fe86e1861ab83a3ddf0298630a" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.416114 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6db768d985-jfnt9"] Mar 20 13:25:39 crc kubenswrapper[4973]: E0320 13:25:39.416362 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="50e38616-afae-4d9d-94da-7eb34be421ba" containerName="controller-manager" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.416374 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="50e38616-afae-4d9d-94da-7eb34be421ba" containerName="controller-manager" Mar 20 13:25:39 crc kubenswrapper[4973]: E0320 13:25:39.416388 4973 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="857bcbbb-a323-4483-a06e-4c93b97dda55" containerName="pruner" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.416394 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="857bcbbb-a323-4483-a06e-4c93b97dda55" containerName="pruner" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.416480 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="50e38616-afae-4d9d-94da-7eb34be421ba" containerName="controller-manager" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.416496 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="857bcbbb-a323-4483-a06e-4c93b97dda55" containerName="pruner" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.418167 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.420663 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.420869 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.421199 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.421506 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.421633 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.421683 4973 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.430801 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.431863 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6db768d985-jfnt9"] Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.517980 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e311f271-9488-44ec-98af-070ae0ef1651-config\") pod \"controller-manager-6db768d985-jfnt9\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") " pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.518027 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbnzr\" (UniqueName: \"kubernetes.io/projected/e311f271-9488-44ec-98af-070ae0ef1651-kube-api-access-gbnzr\") pod \"controller-manager-6db768d985-jfnt9\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") " pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.518183 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e311f271-9488-44ec-98af-070ae0ef1651-client-ca\") pod \"controller-manager-6db768d985-jfnt9\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") " pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.518239 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/e311f271-9488-44ec-98af-070ae0ef1651-proxy-ca-bundles\") pod \"controller-manager-6db768d985-jfnt9\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") " pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.518282 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e311f271-9488-44ec-98af-070ae0ef1651-serving-cert\") pod \"controller-manager-6db768d985-jfnt9\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") " pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.619194 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e311f271-9488-44ec-98af-070ae0ef1651-client-ca\") pod \"controller-manager-6db768d985-jfnt9\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") " pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.619468 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e311f271-9488-44ec-98af-070ae0ef1651-proxy-ca-bundles\") pod \"controller-manager-6db768d985-jfnt9\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") " pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.619501 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e311f271-9488-44ec-98af-070ae0ef1651-serving-cert\") pod \"controller-manager-6db768d985-jfnt9\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") " pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.619532 
4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e311f271-9488-44ec-98af-070ae0ef1651-config\") pod \"controller-manager-6db768d985-jfnt9\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") " pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.619552 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbnzr\" (UniqueName: \"kubernetes.io/projected/e311f271-9488-44ec-98af-070ae0ef1651-kube-api-access-gbnzr\") pod \"controller-manager-6db768d985-jfnt9\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") " pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.621883 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e311f271-9488-44ec-98af-070ae0ef1651-config\") pod \"controller-manager-6db768d985-jfnt9\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") " pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.624533 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e311f271-9488-44ec-98af-070ae0ef1651-client-ca\") pod \"controller-manager-6db768d985-jfnt9\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") " pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.626184 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e311f271-9488-44ec-98af-070ae0ef1651-proxy-ca-bundles\") pod \"controller-manager-6db768d985-jfnt9\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") " pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" 
Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.632562 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e311f271-9488-44ec-98af-070ae0ef1651-serving-cert\") pod \"controller-manager-6db768d985-jfnt9\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") " pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.643887 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbnzr\" (UniqueName: \"kubernetes.io/projected/e311f271-9488-44ec-98af-070ae0ef1651-kube-api-access-gbnzr\") pod \"controller-manager-6db768d985-jfnt9\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") " pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.701357 4973 csr.go:261] certificate signing request csr-92j2f is approved, waiting to be issued Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.708382 4973 csr.go:257] certificate signing request csr-92j2f is issued Mar 20 13:25:39 crc kubenswrapper[4973]: I0320 13:25:39.749985 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" Mar 20 13:25:40 crc kubenswrapper[4973]: I0320 13:25:40.130783 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6db768d985-jfnt9"] Mar 20 13:25:40 crc kubenswrapper[4973]: W0320 13:25:40.139461 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode311f271_9488_44ec_98af_070ae0ef1651.slice/crio-525b99999bcc8f7facaafdef68b5885cf84d4839af33f80d7f25c0a939da8202 WatchSource:0}: Error finding container 525b99999bcc8f7facaafdef68b5885cf84d4839af33f80d7f25c0a939da8202: Status 404 returned error can't find the container with id 525b99999bcc8f7facaafdef68b5885cf84d4839af33f80d7f25c0a939da8202 Mar 20 13:25:40 crc kubenswrapper[4973]: I0320 13:25:40.376953 4973 generic.go:334] "Generic (PLEG): container finished" podID="66c28cfe-8a0b-459a-bbab-59053fe226b8" containerID="3606634484df059d227d92640e427b3c4266d58ddb2cbfdc8486686661fd98d5" exitCode=0 Mar 20 13:25:40 crc kubenswrapper[4973]: I0320 13:25:40.377375 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566884-j8dqd" event={"ID":"66c28cfe-8a0b-459a-bbab-59053fe226b8","Type":"ContainerDied","Data":"3606634484df059d227d92640e427b3c4266d58ddb2cbfdc8486686661fd98d5"} Mar 20 13:25:40 crc kubenswrapper[4973]: I0320 13:25:40.380284 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" event={"ID":"e311f271-9488-44ec-98af-070ae0ef1651","Type":"ContainerStarted","Data":"a23ae5a45cdbb57630353dfdb5b0383f7b30e8846fcdc33d5d52c131173a7423"} Mar 20 13:25:40 crc kubenswrapper[4973]: I0320 13:25:40.380313 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" 
event={"ID":"e311f271-9488-44ec-98af-070ae0ef1651","Type":"ContainerStarted","Data":"525b99999bcc8f7facaafdef68b5885cf84d4839af33f80d7f25c0a939da8202"} Mar 20 13:25:40 crc kubenswrapper[4973]: I0320 13:25:40.380804 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" Mar 20 13:25:40 crc kubenswrapper[4973]: I0320 13:25:40.384673 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" Mar 20 13:25:40 crc kubenswrapper[4973]: I0320 13:25:40.409220 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" podStartSLOduration=9.409198944 podStartE2EDuration="9.409198944s" podCreationTimestamp="2026-03-20 13:25:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:40.408572616 +0000 UTC m=+261.152242380" watchObservedRunningTime="2026-03-20 13:25:40.409198944 +0000 UTC m=+261.152868708" Mar 20 13:25:40 crc kubenswrapper[4973]: I0320 13:25:40.710063 4973 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-10 02:08:12.744402007 +0000 UTC Mar 20 13:25:40 crc kubenswrapper[4973]: I0320 13:25:40.710097 4973 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6348h42m32.034307716s for next certificate rotation Mar 20 13:25:41 crc kubenswrapper[4973]: I0320 13:25:41.671848 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566884-j8dqd" Mar 20 13:25:41 crc kubenswrapper[4973]: I0320 13:25:41.711179 4973 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-07 03:06:27.697578715 +0000 UTC Mar 20 13:25:41 crc kubenswrapper[4973]: I0320 13:25:41.711213 4973 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6277h40m45.986367907s for next certificate rotation Mar 20 13:25:41 crc kubenswrapper[4973]: I0320 13:25:41.747468 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jct5c\" (UniqueName: \"kubernetes.io/projected/66c28cfe-8a0b-459a-bbab-59053fe226b8-kube-api-access-jct5c\") pod \"66c28cfe-8a0b-459a-bbab-59053fe226b8\" (UID: \"66c28cfe-8a0b-459a-bbab-59053fe226b8\") " Mar 20 13:25:41 crc kubenswrapper[4973]: I0320 13:25:41.753051 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66c28cfe-8a0b-459a-bbab-59053fe226b8-kube-api-access-jct5c" (OuterVolumeSpecName: "kube-api-access-jct5c") pod "66c28cfe-8a0b-459a-bbab-59053fe226b8" (UID: "66c28cfe-8a0b-459a-bbab-59053fe226b8"). InnerVolumeSpecName "kube-api-access-jct5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:25:41 crc kubenswrapper[4973]: I0320 13:25:41.849228 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jct5c\" (UniqueName: \"kubernetes.io/projected/66c28cfe-8a0b-459a-bbab-59053fe226b8-kube-api-access-jct5c\") on node \"crc\" DevicePath \"\"" Mar 20 13:25:42 crc kubenswrapper[4973]: I0320 13:25:42.399700 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566884-j8dqd" event={"ID":"66c28cfe-8a0b-459a-bbab-59053fe226b8","Type":"ContainerDied","Data":"76f4625175a4f1545ae789991f286fbe98ad6c9eeb5735a8c89adb5d7531e8c7"} Mar 20 13:25:42 crc kubenswrapper[4973]: I0320 13:25:42.399966 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76f4625175a4f1545ae789991f286fbe98ad6c9eeb5735a8c89adb5d7531e8c7" Mar 20 13:25:42 crc kubenswrapper[4973]: I0320 13:25:42.399741 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566884-j8dqd" Mar 20 13:25:43 crc kubenswrapper[4973]: I0320 13:25:43.320617 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:25:43 crc kubenswrapper[4973]: I0320 13:25:43.320688 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:25:49 crc kubenswrapper[4973]: I0320 13:25:49.450756 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p25c" 
event={"ID":"097e9042-52e2-4a7e-b567-5b97f34242d6","Type":"ContainerStarted","Data":"3eb631ef9f8977b0cae6df746fb756b7f261e3f3c1b0ae8eb9666a2d6ba08e95"} Mar 20 13:25:49 crc kubenswrapper[4973]: I0320 13:25:49.454187 4973 generic.go:334] "Generic (PLEG): container finished" podID="cf96ec01-20ab-4529-ad42-d839540c3d8e" containerID="64fd272b1967a552dd5c1b2a0a1c219d65392d227a876ae1f044aab3ffba27ac" exitCode=0 Mar 20 13:25:49 crc kubenswrapper[4973]: I0320 13:25:49.454232 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6w64" event={"ID":"cf96ec01-20ab-4529-ad42-d839540c3d8e","Type":"ContainerDied","Data":"64fd272b1967a552dd5c1b2a0a1c219d65392d227a876ae1f044aab3ffba27ac"} Mar 20 13:25:50 crc kubenswrapper[4973]: I0320 13:25:50.462248 4973 generic.go:334] "Generic (PLEG): container finished" podID="097e9042-52e2-4a7e-b567-5b97f34242d6" containerID="3eb631ef9f8977b0cae6df746fb756b7f261e3f3c1b0ae8eb9666a2d6ba08e95" exitCode=0 Mar 20 13:25:50 crc kubenswrapper[4973]: I0320 13:25:50.462289 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p25c" event={"ID":"097e9042-52e2-4a7e-b567-5b97f34242d6","Type":"ContainerDied","Data":"3eb631ef9f8977b0cae6df746fb756b7f261e3f3c1b0ae8eb9666a2d6ba08e95"} Mar 20 13:25:51 crc kubenswrapper[4973]: I0320 13:25:51.891148 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6db768d985-jfnt9"] Mar 20 13:25:51 crc kubenswrapper[4973]: I0320 13:25:51.891655 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" podUID="e311f271-9488-44ec-98af-070ae0ef1651" containerName="controller-manager" containerID="cri-o://a23ae5a45cdbb57630353dfdb5b0383f7b30e8846fcdc33d5d52c131173a7423" gracePeriod=30 Mar 20 13:25:51 crc kubenswrapper[4973]: I0320 13:25:51.920517 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct"] Mar 20 13:25:51 crc kubenswrapper[4973]: I0320 13:25:51.920758 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" podUID="4f14d082-80d9-4d31-8863-b2580ab8b571" containerName="route-controller-manager" containerID="cri-o://f3cc0457bf3aef76fe2b150e87362e3d8fe955752a8c2bdcd8ab3c410d67ca61" gracePeriod=30 Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.404002 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.476749 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ggbs9" event={"ID":"ce12ac02-5b58-44a3-a311-8cdd000ce41b","Type":"ContainerStarted","Data":"f18ac80061681553e87225e10ffc68884a4d14fe18f713ab93dd8675ad54347c"} Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.478536 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw9hm" event={"ID":"af380bf0-7c0d-4790-8ae4-19697763a37a","Type":"ContainerStarted","Data":"ab09514a390f3f2a225800d5e736873b2145b086b315c63dc7a6e6db9e60e3f7"} Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.480262 4973 generic.go:334] "Generic (PLEG): container finished" podID="e311f271-9488-44ec-98af-070ae0ef1651" containerID="a23ae5a45cdbb57630353dfdb5b0383f7b30e8846fcdc33d5d52c131173a7423" exitCode=0 Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.480311 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" event={"ID":"e311f271-9488-44ec-98af-070ae0ef1651","Type":"ContainerDied","Data":"a23ae5a45cdbb57630353dfdb5b0383f7b30e8846fcdc33d5d52c131173a7423"} Mar 20 13:25:52 crc 
kubenswrapper[4973]: I0320 13:25:52.480360 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" event={"ID":"e311f271-9488-44ec-98af-070ae0ef1651","Type":"ContainerDied","Data":"525b99999bcc8f7facaafdef68b5885cf84d4839af33f80d7f25c0a939da8202"} Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.480376 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="525b99999bcc8f7facaafdef68b5885cf84d4839af33f80d7f25c0a939da8202" Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.484129 4973 generic.go:334] "Generic (PLEG): container finished" podID="0993b0a3-f604-4447-bce2-01636b061230" containerID="d5edb5515ca20a4ddc54247ad2599cece16abc6a5966dde3b7d31f262aa665cc" exitCode=0 Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.484216 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c7jw4" event={"ID":"0993b0a3-f604-4447-bce2-01636b061230","Type":"ContainerDied","Data":"d5edb5515ca20a4ddc54247ad2599cece16abc6a5966dde3b7d31f262aa665cc"} Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.485776 4973 generic.go:334] "Generic (PLEG): container finished" podID="4f14d082-80d9-4d31-8863-b2580ab8b571" containerID="f3cc0457bf3aef76fe2b150e87362e3d8fe955752a8c2bdcd8ab3c410d67ca61" exitCode=0 Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.485830 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" event={"ID":"4f14d082-80d9-4d31-8863-b2580ab8b571","Type":"ContainerDied","Data":"f3cc0457bf3aef76fe2b150e87362e3d8fe955752a8c2bdcd8ab3c410d67ca61"} Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.485841 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.485852 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct" event={"ID":"4f14d082-80d9-4d31-8863-b2580ab8b571","Type":"ContainerDied","Data":"0b9ffebe2351713659e791dbd2a0a79b88043bc3b4fa48d89f3f259f5733948d"} Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.485891 4973 scope.go:117] "RemoveContainer" containerID="f3cc0457bf3aef76fe2b150e87362e3d8fe955752a8c2bdcd8ab3c410d67ca61" Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.491529 4973 generic.go:334] "Generic (PLEG): container finished" podID="8f429634-2787-4daa-a443-e4ab84f2e6b7" containerID="3be77d6c72034a6d01a1e20fc2da07144d4311518686935282dbae1c4fada22b" exitCode=0 Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.491656 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plc2f" event={"ID":"8f429634-2787-4daa-a443-e4ab84f2e6b7","Type":"ContainerDied","Data":"3be77d6c72034a6d01a1e20fc2da07144d4311518686935282dbae1c4fada22b"} Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.515054 4973 scope.go:117] "RemoveContainer" containerID="f3cc0457bf3aef76fe2b150e87362e3d8fe955752a8c2bdcd8ab3c410d67ca61" Mar 20 13:25:52 crc kubenswrapper[4973]: E0320 13:25:52.516237 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3cc0457bf3aef76fe2b150e87362e3d8fe955752a8c2bdcd8ab3c410d67ca61\": container with ID starting with f3cc0457bf3aef76fe2b150e87362e3d8fe955752a8c2bdcd8ab3c410d67ca61 not found: ID does not exist" containerID="f3cc0457bf3aef76fe2b150e87362e3d8fe955752a8c2bdcd8ab3c410d67ca61" Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.516287 4973 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f3cc0457bf3aef76fe2b150e87362e3d8fe955752a8c2bdcd8ab3c410d67ca61"} err="failed to get container status \"f3cc0457bf3aef76fe2b150e87362e3d8fe955752a8c2bdcd8ab3c410d67ca61\": rpc error: code = NotFound desc = could not find container \"f3cc0457bf3aef76fe2b150e87362e3d8fe955752a8c2bdcd8ab3c410d67ca61\": container with ID starting with f3cc0457bf3aef76fe2b150e87362e3d8fe955752a8c2bdcd8ab3c410d67ca61 not found: ID does not exist" Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.518307 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9" Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.521052 4973 generic.go:334] "Generic (PLEG): container finished" podID="2115631d-0f02-4cb4-bfee-e18dd87a0462" containerID="dc6526e06ba5337105e9f0338a9e0a154a735b4c08f5cd80eebb85de1efe3952" exitCode=0 Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.521494 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jzd6" event={"ID":"2115631d-0f02-4cb4-bfee-e18dd87a0462","Type":"ContainerDied","Data":"dc6526e06ba5337105e9f0338a9e0a154a735b4c08f5cd80eebb85de1efe3952"} Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.523723 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tsxk4" event={"ID":"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d","Type":"ContainerStarted","Data":"57fcdbdefa8d57bdc4dbd86ba592eb6e1557052c2fa7d15192881a337546bb0f"} Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.524679 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqbdh\" (UniqueName: \"kubernetes.io/projected/4f14d082-80d9-4d31-8863-b2580ab8b571-kube-api-access-lqbdh\") pod \"4f14d082-80d9-4d31-8863-b2580ab8b571\" (UID: \"4f14d082-80d9-4d31-8863-b2580ab8b571\") " Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 
13:25:52.524829 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f14d082-80d9-4d31-8863-b2580ab8b571-config\") pod \"4f14d082-80d9-4d31-8863-b2580ab8b571\" (UID: \"4f14d082-80d9-4d31-8863-b2580ab8b571\") " Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.524869 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f14d082-80d9-4d31-8863-b2580ab8b571-client-ca\") pod \"4f14d082-80d9-4d31-8863-b2580ab8b571\" (UID: \"4f14d082-80d9-4d31-8863-b2580ab8b571\") " Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.524910 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f14d082-80d9-4d31-8863-b2580ab8b571-serving-cert\") pod \"4f14d082-80d9-4d31-8863-b2580ab8b571\" (UID: \"4f14d082-80d9-4d31-8863-b2580ab8b571\") " Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.527886 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f14d082-80d9-4d31-8863-b2580ab8b571-config" (OuterVolumeSpecName: "config") pod "4f14d082-80d9-4d31-8863-b2580ab8b571" (UID: "4f14d082-80d9-4d31-8863-b2580ab8b571"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.528221 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f14d082-80d9-4d31-8863-b2580ab8b571-client-ca" (OuterVolumeSpecName: "client-ca") pod "4f14d082-80d9-4d31-8863-b2580ab8b571" (UID: "4f14d082-80d9-4d31-8863-b2580ab8b571"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.546564 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f14d082-80d9-4d31-8863-b2580ab8b571-kube-api-access-lqbdh" (OuterVolumeSpecName: "kube-api-access-lqbdh") pod "4f14d082-80d9-4d31-8863-b2580ab8b571" (UID: "4f14d082-80d9-4d31-8863-b2580ab8b571"). InnerVolumeSpecName "kube-api-access-lqbdh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.546913 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4f14d082-80d9-4d31-8863-b2580ab8b571-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4f14d082-80d9-4d31-8863-b2580ab8b571" (UID: "4f14d082-80d9-4d31-8863-b2580ab8b571"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.549880 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6w64" event={"ID":"cf96ec01-20ab-4529-ad42-d839540c3d8e","Type":"ContainerStarted","Data":"d14ab0675610c48df55ab8865ff590c4341eb609c474e6b170a211cdf71127d1"}
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.573797 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p25c" event={"ID":"097e9042-52e2-4a7e-b567-5b97f34242d6","Type":"ContainerStarted","Data":"96aa2c4308d9871bde168cb5796161dfe73221250faa8c2c6a47bfb8cd322984"}
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.634287 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e311f271-9488-44ec-98af-070ae0ef1651-serving-cert\") pod \"e311f271-9488-44ec-98af-070ae0ef1651\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") "
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.634379 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbnzr\" (UniqueName: \"kubernetes.io/projected/e311f271-9488-44ec-98af-070ae0ef1651-kube-api-access-gbnzr\") pod \"e311f271-9488-44ec-98af-070ae0ef1651\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") "
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.634473 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e311f271-9488-44ec-98af-070ae0ef1651-proxy-ca-bundles\") pod \"e311f271-9488-44ec-98af-070ae0ef1651\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") "
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.634497 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e311f271-9488-44ec-98af-070ae0ef1651-client-ca\") pod \"e311f271-9488-44ec-98af-070ae0ef1651\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") "
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.634530 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e311f271-9488-44ec-98af-070ae0ef1651-config\") pod \"e311f271-9488-44ec-98af-070ae0ef1651\" (UID: \"e311f271-9488-44ec-98af-070ae0ef1651\") "
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.634777 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqbdh\" (UniqueName: \"kubernetes.io/projected/4f14d082-80d9-4d31-8863-b2580ab8b571-kube-api-access-lqbdh\") on node \"crc\" DevicePath \"\""
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.634796 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f14d082-80d9-4d31-8863-b2580ab8b571-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.634938 4973 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4f14d082-80d9-4d31-8863-b2580ab8b571-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.635304 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e311f271-9488-44ec-98af-070ae0ef1651-client-ca" (OuterVolumeSpecName: "client-ca") pod "e311f271-9488-44ec-98af-070ae0ef1651" (UID: "e311f271-9488-44ec-98af-070ae0ef1651"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.635494 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e311f271-9488-44ec-98af-070ae0ef1651-config" (OuterVolumeSpecName: "config") pod "e311f271-9488-44ec-98af-070ae0ef1651" (UID: "e311f271-9488-44ec-98af-070ae0ef1651"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.635601 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4f14d082-80d9-4d31-8863-b2580ab8b571-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.636319 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e311f271-9488-44ec-98af-070ae0ef1651-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e311f271-9488-44ec-98af-070ae0ef1651" (UID: "e311f271-9488-44ec-98af-070ae0ef1651"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.641516 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e311f271-9488-44ec-98af-070ae0ef1651-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e311f271-9488-44ec-98af-070ae0ef1651" (UID: "e311f271-9488-44ec-98af-070ae0ef1651"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.641557 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e311f271-9488-44ec-98af-070ae0ef1651-kube-api-access-gbnzr" (OuterVolumeSpecName: "kube-api-access-gbnzr") pod "e311f271-9488-44ec-98af-070ae0ef1651" (UID: "e311f271-9488-44ec-98af-070ae0ef1651"). InnerVolumeSpecName "kube-api-access-gbnzr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.670182 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v6w64" podStartSLOduration=2.958387838 podStartE2EDuration="57.670161664s" podCreationTimestamp="2026-03-20 13:24:55 +0000 UTC" firstStartedPulling="2026-03-20 13:24:56.658618607 +0000 UTC m=+217.402288351" lastFinishedPulling="2026-03-20 13:25:51.370392423 +0000 UTC m=+272.114062177" observedRunningTime="2026-03-20 13:25:52.668169462 +0000 UTC m=+273.411839206" watchObservedRunningTime="2026-03-20 13:25:52.670161664 +0000 UTC m=+273.413831408"
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.712948 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6p25c" podStartSLOduration=4.106192722 podStartE2EDuration="56.71293038s" podCreationTimestamp="2026-03-20 13:24:56 +0000 UTC" firstStartedPulling="2026-03-20 13:24:58.863665442 +0000 UTC m=+219.607335186" lastFinishedPulling="2026-03-20 13:25:51.47040308 +0000 UTC m=+272.214072844" observedRunningTime="2026-03-20 13:25:52.712607912 +0000 UTC m=+273.456277656" watchObservedRunningTime="2026-03-20 13:25:52.71293038 +0000 UTC m=+273.456600124"
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.736403 4973 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e311f271-9488-44ec-98af-070ae0ef1651-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.736438 4973 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e311f271-9488-44ec-98af-070ae0ef1651-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.736449 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e311f271-9488-44ec-98af-070ae0ef1651-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.736457 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e311f271-9488-44ec-98af-070ae0ef1651-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.736466 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbnzr\" (UniqueName: \"kubernetes.io/projected/e311f271-9488-44ec-98af-070ae0ef1651-kube-api-access-gbnzr\") on node \"crc\" DevicePath \"\""
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.817752 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct"]
Mar 20 13:25:52 crc kubenswrapper[4973]: I0320 13:25:52.821864 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-76f7f8d98-n79ct"]
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.427218 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"]
Mar 20 13:25:53 crc kubenswrapper[4973]: E0320 13:25:53.428609 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f14d082-80d9-4d31-8863-b2580ab8b571" containerName="route-controller-manager"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.428688 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f14d082-80d9-4d31-8863-b2580ab8b571" containerName="route-controller-manager"
Mar 20 13:25:53 crc kubenswrapper[4973]: E0320 13:25:53.428789 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e311f271-9488-44ec-98af-070ae0ef1651" containerName="controller-manager"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.428847 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e311f271-9488-44ec-98af-070ae0ef1651" containerName="controller-manager"
Mar 20 13:25:53 crc kubenswrapper[4973]: E0320 13:25:53.428916 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66c28cfe-8a0b-459a-bbab-59053fe226b8" containerName="oc"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.428985 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="66c28cfe-8a0b-459a-bbab-59053fe226b8" containerName="oc"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.429132 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="66c28cfe-8a0b-459a-bbab-59053fe226b8" containerName="oc"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.429198 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f14d082-80d9-4d31-8863-b2580ab8b571" containerName="route-controller-manager"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.429252 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="e311f271-9488-44ec-98af-070ae0ef1651" containerName="controller-manager"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.429740 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.430601 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"]
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.431299 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.433782 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.434302 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.434392 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.434303 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.434379 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.434743 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.443290 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"]
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.446006 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"]
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.550011 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cde7a8af-714b-4f14-816d-173bf104fa67-proxy-ca-bundles\") pod \"controller-manager-6ddb884dbd-9wz58\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.550070 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3c687f3-1259-421f-a8cd-b6cde5d00784-client-ca\") pod \"route-controller-manager-5799975c57-rf5wv\" (UID: \"c3c687f3-1259-421f-a8cd-b6cde5d00784\") " pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.550103 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cde7a8af-714b-4f14-816d-173bf104fa67-serving-cert\") pod \"controller-manager-6ddb884dbd-9wz58\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.550132 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3c687f3-1259-421f-a8cd-b6cde5d00784-serving-cert\") pod \"route-controller-manager-5799975c57-rf5wv\" (UID: \"c3c687f3-1259-421f-a8cd-b6cde5d00784\") " pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.550157 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3c687f3-1259-421f-a8cd-b6cde5d00784-config\") pod \"route-controller-manager-5799975c57-rf5wv\" (UID: \"c3c687f3-1259-421f-a8cd-b6cde5d00784\") " pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.550180 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-spvlc\" (UniqueName: \"kubernetes.io/projected/c3c687f3-1259-421f-a8cd-b6cde5d00784-kube-api-access-spvlc\") pod \"route-controller-manager-5799975c57-rf5wv\" (UID: \"c3c687f3-1259-421f-a8cd-b6cde5d00784\") " pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.550198 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cde7a8af-714b-4f14-816d-173bf104fa67-client-ca\") pod \"controller-manager-6ddb884dbd-9wz58\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.550274 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qb68\" (UniqueName: \"kubernetes.io/projected/cde7a8af-714b-4f14-816d-173bf104fa67-kube-api-access-2qb68\") pod \"controller-manager-6ddb884dbd-9wz58\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.550355 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cde7a8af-714b-4f14-816d-173bf104fa67-config\") pod \"controller-manager-6ddb884dbd-9wz58\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.580519 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c7jw4" event={"ID":"0993b0a3-f604-4447-bce2-01636b061230","Type":"ContainerStarted","Data":"e25d1c90ec9a04104b2de18694da6c5b694a1131ea348f7eebf9bc63d2b6795f"}
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.581869 4973 generic.go:334] "Generic (PLEG): container finished" podID="6848e4e5-1d17-4bf7-8dee-bbbeddedd07d" containerID="57fcdbdefa8d57bdc4dbd86ba592eb6e1557052c2fa7d15192881a337546bb0f" exitCode=0
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.581938 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tsxk4" event={"ID":"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d","Type":"ContainerDied","Data":"57fcdbdefa8d57bdc4dbd86ba592eb6e1557052c2fa7d15192881a337546bb0f"}
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.589165 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plc2f" event={"ID":"8f429634-2787-4daa-a443-e4ab84f2e6b7","Type":"ContainerStarted","Data":"18d92bd3a9ca7915dd4ae5d8f43927e1ae8256cc06c76bd242c200dfe3b90b44"}
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.591444 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jzd6" event={"ID":"2115631d-0f02-4cb4-bfee-e18dd87a0462","Type":"ContainerStarted","Data":"0395673bbf4bceba4bfbfacd5ed9da9de3375db709c57b139dce03e8d570c3e0"}
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.593126 4973 generic.go:334] "Generic (PLEG): container finished" podID="ce12ac02-5b58-44a3-a311-8cdd000ce41b" containerID="f18ac80061681553e87225e10ffc68884a4d14fe18f713ab93dd8675ad54347c" exitCode=0
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.593168 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ggbs9" event={"ID":"ce12ac02-5b58-44a3-a311-8cdd000ce41b","Type":"ContainerDied","Data":"f18ac80061681553e87225e10ffc68884a4d14fe18f713ab93dd8675ad54347c"}
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.595815 4973 generic.go:334] "Generic (PLEG): container finished" podID="af380bf0-7c0d-4790-8ae4-19697763a37a" containerID="ab09514a390f3f2a225800d5e736873b2145b086b315c63dc7a6e6db9e60e3f7" exitCode=0
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.595868 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6db768d985-jfnt9"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.596295 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw9hm" event={"ID":"af380bf0-7c0d-4790-8ae4-19697763a37a","Type":"ContainerDied","Data":"ab09514a390f3f2a225800d5e736873b2145b086b315c63dc7a6e6db9e60e3f7"}
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.605624 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c7jw4" podStartSLOduration=2.347552808 podStartE2EDuration="58.605611574s" podCreationTimestamp="2026-03-20 13:24:55 +0000 UTC" firstStartedPulling="2026-03-20 13:24:56.650644067 +0000 UTC m=+217.394313811" lastFinishedPulling="2026-03-20 13:25:52.908702833 +0000 UTC m=+273.652372577" observedRunningTime="2026-03-20 13:25:53.605402019 +0000 UTC m=+274.349071763" watchObservedRunningTime="2026-03-20 13:25:53.605611574 +0000 UTC m=+274.349281318"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.639483 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-plc2f" podStartSLOduration=2.26591361 podStartE2EDuration="1m0.639468259s" podCreationTimestamp="2026-03-20 13:24:53 +0000 UTC" firstStartedPulling="2026-03-20 13:24:54.514446928 +0000 UTC m=+215.258116672" lastFinishedPulling="2026-03-20 13:25:52.888001577 +0000 UTC m=+273.631671321" observedRunningTime="2026-03-20 13:25:53.625458307 +0000 UTC m=+274.369128051" watchObservedRunningTime="2026-03-20 13:25:53.639468259 +0000 UTC m=+274.383138003"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.674886 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-5jzd6" podStartSLOduration=3.120301136 podStartE2EDuration="1m1.674872405s" podCreationTimestamp="2026-03-20 13:24:52 +0000 UTC" firstStartedPulling="2026-03-20 13:24:54.514736127 +0000 UTC m=+215.258405871" lastFinishedPulling="2026-03-20 13:25:53.069307406 +0000 UTC m=+273.812977140" observedRunningTime="2026-03-20 13:25:53.673472368 +0000 UTC m=+274.417142112" watchObservedRunningTime="2026-03-20 13:25:53.674872405 +0000 UTC m=+274.418542149"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.810537 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cde7a8af-714b-4f14-816d-173bf104fa67-config\") pod \"controller-manager-6ddb884dbd-9wz58\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.810905 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cde7a8af-714b-4f14-816d-173bf104fa67-proxy-ca-bundles\") pod \"controller-manager-6ddb884dbd-9wz58\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.810952 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3c687f3-1259-421f-a8cd-b6cde5d00784-client-ca\") pod \"route-controller-manager-5799975c57-rf5wv\" (UID: \"c3c687f3-1259-421f-a8cd-b6cde5d00784\") " pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.810981 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cde7a8af-714b-4f14-816d-173bf104fa67-serving-cert\") pod \"controller-manager-6ddb884dbd-9wz58\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.811004 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3c687f3-1259-421f-a8cd-b6cde5d00784-serving-cert\") pod \"route-controller-manager-5799975c57-rf5wv\" (UID: \"c3c687f3-1259-421f-a8cd-b6cde5d00784\") " pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.811036 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3c687f3-1259-421f-a8cd-b6cde5d00784-config\") pod \"route-controller-manager-5799975c57-rf5wv\" (UID: \"c3c687f3-1259-421f-a8cd-b6cde5d00784\") " pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.811067 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-spvlc\" (UniqueName: \"kubernetes.io/projected/c3c687f3-1259-421f-a8cd-b6cde5d00784-kube-api-access-spvlc\") pod \"route-controller-manager-5799975c57-rf5wv\" (UID: \"c3c687f3-1259-421f-a8cd-b6cde5d00784\") " pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.811086 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cde7a8af-714b-4f14-816d-173bf104fa67-client-ca\") pod \"controller-manager-6ddb884dbd-9wz58\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.811105 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qb68\" (UniqueName: \"kubernetes.io/projected/cde7a8af-714b-4f14-816d-173bf104fa67-kube-api-access-2qb68\") pod \"controller-manager-6ddb884dbd-9wz58\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.811854 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cde7a8af-714b-4f14-816d-173bf104fa67-config\") pod \"controller-manager-6ddb884dbd-9wz58\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.812664 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3c687f3-1259-421f-a8cd-b6cde5d00784-client-ca\") pod \"route-controller-manager-5799975c57-rf5wv\" (UID: \"c3c687f3-1259-421f-a8cd-b6cde5d00784\") " pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.812798 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3c687f3-1259-421f-a8cd-b6cde5d00784-config\") pod \"route-controller-manager-5799975c57-rf5wv\" (UID: \"c3c687f3-1259-421f-a8cd-b6cde5d00784\") " pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.813905 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cde7a8af-714b-4f14-816d-173bf104fa67-proxy-ca-bundles\") pod \"controller-manager-6ddb884dbd-9wz58\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.815420 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cde7a8af-714b-4f14-816d-173bf104fa67-client-ca\") pod \"controller-manager-6ddb884dbd-9wz58\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.818710 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3c687f3-1259-421f-a8cd-b6cde5d00784-serving-cert\") pod \"route-controller-manager-5799975c57-rf5wv\" (UID: \"c3c687f3-1259-421f-a8cd-b6cde5d00784\") " pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.819164 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cde7a8af-714b-4f14-816d-173bf104fa67-serving-cert\") pod \"controller-manager-6ddb884dbd-9wz58\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.829169 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qb68\" (UniqueName: \"kubernetes.io/projected/cde7a8af-714b-4f14-816d-173bf104fa67-kube-api-access-2qb68\") pod \"controller-manager-6ddb884dbd-9wz58\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:25:53 crc kubenswrapper[4973]: I0320 13:25:53.834406 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-spvlc\" (UniqueName: \"kubernetes.io/projected/c3c687f3-1259-421f-a8cd-b6cde5d00784-kube-api-access-spvlc\") pod \"route-controller-manager-5799975c57-rf5wv\" (UID: \"c3c687f3-1259-421f-a8cd-b6cde5d00784\") " pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"
Mar 20 13:25:54 crc kubenswrapper[4973]: I0320 13:25:54.490546 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"
Mar 20 13:25:54 crc kubenswrapper[4973]: I0320 13:25:54.522204 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:25:54 crc kubenswrapper[4973]: I0320 13:25:54.534103 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f14d082-80d9-4d31-8863-b2580ab8b571" path="/var/lib/kubelet/pods/4f14d082-80d9-4d31-8863-b2580ab8b571/volumes"
Mar 20 13:25:54 crc kubenswrapper[4973]: I0320 13:25:54.534850 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6db768d985-jfnt9"]
Mar 20 13:25:54 crc kubenswrapper[4973]: I0320 13:25:54.534884 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6db768d985-jfnt9"]
Mar 20 13:25:54 crc kubenswrapper[4973]: I0320 13:25:54.863841 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"]
Mar 20 13:25:55 crc kubenswrapper[4973]: I0320 13:25:55.033598 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"]
Mar 20 13:25:55 crc kubenswrapper[4973]: W0320 13:25:55.038220 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3c687f3_1259_421f_a8cd_b6cde5d00784.slice/crio-23922da399645c983172da906b2cecc3348365aecd4423cd152c3225fe1630e1 WatchSource:0}: Error finding container 23922da399645c983172da906b2cecc3348365aecd4423cd152c3225fe1630e1: Status 404 returned error can't find the container with id 23922da399645c983172da906b2cecc3348365aecd4423cd152c3225fe1630e1
Mar 20 13:25:55 crc kubenswrapper[4973]: I0320 13:25:55.368694 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c7jw4"
Mar 20 13:25:55 crc kubenswrapper[4973]: I0320 13:25:55.369056 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c7jw4"
Mar 20 13:25:55 crc kubenswrapper[4973]: I0320 13:25:55.627024 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv" event={"ID":"c3c687f3-1259-421f-a8cd-b6cde5d00784","Type":"ContainerStarted","Data":"e8420f0cba3c466cfae89d55b3be5d04eea388997e9955dc93d0240748baa26a"}
Mar 20 13:25:55 crc kubenswrapper[4973]: I0320 13:25:55.627071 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv" event={"ID":"c3c687f3-1259-421f-a8cd-b6cde5d00784","Type":"ContainerStarted","Data":"23922da399645c983172da906b2cecc3348365aecd4423cd152c3225fe1630e1"}
Mar 20 13:25:55 crc kubenswrapper[4973]: I0320 13:25:55.627093 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"
Mar 20 13:25:55 crc kubenswrapper[4973]: I0320 13:25:55.629296 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58" event={"ID":"cde7a8af-714b-4f14-816d-173bf104fa67","Type":"ContainerStarted","Data":"e4e5dfde26a6b0977cbc4898fa2aa34bc869fc8eadc8ee97d436beeade2b6686"}
Mar 20 13:25:55 crc kubenswrapper[4973]: I0320 13:25:55.629348 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58" event={"ID":"cde7a8af-714b-4f14-816d-173bf104fa67","Type":"ContainerStarted","Data":"74d281c1066276d47598997c1e8aa469922a54f10c0cac11f75a39a70dc140ec"}
Mar 20 13:25:55 crc kubenswrapper[4973]: I0320 13:25:55.630042 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:25:55 crc kubenswrapper[4973]: I0320 13:25:55.663449 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv" podStartSLOduration=4.663432676 podStartE2EDuration="4.663432676s" podCreationTimestamp="2026-03-20 13:25:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:55.659085874 +0000 UTC m=+276.402755618" watchObservedRunningTime="2026-03-20 13:25:55.663432676 +0000 UTC m=+276.407102420"
Mar 20 13:25:55 crc kubenswrapper[4973]: I0320 13:25:55.671429 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:25:55 crc kubenswrapper[4973]: I0320 13:25:55.694678 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58" podStartSLOduration=4.694653744 podStartE2EDuration="4.694653744s" podCreationTimestamp="2026-03-20 13:25:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:25:55.690351353 +0000 UTC m=+276.434021097" watchObservedRunningTime="2026-03-20 13:25:55.694653744 +0000 UTC m=+276.438323488"
Mar 20 13:25:55 crc kubenswrapper[4973]: I0320 13:25:55.763854 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v6w64"
Mar 20 13:25:55 crc kubenswrapper[4973]: I0320 13:25:55.764069 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v6w64"
Mar 20 13:25:55 crc kubenswrapper[4973]: I0320 13:25:55.836130 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"
Mar 20 13:25:55 crc kubenswrapper[4973]: I0320 13:25:55.849041 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v6w64"
Mar 20 13:25:55 crc kubenswrapper[4973]: I0320 13:25:55.965838 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e311f271-9488-44ec-98af-070ae0ef1651" path="/var/lib/kubelet/pods/e311f271-9488-44ec-98af-070ae0ef1651/volumes"
Mar 20 13:25:56 crc kubenswrapper[4973]: I0320 13:25:56.420680 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6p25c"
Mar 20 13:25:56 crc kubenswrapper[4973]: I0320 13:25:56.421058 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6p25c"
Mar 20 13:25:56 crc kubenswrapper[4973]: I0320 13:25:56.635452 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw9hm" event={"ID":"af380bf0-7c0d-4790-8ae4-19697763a37a","Type":"ContainerStarted","Data":"04cced65bee69b50ffa6b1f7a1c8897a39ee1d8f7772e201a35a0e6738271ca2"}
Mar 20 13:25:56 crc kubenswrapper[4973]: I0320 13:25:56.637597 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tsxk4" event={"ID":"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d","Type":"ContainerStarted","Data":"ace8c1615938595e76e498ed343fd570e52f308a349724467d1abe0e677a6ebb"}
Mar 20 13:25:56 crc kubenswrapper[4973]: I0320 13:25:56.639878 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ggbs9" event={"ID":"ce12ac02-5b58-44a3-a311-8cdd000ce41b","Type":"ContainerStarted","Data":"56301b50d1f1b544ae5620604a1077e1777ff8c1eface90be01b3eaa29194098"}
Mar 20 13:25:56 crc kubenswrapper[4973]: I0320 13:25:56.657775 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qw9hm" podStartSLOduration=3.893673298 podStartE2EDuration="1m0.657755589s" podCreationTimestamp="2026-03-20 13:24:56 +0000 UTC" firstStartedPulling="2026-03-20 13:24:58.88751248 +0000 UTC m=+219.631182224" lastFinishedPulling="2026-03-20 13:25:55.651594771 +0000 UTC m=+276.395264515" observedRunningTime="2026-03-20 13:25:56.656075025 +0000 UTC m=+277.399744769" watchObservedRunningTime="2026-03-20 13:25:56.657755589 +0000 UTC m=+277.401425333"
Mar 20 13:25:56 crc kubenswrapper[4973]: I0320 13:25:56.683091 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tsxk4" podStartSLOduration=3.425287164 podStartE2EDuration="1m3.683056193s" podCreationTimestamp="2026-03-20 13:24:53 +0000 UTC" firstStartedPulling="2026-03-20 13:24:55.532494298 +0000 UTC m=+216.276164042" lastFinishedPulling="2026-03-20 13:25:55.790263327 +0000 UTC m=+276.533933071" observedRunningTime="2026-03-20 13:25:56.678705961 +0000 UTC m=+277.422375705" watchObservedRunningTime="2026-03-20 13:25:56.683056193 +0000 UTC m=+277.426725937"
Mar 20 13:25:56 crc kubenswrapper[4973]: I0320 13:25:56.695274 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v6w64"
Mar 20 13:25:56 crc kubenswrapper[4973]: I0320 13:25:56.698761 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ggbs9" podStartSLOduration=3.509023194 podStartE2EDuration="1m3.698747879s" podCreationTimestamp="2026-03-20 13:24:53 +0000 UTC" firstStartedPulling="2026-03-20 13:24:55.517410232 +0000 UTC m=+216.261079976" lastFinishedPulling="2026-03-20 13:25:55.707134917 +0000 UTC m=+276.450804661" observedRunningTime="2026-03-20 13:25:56.697481426 +0000 UTC m=+277.441151170" watchObservedRunningTime="2026-03-20 13:25:56.698747879 +0000 UTC m=+277.442417623"
Mar 20 13:25:56 crc kubenswrapper[4973]: I0320 13:25:56.742474 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-c7jw4" podUID="0993b0a3-f604-4447-bce2-01636b061230"
containerName="registry-server" probeResult="failure" output=< Mar 20 13:25:56 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 13:25:56 crc kubenswrapper[4973]: > Mar 20 13:25:56 crc kubenswrapper[4973]: I0320 13:25:56.849395 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qw9hm" Mar 20 13:25:56 crc kubenswrapper[4973]: I0320 13:25:56.849448 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qw9hm" Mar 20 13:25:57 crc kubenswrapper[4973]: I0320 13:25:57.452784 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-6p25c" podUID="097e9042-52e2-4a7e-b567-5b97f34242d6" containerName="registry-server" probeResult="failure" output=< Mar 20 13:25:57 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 13:25:57 crc kubenswrapper[4973]: > Mar 20 13:25:57 crc kubenswrapper[4973]: I0320 13:25:57.884603 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qw9hm" podUID="af380bf0-7c0d-4790-8ae4-19697763a37a" containerName="registry-server" probeResult="failure" output=< Mar 20 13:25:57 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 13:25:57 crc kubenswrapper[4973]: > Mar 20 13:26:00 crc kubenswrapper[4973]: I0320 13:26:00.151509 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566886-pm77h"] Mar 20 13:26:00 crc kubenswrapper[4973]: I0320 13:26:00.152506 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566886-pm77h" Mar 20 13:26:00 crc kubenswrapper[4973]: I0320 13:26:00.155693 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:26:00 crc kubenswrapper[4973]: I0320 13:26:00.155728 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 13:26:00 crc kubenswrapper[4973]: I0320 13:26:00.155863 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:26:00 crc kubenswrapper[4973]: I0320 13:26:00.164842 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566886-pm77h"] Mar 20 13:26:00 crc kubenswrapper[4973]: I0320 13:26:00.215391 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g552w\" (UniqueName: \"kubernetes.io/projected/f10e2a0a-df92-4a54-98e0-382851137211-kube-api-access-g552w\") pod \"auto-csr-approver-29566886-pm77h\" (UID: \"f10e2a0a-df92-4a54-98e0-382851137211\") " pod="openshift-infra/auto-csr-approver-29566886-pm77h" Mar 20 13:26:00 crc kubenswrapper[4973]: I0320 13:26:00.316809 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g552w\" (UniqueName: \"kubernetes.io/projected/f10e2a0a-df92-4a54-98e0-382851137211-kube-api-access-g552w\") pod \"auto-csr-approver-29566886-pm77h\" (UID: \"f10e2a0a-df92-4a54-98e0-382851137211\") " pod="openshift-infra/auto-csr-approver-29566886-pm77h" Mar 20 13:26:00 crc kubenswrapper[4973]: I0320 13:26:00.337138 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g552w\" (UniqueName: \"kubernetes.io/projected/f10e2a0a-df92-4a54-98e0-382851137211-kube-api-access-g552w\") pod \"auto-csr-approver-29566886-pm77h\" (UID: \"f10e2a0a-df92-4a54-98e0-382851137211\") " 
pod="openshift-infra/auto-csr-approver-29566886-pm77h" Mar 20 13:26:00 crc kubenswrapper[4973]: I0320 13:26:00.384788 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6w64"] Mar 20 13:26:00 crc kubenswrapper[4973]: I0320 13:26:00.385083 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v6w64" podUID="cf96ec01-20ab-4529-ad42-d839540c3d8e" containerName="registry-server" containerID="cri-o://d14ab0675610c48df55ab8865ff590c4341eb609c474e6b170a211cdf71127d1" gracePeriod=2 Mar 20 13:26:00 crc kubenswrapper[4973]: I0320 13:26:00.466899 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566886-pm77h" Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.056768 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566886-pm77h"] Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.644886 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6w64" Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.677044 4973 generic.go:334] "Generic (PLEG): container finished" podID="cf96ec01-20ab-4529-ad42-d839540c3d8e" containerID="d14ab0675610c48df55ab8865ff590c4341eb609c474e6b170a211cdf71127d1" exitCode=0 Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.677108 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6w64" event={"ID":"cf96ec01-20ab-4529-ad42-d839540c3d8e","Type":"ContainerDied","Data":"d14ab0675610c48df55ab8865ff590c4341eb609c474e6b170a211cdf71127d1"} Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.677133 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6w64" event={"ID":"cf96ec01-20ab-4529-ad42-d839540c3d8e","Type":"ContainerDied","Data":"cc7012e124f77a10942cbdaaa120915159b84bc9fb54d8844d8874e4f19347b9"} Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.677154 4973 scope.go:117] "RemoveContainer" containerID="d14ab0675610c48df55ab8865ff590c4341eb609c474e6b170a211cdf71127d1" Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.677180 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6w64" Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.679018 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566886-pm77h" event={"ID":"f10e2a0a-df92-4a54-98e0-382851137211","Type":"ContainerStarted","Data":"be523138d545ba7ab917aa07af7d2f6f13c6937676b958f77e64c492e8c00497"} Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.690575 4973 scope.go:117] "RemoveContainer" containerID="64fd272b1967a552dd5c1b2a0a1c219d65392d227a876ae1f044aab3ffba27ac" Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.713269 4973 scope.go:117] "RemoveContainer" containerID="b9838194169be997935f92587ef678e9ede3335115a9337b6d2b5b4dfa205c98" Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.732603 4973 scope.go:117] "RemoveContainer" containerID="d14ab0675610c48df55ab8865ff590c4341eb609c474e6b170a211cdf71127d1" Mar 20 13:26:01 crc kubenswrapper[4973]: E0320 13:26:01.733070 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d14ab0675610c48df55ab8865ff590c4341eb609c474e6b170a211cdf71127d1\": container with ID starting with d14ab0675610c48df55ab8865ff590c4341eb609c474e6b170a211cdf71127d1 not found: ID does not exist" containerID="d14ab0675610c48df55ab8865ff590c4341eb609c474e6b170a211cdf71127d1" Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.733103 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d14ab0675610c48df55ab8865ff590c4341eb609c474e6b170a211cdf71127d1"} err="failed to get container status \"d14ab0675610c48df55ab8865ff590c4341eb609c474e6b170a211cdf71127d1\": rpc error: code = NotFound desc = could not find container \"d14ab0675610c48df55ab8865ff590c4341eb609c474e6b170a211cdf71127d1\": container with ID starting with d14ab0675610c48df55ab8865ff590c4341eb609c474e6b170a211cdf71127d1 not found: ID does not exist" 
Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.733129 4973 scope.go:117] "RemoveContainer" containerID="64fd272b1967a552dd5c1b2a0a1c219d65392d227a876ae1f044aab3ffba27ac" Mar 20 13:26:01 crc kubenswrapper[4973]: E0320 13:26:01.734283 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64fd272b1967a552dd5c1b2a0a1c219d65392d227a876ae1f044aab3ffba27ac\": container with ID starting with 64fd272b1967a552dd5c1b2a0a1c219d65392d227a876ae1f044aab3ffba27ac not found: ID does not exist" containerID="64fd272b1967a552dd5c1b2a0a1c219d65392d227a876ae1f044aab3ffba27ac" Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.734306 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64fd272b1967a552dd5c1b2a0a1c219d65392d227a876ae1f044aab3ffba27ac"} err="failed to get container status \"64fd272b1967a552dd5c1b2a0a1c219d65392d227a876ae1f044aab3ffba27ac\": rpc error: code = NotFound desc = could not find container \"64fd272b1967a552dd5c1b2a0a1c219d65392d227a876ae1f044aab3ffba27ac\": container with ID starting with 64fd272b1967a552dd5c1b2a0a1c219d65392d227a876ae1f044aab3ffba27ac not found: ID does not exist" Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.734349 4973 scope.go:117] "RemoveContainer" containerID="b9838194169be997935f92587ef678e9ede3335115a9337b6d2b5b4dfa205c98" Mar 20 13:26:01 crc kubenswrapper[4973]: E0320 13:26:01.734570 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9838194169be997935f92587ef678e9ede3335115a9337b6d2b5b4dfa205c98\": container with ID starting with b9838194169be997935f92587ef678e9ede3335115a9337b6d2b5b4dfa205c98 not found: ID does not exist" containerID="b9838194169be997935f92587ef678e9ede3335115a9337b6d2b5b4dfa205c98" Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.734594 4973 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"b9838194169be997935f92587ef678e9ede3335115a9337b6d2b5b4dfa205c98"} err="failed to get container status \"b9838194169be997935f92587ef678e9ede3335115a9337b6d2b5b4dfa205c98\": rpc error: code = NotFound desc = could not find container \"b9838194169be997935f92587ef678e9ede3335115a9337b6d2b5b4dfa205c98\": container with ID starting with b9838194169be997935f92587ef678e9ede3335115a9337b6d2b5b4dfa205c98 not found: ID does not exist" Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.835231 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf96ec01-20ab-4529-ad42-d839540c3d8e-utilities\") pod \"cf96ec01-20ab-4529-ad42-d839540c3d8e\" (UID: \"cf96ec01-20ab-4529-ad42-d839540c3d8e\") " Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.835308 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf96ec01-20ab-4529-ad42-d839540c3d8e-catalog-content\") pod \"cf96ec01-20ab-4529-ad42-d839540c3d8e\" (UID: \"cf96ec01-20ab-4529-ad42-d839540c3d8e\") " Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.835353 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw6t4\" (UniqueName: \"kubernetes.io/projected/cf96ec01-20ab-4529-ad42-d839540c3d8e-kube-api-access-kw6t4\") pod \"cf96ec01-20ab-4529-ad42-d839540c3d8e\" (UID: \"cf96ec01-20ab-4529-ad42-d839540c3d8e\") " Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.837049 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf96ec01-20ab-4529-ad42-d839540c3d8e-utilities" (OuterVolumeSpecName: "utilities") pod "cf96ec01-20ab-4529-ad42-d839540c3d8e" (UID: "cf96ec01-20ab-4529-ad42-d839540c3d8e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.840954 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf96ec01-20ab-4529-ad42-d839540c3d8e-kube-api-access-kw6t4" (OuterVolumeSpecName: "kube-api-access-kw6t4") pod "cf96ec01-20ab-4529-ad42-d839540c3d8e" (UID: "cf96ec01-20ab-4529-ad42-d839540c3d8e"). InnerVolumeSpecName "kube-api-access-kw6t4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.862683 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf96ec01-20ab-4529-ad42-d839540c3d8e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cf96ec01-20ab-4529-ad42-d839540c3d8e" (UID: "cf96ec01-20ab-4529-ad42-d839540c3d8e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.936915 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cf96ec01-20ab-4529-ad42-d839540c3d8e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.936961 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cf96ec01-20ab-4529-ad42-d839540c3d8e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.936982 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw6t4\" (UniqueName: \"kubernetes.io/projected/cf96ec01-20ab-4529-ad42-d839540c3d8e-kube-api-access-kw6t4\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:01 crc kubenswrapper[4973]: I0320 13:26:01.997024 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6w64"] Mar 20 13:26:02 crc kubenswrapper[4973]: I0320 
13:26:02.000585 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6w64"] Mar 20 13:26:03 crc kubenswrapper[4973]: I0320 13:26:03.250546 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-5jzd6" Mar 20 13:26:03 crc kubenswrapper[4973]: I0320 13:26:03.250905 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-5jzd6" Mar 20 13:26:03 crc kubenswrapper[4973]: I0320 13:26:03.301546 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-5jzd6" Mar 20 13:26:03 crc kubenswrapper[4973]: I0320 13:26:03.404127 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-plc2f" Mar 20 13:26:03 crc kubenswrapper[4973]: I0320 13:26:03.404175 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-plc2f" Mar 20 13:26:03 crc kubenswrapper[4973]: I0320 13:26:03.449320 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-plc2f" Mar 20 13:26:03 crc kubenswrapper[4973]: I0320 13:26:03.681668 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ggbs9" Mar 20 13:26:03 crc kubenswrapper[4973]: I0320 13:26:03.681725 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ggbs9" Mar 20 13:26:03 crc kubenswrapper[4973]: I0320 13:26:03.691937 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566886-pm77h" event={"ID":"f10e2a0a-df92-4a54-98e0-382851137211","Type":"ContainerStarted","Data":"96bf1ef5249aa5946a08794ce5306ea3474c4aeeb813e65c813085230c6afea9"} Mar 20 13:26:03 crc 
kubenswrapper[4973]: I0320 13:26:03.708211 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566886-pm77h" podStartSLOduration=1.7162428950000002 podStartE2EDuration="3.708191295s" podCreationTimestamp="2026-03-20 13:26:00 +0000 UTC" firstStartedPulling="2026-03-20 13:26:01.061865294 +0000 UTC m=+281.805535038" lastFinishedPulling="2026-03-20 13:26:03.053813694 +0000 UTC m=+283.797483438" observedRunningTime="2026-03-20 13:26:03.704029448 +0000 UTC m=+284.447699212" watchObservedRunningTime="2026-03-20 13:26:03.708191295 +0000 UTC m=+284.451861049" Mar 20 13:26:03 crc kubenswrapper[4973]: I0320 13:26:03.723035 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ggbs9" Mar 20 13:26:03 crc kubenswrapper[4973]: I0320 13:26:03.727558 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-plc2f" Mar 20 13:26:03 crc kubenswrapper[4973]: I0320 13:26:03.740140 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-5jzd6" Mar 20 13:26:03 crc kubenswrapper[4973]: I0320 13:26:03.777130 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ggbs9" Mar 20 13:26:03 crc kubenswrapper[4973]: I0320 13:26:03.869651 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tsxk4" Mar 20 13:26:03 crc kubenswrapper[4973]: I0320 13:26:03.869701 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tsxk4" Mar 20 13:26:03 crc kubenswrapper[4973]: I0320 13:26:03.903178 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tsxk4" Mar 20 13:26:03 crc kubenswrapper[4973]: I0320 
13:26:03.958773 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf96ec01-20ab-4529-ad42-d839540c3d8e" path="/var/lib/kubelet/pods/cf96ec01-20ab-4529-ad42-d839540c3d8e/volumes" Mar 20 13:26:04 crc kubenswrapper[4973]: I0320 13:26:04.700425 4973 generic.go:334] "Generic (PLEG): container finished" podID="f10e2a0a-df92-4a54-98e0-382851137211" containerID="96bf1ef5249aa5946a08794ce5306ea3474c4aeeb813e65c813085230c6afea9" exitCode=0 Mar 20 13:26:04 crc kubenswrapper[4973]: I0320 13:26:04.700484 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566886-pm77h" event={"ID":"f10e2a0a-df92-4a54-98e0-382851137211","Type":"ContainerDied","Data":"96bf1ef5249aa5946a08794ce5306ea3474c4aeeb813e65c813085230c6afea9"} Mar 20 13:26:04 crc kubenswrapper[4973]: I0320 13:26:04.744292 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tsxk4" Mar 20 13:26:05 crc kubenswrapper[4973]: I0320 13:26:05.410771 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c7jw4" Mar 20 13:26:05 crc kubenswrapper[4973]: I0320 13:26:05.457630 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c7jw4" Mar 20 13:26:05 crc kubenswrapper[4973]: I0320 13:26:05.575014 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ggbs9"] Mar 20 13:26:05 crc kubenswrapper[4973]: I0320 13:26:05.705848 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ggbs9" podUID="ce12ac02-5b58-44a3-a311-8cdd000ce41b" containerName="registry-server" containerID="cri-o://56301b50d1f1b544ae5620604a1077e1777ff8c1eface90be01b3eaa29194098" gracePeriod=2 Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.081294 4973 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566886-pm77h" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.104051 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g552w\" (UniqueName: \"kubernetes.io/projected/f10e2a0a-df92-4a54-98e0-382851137211-kube-api-access-g552w\") pod \"f10e2a0a-df92-4a54-98e0-382851137211\" (UID: \"f10e2a0a-df92-4a54-98e0-382851137211\") " Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.109559 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f10e2a0a-df92-4a54-98e0-382851137211-kube-api-access-g552w" (OuterVolumeSpecName: "kube-api-access-g552w") pod "f10e2a0a-df92-4a54-98e0-382851137211" (UID: "f10e2a0a-df92-4a54-98e0-382851137211"). InnerVolumeSpecName "kube-api-access-g552w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.205720 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g552w\" (UniqueName: \"kubernetes.io/projected/f10e2a0a-df92-4a54-98e0-382851137211-kube-api-access-g552w\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.469213 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6p25c" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.507923 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6p25c" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.708151 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ggbs9" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.716285 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566886-pm77h" event={"ID":"f10e2a0a-df92-4a54-98e0-382851137211","Type":"ContainerDied","Data":"be523138d545ba7ab917aa07af7d2f6f13c6937676b958f77e64c492e8c00497"} Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.716343 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be523138d545ba7ab917aa07af7d2f6f13c6937676b958f77e64c492e8c00497" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.716576 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566886-pm77h" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.720451 4973 generic.go:334] "Generic (PLEG): container finished" podID="ce12ac02-5b58-44a3-a311-8cdd000ce41b" containerID="56301b50d1f1b544ae5620604a1077e1777ff8c1eface90be01b3eaa29194098" exitCode=0 Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.720569 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ggbs9" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.720646 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ggbs9" event={"ID":"ce12ac02-5b58-44a3-a311-8cdd000ce41b","Type":"ContainerDied","Data":"56301b50d1f1b544ae5620604a1077e1777ff8c1eface90be01b3eaa29194098"} Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.720688 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ggbs9" event={"ID":"ce12ac02-5b58-44a3-a311-8cdd000ce41b","Type":"ContainerDied","Data":"83e2a1c677ebcc029e14180a3cd929bc6ce758f2315c324712c208527a996146"} Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.720746 4973 scope.go:117] "RemoveContainer" containerID="56301b50d1f1b544ae5620604a1077e1777ff8c1eface90be01b3eaa29194098" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.749441 4973 scope.go:117] "RemoveContainer" containerID="f18ac80061681553e87225e10ffc68884a4d14fe18f713ab93dd8675ad54347c" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.764235 4973 scope.go:117] "RemoveContainer" containerID="74110cbff82c888eabcbc295f453d16c580c1ff1fa2fcc580bff9ff44af29e18" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.777343 4973 scope.go:117] "RemoveContainer" containerID="56301b50d1f1b544ae5620604a1077e1777ff8c1eface90be01b3eaa29194098" Mar 20 13:26:06 crc kubenswrapper[4973]: E0320 13:26:06.777749 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56301b50d1f1b544ae5620604a1077e1777ff8c1eface90be01b3eaa29194098\": container with ID starting with 56301b50d1f1b544ae5620604a1077e1777ff8c1eface90be01b3eaa29194098 not found: ID does not exist" containerID="56301b50d1f1b544ae5620604a1077e1777ff8c1eface90be01b3eaa29194098" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.777778 4973 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56301b50d1f1b544ae5620604a1077e1777ff8c1eface90be01b3eaa29194098"} err="failed to get container status \"56301b50d1f1b544ae5620604a1077e1777ff8c1eface90be01b3eaa29194098\": rpc error: code = NotFound desc = could not find container \"56301b50d1f1b544ae5620604a1077e1777ff8c1eface90be01b3eaa29194098\": container with ID starting with 56301b50d1f1b544ae5620604a1077e1777ff8c1eface90be01b3eaa29194098 not found: ID does not exist" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.777818 4973 scope.go:117] "RemoveContainer" containerID="f18ac80061681553e87225e10ffc68884a4d14fe18f713ab93dd8675ad54347c" Mar 20 13:26:06 crc kubenswrapper[4973]: E0320 13:26:06.778068 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f18ac80061681553e87225e10ffc68884a4d14fe18f713ab93dd8675ad54347c\": container with ID starting with f18ac80061681553e87225e10ffc68884a4d14fe18f713ab93dd8675ad54347c not found: ID does not exist" containerID="f18ac80061681553e87225e10ffc68884a4d14fe18f713ab93dd8675ad54347c" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.778092 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f18ac80061681553e87225e10ffc68884a4d14fe18f713ab93dd8675ad54347c"} err="failed to get container status \"f18ac80061681553e87225e10ffc68884a4d14fe18f713ab93dd8675ad54347c\": rpc error: code = NotFound desc = could not find container \"f18ac80061681553e87225e10ffc68884a4d14fe18f713ab93dd8675ad54347c\": container with ID starting with f18ac80061681553e87225e10ffc68884a4d14fe18f713ab93dd8675ad54347c not found: ID does not exist" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.778110 4973 scope.go:117] "RemoveContainer" containerID="74110cbff82c888eabcbc295f453d16c580c1ff1fa2fcc580bff9ff44af29e18" Mar 20 13:26:06 crc kubenswrapper[4973]: E0320 13:26:06.778380 4973 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74110cbff82c888eabcbc295f453d16c580c1ff1fa2fcc580bff9ff44af29e18\": container with ID starting with 74110cbff82c888eabcbc295f453d16c580c1ff1fa2fcc580bff9ff44af29e18 not found: ID does not exist" containerID="74110cbff82c888eabcbc295f453d16c580c1ff1fa2fcc580bff9ff44af29e18" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.778412 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74110cbff82c888eabcbc295f453d16c580c1ff1fa2fcc580bff9ff44af29e18"} err="failed to get container status \"74110cbff82c888eabcbc295f453d16c580c1ff1fa2fcc580bff9ff44af29e18\": rpc error: code = NotFound desc = could not find container \"74110cbff82c888eabcbc295f453d16c580c1ff1fa2fcc580bff9ff44af29e18\": container with ID starting with 74110cbff82c888eabcbc295f453d16c580c1ff1fa2fcc580bff9ff44af29e18 not found: ID does not exist" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.815659 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce12ac02-5b58-44a3-a311-8cdd000ce41b-catalog-content\") pod \"ce12ac02-5b58-44a3-a311-8cdd000ce41b\" (UID: \"ce12ac02-5b58-44a3-a311-8cdd000ce41b\") " Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.815733 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce12ac02-5b58-44a3-a311-8cdd000ce41b-utilities\") pod \"ce12ac02-5b58-44a3-a311-8cdd000ce41b\" (UID: \"ce12ac02-5b58-44a3-a311-8cdd000ce41b\") " Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.816624 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce12ac02-5b58-44a3-a311-8cdd000ce41b-utilities" (OuterVolumeSpecName: "utilities") pod "ce12ac02-5b58-44a3-a311-8cdd000ce41b" (UID: 
"ce12ac02-5b58-44a3-a311-8cdd000ce41b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.816685 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q72rj\" (UniqueName: \"kubernetes.io/projected/ce12ac02-5b58-44a3-a311-8cdd000ce41b-kube-api-access-q72rj\") pod \"ce12ac02-5b58-44a3-a311-8cdd000ce41b\" (UID: \"ce12ac02-5b58-44a3-a311-8cdd000ce41b\") " Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.817002 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ce12ac02-5b58-44a3-a311-8cdd000ce41b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.820187 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce12ac02-5b58-44a3-a311-8cdd000ce41b-kube-api-access-q72rj" (OuterVolumeSpecName: "kube-api-access-q72rj") pod "ce12ac02-5b58-44a3-a311-8cdd000ce41b" (UID: "ce12ac02-5b58-44a3-a311-8cdd000ce41b"). InnerVolumeSpecName "kube-api-access-q72rj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.861618 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ce12ac02-5b58-44a3-a311-8cdd000ce41b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ce12ac02-5b58-44a3-a311-8cdd000ce41b" (UID: "ce12ac02-5b58-44a3-a311-8cdd000ce41b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.886704 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qw9hm" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.918296 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ce12ac02-5b58-44a3-a311-8cdd000ce41b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.918325 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q72rj\" (UniqueName: \"kubernetes.io/projected/ce12ac02-5b58-44a3-a311-8cdd000ce41b-kube-api-access-q72rj\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:06 crc kubenswrapper[4973]: I0320 13:26:06.923175 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qw9hm" Mar 20 13:26:07 crc kubenswrapper[4973]: I0320 13:26:07.046777 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ggbs9"] Mar 20 13:26:07 crc kubenswrapper[4973]: I0320 13:26:07.050577 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ggbs9"] Mar 20 13:26:07 crc kubenswrapper[4973]: I0320 13:26:07.391694 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tsxk4"] Mar 20 13:26:07 crc kubenswrapper[4973]: I0320 13:26:07.392113 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tsxk4" podUID="6848e4e5-1d17-4bf7-8dee-bbbeddedd07d" containerName="registry-server" containerID="cri-o://ace8c1615938595e76e498ed343fd570e52f308a349724467d1abe0e677a6ebb" gracePeriod=2 Mar 20 13:26:07 crc kubenswrapper[4973]: I0320 13:26:07.729819 4973 generic.go:334] "Generic (PLEG): container finished" 
podID="6848e4e5-1d17-4bf7-8dee-bbbeddedd07d" containerID="ace8c1615938595e76e498ed343fd570e52f308a349724467d1abe0e677a6ebb" exitCode=0 Mar 20 13:26:07 crc kubenswrapper[4973]: I0320 13:26:07.729911 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tsxk4" event={"ID":"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d","Type":"ContainerDied","Data":"ace8c1615938595e76e498ed343fd570e52f308a349724467d1abe0e677a6ebb"} Mar 20 13:26:07 crc kubenswrapper[4973]: I0320 13:26:07.862067 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tsxk4" Mar 20 13:26:07 crc kubenswrapper[4973]: I0320 13:26:07.928952 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6848e4e5-1d17-4bf7-8dee-bbbeddedd07d-catalog-content\") pod \"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d\" (UID: \"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d\") " Mar 20 13:26:07 crc kubenswrapper[4973]: I0320 13:26:07.929030 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6848e4e5-1d17-4bf7-8dee-bbbeddedd07d-utilities\") pod \"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d\" (UID: \"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d\") " Mar 20 13:26:07 crc kubenswrapper[4973]: I0320 13:26:07.929089 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5cqsf\" (UniqueName: \"kubernetes.io/projected/6848e4e5-1d17-4bf7-8dee-bbbeddedd07d-kube-api-access-5cqsf\") pod \"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d\" (UID: \"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d\") " Mar 20 13:26:07 crc kubenswrapper[4973]: I0320 13:26:07.929980 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6848e4e5-1d17-4bf7-8dee-bbbeddedd07d-utilities" (OuterVolumeSpecName: "utilities") pod 
"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d" (UID: "6848e4e5-1d17-4bf7-8dee-bbbeddedd07d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:26:07 crc kubenswrapper[4973]: I0320 13:26:07.933121 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6848e4e5-1d17-4bf7-8dee-bbbeddedd07d-kube-api-access-5cqsf" (OuterVolumeSpecName: "kube-api-access-5cqsf") pod "6848e4e5-1d17-4bf7-8dee-bbbeddedd07d" (UID: "6848e4e5-1d17-4bf7-8dee-bbbeddedd07d"). InnerVolumeSpecName "kube-api-access-5cqsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:26:07 crc kubenswrapper[4973]: I0320 13:26:07.959790 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce12ac02-5b58-44a3-a311-8cdd000ce41b" path="/var/lib/kubelet/pods/ce12ac02-5b58-44a3-a311-8cdd000ce41b/volumes" Mar 20 13:26:07 crc kubenswrapper[4973]: I0320 13:26:07.981144 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6848e4e5-1d17-4bf7-8dee-bbbeddedd07d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6848e4e5-1d17-4bf7-8dee-bbbeddedd07d" (UID: "6848e4e5-1d17-4bf7-8dee-bbbeddedd07d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:26:08 crc kubenswrapper[4973]: I0320 13:26:08.030875 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5cqsf\" (UniqueName: \"kubernetes.io/projected/6848e4e5-1d17-4bf7-8dee-bbbeddedd07d-kube-api-access-5cqsf\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:08 crc kubenswrapper[4973]: I0320 13:26:08.030911 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6848e4e5-1d17-4bf7-8dee-bbbeddedd07d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:08 crc kubenswrapper[4973]: I0320 13:26:08.030923 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6848e4e5-1d17-4bf7-8dee-bbbeddedd07d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:08 crc kubenswrapper[4973]: I0320 13:26:08.739908 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tsxk4" event={"ID":"6848e4e5-1d17-4bf7-8dee-bbbeddedd07d","Type":"ContainerDied","Data":"61393645fbc553ded40796977f6a6b61c7061c60374e54e0a4a7cf8712191dad"} Mar 20 13:26:08 crc kubenswrapper[4973]: I0320 13:26:08.739970 4973 scope.go:117] "RemoveContainer" containerID="ace8c1615938595e76e498ed343fd570e52f308a349724467d1abe0e677a6ebb" Mar 20 13:26:08 crc kubenswrapper[4973]: I0320 13:26:08.740038 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tsxk4" Mar 20 13:26:08 crc kubenswrapper[4973]: I0320 13:26:08.758008 4973 scope.go:117] "RemoveContainer" containerID="57fcdbdefa8d57bdc4dbd86ba592eb6e1557052c2fa7d15192881a337546bb0f" Mar 20 13:26:08 crc kubenswrapper[4973]: I0320 13:26:08.774451 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tsxk4"] Mar 20 13:26:08 crc kubenswrapper[4973]: I0320 13:26:08.774762 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tsxk4"] Mar 20 13:26:08 crc kubenswrapper[4973]: I0320 13:26:08.793982 4973 scope.go:117] "RemoveContainer" containerID="76d8537cdfc81c36d6474778ed43d692326ce6099b1f4fbc1a0adc5c195746d3" Mar 20 13:26:09 crc kubenswrapper[4973]: I0320 13:26:09.959440 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6848e4e5-1d17-4bf7-8dee-bbbeddedd07d" path="/var/lib/kubelet/pods/6848e4e5-1d17-4bf7-8dee-bbbeddedd07d/volumes" Mar 20 13:26:09 crc kubenswrapper[4973]: I0320 13:26:09.978679 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qw9hm"] Mar 20 13:26:09 crc kubenswrapper[4973]: I0320 13:26:09.978962 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qw9hm" podUID="af380bf0-7c0d-4790-8ae4-19697763a37a" containerName="registry-server" containerID="cri-o://04cced65bee69b50ffa6b1f7a1c8897a39ee1d8f7772e201a35a0e6738271ca2" gracePeriod=2 Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.476302 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qw9hm" Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.562961 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwtgk\" (UniqueName: \"kubernetes.io/projected/af380bf0-7c0d-4790-8ae4-19697763a37a-kube-api-access-pwtgk\") pod \"af380bf0-7c0d-4790-8ae4-19697763a37a\" (UID: \"af380bf0-7c0d-4790-8ae4-19697763a37a\") " Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.563024 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af380bf0-7c0d-4790-8ae4-19697763a37a-catalog-content\") pod \"af380bf0-7c0d-4790-8ae4-19697763a37a\" (UID: \"af380bf0-7c0d-4790-8ae4-19697763a37a\") " Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.563116 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af380bf0-7c0d-4790-8ae4-19697763a37a-utilities\") pod \"af380bf0-7c0d-4790-8ae4-19697763a37a\" (UID: \"af380bf0-7c0d-4790-8ae4-19697763a37a\") " Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.564144 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af380bf0-7c0d-4790-8ae4-19697763a37a-utilities" (OuterVolumeSpecName: "utilities") pod "af380bf0-7c0d-4790-8ae4-19697763a37a" (UID: "af380bf0-7c0d-4790-8ae4-19697763a37a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.570309 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af380bf0-7c0d-4790-8ae4-19697763a37a-kube-api-access-pwtgk" (OuterVolumeSpecName: "kube-api-access-pwtgk") pod "af380bf0-7c0d-4790-8ae4-19697763a37a" (UID: "af380bf0-7c0d-4790-8ae4-19697763a37a"). InnerVolumeSpecName "kube-api-access-pwtgk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.664439 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af380bf0-7c0d-4790-8ae4-19697763a37a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.664479 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwtgk\" (UniqueName: \"kubernetes.io/projected/af380bf0-7c0d-4790-8ae4-19697763a37a-kube-api-access-pwtgk\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.684702 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af380bf0-7c0d-4790-8ae4-19697763a37a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af380bf0-7c0d-4790-8ae4-19697763a37a" (UID: "af380bf0-7c0d-4790-8ae4-19697763a37a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.758840 4973 generic.go:334] "Generic (PLEG): container finished" podID="af380bf0-7c0d-4790-8ae4-19697763a37a" containerID="04cced65bee69b50ffa6b1f7a1c8897a39ee1d8f7772e201a35a0e6738271ca2" exitCode=0 Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.758888 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw9hm" event={"ID":"af380bf0-7c0d-4790-8ae4-19697763a37a","Type":"ContainerDied","Data":"04cced65bee69b50ffa6b1f7a1c8897a39ee1d8f7772e201a35a0e6738271ca2"} Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.758907 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qw9hm" Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.758925 4973 scope.go:117] "RemoveContainer" containerID="04cced65bee69b50ffa6b1f7a1c8897a39ee1d8f7772e201a35a0e6738271ca2" Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.758915 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw9hm" event={"ID":"af380bf0-7c0d-4790-8ae4-19697763a37a","Type":"ContainerDied","Data":"3d684791e3e11579b746a59f772541ac3a689d302adef5db3402cb85cd0b432c"} Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.764823 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af380bf0-7c0d-4790-8ae4-19697763a37a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.781263 4973 scope.go:117] "RemoveContainer" containerID="ab09514a390f3f2a225800d5e736873b2145b086b315c63dc7a6e6db9e60e3f7" Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.783510 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qw9hm"] Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.786441 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qw9hm"] Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.810922 4973 scope.go:117] "RemoveContainer" containerID="5570c9afa0e5138df9257fd7b5615cfdab730c2e7fc1587ab05b1901e37f61b3" Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.821455 4973 scope.go:117] "RemoveContainer" containerID="04cced65bee69b50ffa6b1f7a1c8897a39ee1d8f7772e201a35a0e6738271ca2" Mar 20 13:26:10 crc kubenswrapper[4973]: E0320 13:26:10.821764 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04cced65bee69b50ffa6b1f7a1c8897a39ee1d8f7772e201a35a0e6738271ca2\": container with ID 
starting with 04cced65bee69b50ffa6b1f7a1c8897a39ee1d8f7772e201a35a0e6738271ca2 not found: ID does not exist" containerID="04cced65bee69b50ffa6b1f7a1c8897a39ee1d8f7772e201a35a0e6738271ca2" Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.821808 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04cced65bee69b50ffa6b1f7a1c8897a39ee1d8f7772e201a35a0e6738271ca2"} err="failed to get container status \"04cced65bee69b50ffa6b1f7a1c8897a39ee1d8f7772e201a35a0e6738271ca2\": rpc error: code = NotFound desc = could not find container \"04cced65bee69b50ffa6b1f7a1c8897a39ee1d8f7772e201a35a0e6738271ca2\": container with ID starting with 04cced65bee69b50ffa6b1f7a1c8897a39ee1d8f7772e201a35a0e6738271ca2 not found: ID does not exist" Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.821839 4973 scope.go:117] "RemoveContainer" containerID="ab09514a390f3f2a225800d5e736873b2145b086b315c63dc7a6e6db9e60e3f7" Mar 20 13:26:10 crc kubenswrapper[4973]: E0320 13:26:10.822146 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab09514a390f3f2a225800d5e736873b2145b086b315c63dc7a6e6db9e60e3f7\": container with ID starting with ab09514a390f3f2a225800d5e736873b2145b086b315c63dc7a6e6db9e60e3f7 not found: ID does not exist" containerID="ab09514a390f3f2a225800d5e736873b2145b086b315c63dc7a6e6db9e60e3f7" Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.822179 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab09514a390f3f2a225800d5e736873b2145b086b315c63dc7a6e6db9e60e3f7"} err="failed to get container status \"ab09514a390f3f2a225800d5e736873b2145b086b315c63dc7a6e6db9e60e3f7\": rpc error: code = NotFound desc = could not find container \"ab09514a390f3f2a225800d5e736873b2145b086b315c63dc7a6e6db9e60e3f7\": container with ID starting with ab09514a390f3f2a225800d5e736873b2145b086b315c63dc7a6e6db9e60e3f7 not found: 
ID does not exist" Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.822196 4973 scope.go:117] "RemoveContainer" containerID="5570c9afa0e5138df9257fd7b5615cfdab730c2e7fc1587ab05b1901e37f61b3" Mar 20 13:26:10 crc kubenswrapper[4973]: E0320 13:26:10.822662 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5570c9afa0e5138df9257fd7b5615cfdab730c2e7fc1587ab05b1901e37f61b3\": container with ID starting with 5570c9afa0e5138df9257fd7b5615cfdab730c2e7fc1587ab05b1901e37f61b3 not found: ID does not exist" containerID="5570c9afa0e5138df9257fd7b5615cfdab730c2e7fc1587ab05b1901e37f61b3" Mar 20 13:26:10 crc kubenswrapper[4973]: I0320 13:26:10.822693 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5570c9afa0e5138df9257fd7b5615cfdab730c2e7fc1587ab05b1901e37f61b3"} err="failed to get container status \"5570c9afa0e5138df9257fd7b5615cfdab730c2e7fc1587ab05b1901e37f61b3\": rpc error: code = NotFound desc = could not find container \"5570c9afa0e5138df9257fd7b5615cfdab730c2e7fc1587ab05b1901e37f61b3\": container with ID starting with 5570c9afa0e5138df9257fd7b5615cfdab730c2e7fc1587ab05b1901e37f61b3 not found: ID does not exist" Mar 20 13:26:11 crc kubenswrapper[4973]: I0320 13:26:11.922901 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"] Mar 20 13:26:11 crc kubenswrapper[4973]: I0320 13:26:11.923645 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58" podUID="cde7a8af-714b-4f14-816d-173bf104fa67" containerName="controller-manager" containerID="cri-o://e4e5dfde26a6b0977cbc4898fa2aa34bc869fc8eadc8ee97d436beeade2b6686" gracePeriod=30 Mar 20 13:26:11 crc kubenswrapper[4973]: I0320 13:26:11.964678 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="af380bf0-7c0d-4790-8ae4-19697763a37a" path="/var/lib/kubelet/pods/af380bf0-7c0d-4790-8ae4-19697763a37a/volumes" Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.004874 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"] Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.005088 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv" podUID="c3c687f3-1259-421f-a8cd-b6cde5d00784" containerName="route-controller-manager" containerID="cri-o://e8420f0cba3c466cfae89d55b3be5d04eea388997e9955dc93d0240748baa26a" gracePeriod=30 Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.467578 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv" Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.473173 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58" Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.482800 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3c687f3-1259-421f-a8cd-b6cde5d00784-config\") pod \"c3c687f3-1259-421f-a8cd-b6cde5d00784\" (UID: \"c3c687f3-1259-421f-a8cd-b6cde5d00784\") " Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.482861 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3c687f3-1259-421f-a8cd-b6cde5d00784-client-ca\") pod \"c3c687f3-1259-421f-a8cd-b6cde5d00784\" (UID: \"c3c687f3-1259-421f-a8cd-b6cde5d00784\") " Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.482904 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qb68\" (UniqueName: \"kubernetes.io/projected/cde7a8af-714b-4f14-816d-173bf104fa67-kube-api-access-2qb68\") pod \"cde7a8af-714b-4f14-816d-173bf104fa67\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.482964 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-spvlc\" (UniqueName: \"kubernetes.io/projected/c3c687f3-1259-421f-a8cd-b6cde5d00784-kube-api-access-spvlc\") pod \"c3c687f3-1259-421f-a8cd-b6cde5d00784\" (UID: \"c3c687f3-1259-421f-a8cd-b6cde5d00784\") " Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.483035 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cde7a8af-714b-4f14-816d-173bf104fa67-proxy-ca-bundles\") pod \"cde7a8af-714b-4f14-816d-173bf104fa67\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.483075 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cde7a8af-714b-4f14-816d-173bf104fa67-config\") pod \"cde7a8af-714b-4f14-816d-173bf104fa67\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.483121 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cde7a8af-714b-4f14-816d-173bf104fa67-serving-cert\") pod \"cde7a8af-714b-4f14-816d-173bf104fa67\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.483178 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cde7a8af-714b-4f14-816d-173bf104fa67-client-ca\") pod \"cde7a8af-714b-4f14-816d-173bf104fa67\" (UID: \"cde7a8af-714b-4f14-816d-173bf104fa67\") " Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.483231 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3c687f3-1259-421f-a8cd-b6cde5d00784-serving-cert\") pod \"c3c687f3-1259-421f-a8cd-b6cde5d00784\" (UID: \"c3c687f3-1259-421f-a8cd-b6cde5d00784\") " Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.484647 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cde7a8af-714b-4f14-816d-173bf104fa67-client-ca" (OuterVolumeSpecName: "client-ca") pod "cde7a8af-714b-4f14-816d-173bf104fa67" (UID: "cde7a8af-714b-4f14-816d-173bf104fa67"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.484762 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cde7a8af-714b-4f14-816d-173bf104fa67-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "cde7a8af-714b-4f14-816d-173bf104fa67" (UID: "cde7a8af-714b-4f14-816d-173bf104fa67"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.485408 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cde7a8af-714b-4f14-816d-173bf104fa67-config" (OuterVolumeSpecName: "config") pod "cde7a8af-714b-4f14-816d-173bf104fa67" (UID: "cde7a8af-714b-4f14-816d-173bf104fa67"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.485566 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3c687f3-1259-421f-a8cd-b6cde5d00784-client-ca" (OuterVolumeSpecName: "client-ca") pod "c3c687f3-1259-421f-a8cd-b6cde5d00784" (UID: "c3c687f3-1259-421f-a8cd-b6cde5d00784"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.485679 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3c687f3-1259-421f-a8cd-b6cde5d00784-config" (OuterVolumeSpecName: "config") pod "c3c687f3-1259-421f-a8cd-b6cde5d00784" (UID: "c3c687f3-1259-421f-a8cd-b6cde5d00784"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.489913 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cde7a8af-714b-4f14-816d-173bf104fa67-kube-api-access-2qb68" (OuterVolumeSpecName: "kube-api-access-2qb68") pod "cde7a8af-714b-4f14-816d-173bf104fa67" (UID: "cde7a8af-714b-4f14-816d-173bf104fa67"). InnerVolumeSpecName "kube-api-access-2qb68". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.494922 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3c687f3-1259-421f-a8cd-b6cde5d00784-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c3c687f3-1259-421f-a8cd-b6cde5d00784" (UID: "c3c687f3-1259-421f-a8cd-b6cde5d00784"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.494964 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cde7a8af-714b-4f14-816d-173bf104fa67-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "cde7a8af-714b-4f14-816d-173bf104fa67" (UID: "cde7a8af-714b-4f14-816d-173bf104fa67"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.496410 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3c687f3-1259-421f-a8cd-b6cde5d00784-kube-api-access-spvlc" (OuterVolumeSpecName: "kube-api-access-spvlc") pod "c3c687f3-1259-421f-a8cd-b6cde5d00784" (UID: "c3c687f3-1259-421f-a8cd-b6cde5d00784"). InnerVolumeSpecName "kube-api-access-spvlc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.585927 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cde7a8af-714b-4f14-816d-173bf104fa67-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.585979 4973 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/cde7a8af-714b-4f14-816d-173bf104fa67-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.585995 4973 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c3c687f3-1259-421f-a8cd-b6cde5d00784-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.586009 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c3c687f3-1259-421f-a8cd-b6cde5d00784-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.586022 4973 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c3c687f3-1259-421f-a8cd-b6cde5d00784-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.586038 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qb68\" (UniqueName: \"kubernetes.io/projected/cde7a8af-714b-4f14-816d-173bf104fa67-kube-api-access-2qb68\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.586058 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-spvlc\" (UniqueName: \"kubernetes.io/projected/c3c687f3-1259-421f-a8cd-b6cde5d00784-kube-api-access-spvlc\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.586073 4973 
reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/cde7a8af-714b-4f14-816d-173bf104fa67-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.586091 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cde7a8af-714b-4f14-816d-173bf104fa67-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.774553 4973 generic.go:334] "Generic (PLEG): container finished" podID="c3c687f3-1259-421f-a8cd-b6cde5d00784" containerID="e8420f0cba3c466cfae89d55b3be5d04eea388997e9955dc93d0240748baa26a" exitCode=0
Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.774632 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"
Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.774673 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv" event={"ID":"c3c687f3-1259-421f-a8cd-b6cde5d00784","Type":"ContainerDied","Data":"e8420f0cba3c466cfae89d55b3be5d04eea388997e9955dc93d0240748baa26a"}
Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.774738 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv" event={"ID":"c3c687f3-1259-421f-a8cd-b6cde5d00784","Type":"ContainerDied","Data":"23922da399645c983172da906b2cecc3348365aecd4423cd152c3225fe1630e1"}
Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.774765 4973 scope.go:117] "RemoveContainer" containerID="e8420f0cba3c466cfae89d55b3be5d04eea388997e9955dc93d0240748baa26a"
Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.778794 4973 generic.go:334] "Generic (PLEG): container finished" podID="cde7a8af-714b-4f14-816d-173bf104fa67" containerID="e4e5dfde26a6b0977cbc4898fa2aa34bc869fc8eadc8ee97d436beeade2b6686" exitCode=0
Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.778840 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58" event={"ID":"cde7a8af-714b-4f14-816d-173bf104fa67","Type":"ContainerDied","Data":"e4e5dfde26a6b0977cbc4898fa2aa34bc869fc8eadc8ee97d436beeade2b6686"}
Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.778868 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58" event={"ID":"cde7a8af-714b-4f14-816d-173bf104fa67","Type":"ContainerDied","Data":"74d281c1066276d47598997c1e8aa469922a54f10c0cac11f75a39a70dc140ec"}
Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.778877 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"
Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.794141 4973 scope.go:117] "RemoveContainer" containerID="e8420f0cba3c466cfae89d55b3be5d04eea388997e9955dc93d0240748baa26a"
Mar 20 13:26:12 crc kubenswrapper[4973]: E0320 13:26:12.794612 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8420f0cba3c466cfae89d55b3be5d04eea388997e9955dc93d0240748baa26a\": container with ID starting with e8420f0cba3c466cfae89d55b3be5d04eea388997e9955dc93d0240748baa26a not found: ID does not exist" containerID="e8420f0cba3c466cfae89d55b3be5d04eea388997e9955dc93d0240748baa26a"
Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.794658 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8420f0cba3c466cfae89d55b3be5d04eea388997e9955dc93d0240748baa26a"} err="failed to get container status \"e8420f0cba3c466cfae89d55b3be5d04eea388997e9955dc93d0240748baa26a\": rpc error: code = NotFound desc = could not find container \"e8420f0cba3c466cfae89d55b3be5d04eea388997e9955dc93d0240748baa26a\": container with ID starting with e8420f0cba3c466cfae89d55b3be5d04eea388997e9955dc93d0240748baa26a not found: ID does not exist"
Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.794690 4973 scope.go:117] "RemoveContainer" containerID="e4e5dfde26a6b0977cbc4898fa2aa34bc869fc8eadc8ee97d436beeade2b6686"
Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.810298 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"]
Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.812660 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5799975c57-rf5wv"]
Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.817816 4973 scope.go:117] "RemoveContainer" containerID="e4e5dfde26a6b0977cbc4898fa2aa34bc869fc8eadc8ee97d436beeade2b6686"
Mar 20 13:26:12 crc kubenswrapper[4973]: E0320 13:26:12.819036 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4e5dfde26a6b0977cbc4898fa2aa34bc869fc8eadc8ee97d436beeade2b6686\": container with ID starting with e4e5dfde26a6b0977cbc4898fa2aa34bc869fc8eadc8ee97d436beeade2b6686 not found: ID does not exist" containerID="e4e5dfde26a6b0977cbc4898fa2aa34bc869fc8eadc8ee97d436beeade2b6686"
Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.819068 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4e5dfde26a6b0977cbc4898fa2aa34bc869fc8eadc8ee97d436beeade2b6686"} err="failed to get container status \"e4e5dfde26a6b0977cbc4898fa2aa34bc869fc8eadc8ee97d436beeade2b6686\": rpc error: code = NotFound desc = could not find container \"e4e5dfde26a6b0977cbc4898fa2aa34bc869fc8eadc8ee97d436beeade2b6686\": container with ID starting with e4e5dfde26a6b0977cbc4898fa2aa34bc869fc8eadc8ee97d436beeade2b6686 not found: ID does not exist"
Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.827597 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"]
Mar 20 13:26:12 crc kubenswrapper[4973]: I0320 13:26:12.830123 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6ddb884dbd-9wz58"]
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.321038 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.321116 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.321173 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.322078 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"abcd771297c72ebf201123700b92083f01a58f420bf04337fc21af73f83a8fbc"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.322307 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" containerID="cri-o://abcd771297c72ebf201123700b92083f01a58f420bf04337fc21af73f83a8fbc" gracePeriod=600
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.519325 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f85f9899-lkzwb"]
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.519959 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6848e4e5-1d17-4bf7-8dee-bbbeddedd07d" containerName="extract-utilities"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.519977 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="6848e4e5-1d17-4bf7-8dee-bbbeddedd07d" containerName="extract-utilities"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.519989 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6848e4e5-1d17-4bf7-8dee-bbbeddedd07d" containerName="extract-content"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.519997 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="6848e4e5-1d17-4bf7-8dee-bbbeddedd07d" containerName="extract-content"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.520014 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce12ac02-5b58-44a3-a311-8cdd000ce41b" containerName="registry-server"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520024 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce12ac02-5b58-44a3-a311-8cdd000ce41b" containerName="registry-server"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.520042 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f10e2a0a-df92-4a54-98e0-382851137211" containerName="oc"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520051 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f10e2a0a-df92-4a54-98e0-382851137211" containerName="oc"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.520068 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af380bf0-7c0d-4790-8ae4-19697763a37a" containerName="extract-content"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520077 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="af380bf0-7c0d-4790-8ae4-19697763a37a" containerName="extract-content"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.520094 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce12ac02-5b58-44a3-a311-8cdd000ce41b" containerName="extract-content"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520104 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce12ac02-5b58-44a3-a311-8cdd000ce41b" containerName="extract-content"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.520126 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6848e4e5-1d17-4bf7-8dee-bbbeddedd07d" containerName="registry-server"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520134 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="6848e4e5-1d17-4bf7-8dee-bbbeddedd07d" containerName="registry-server"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.520152 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf96ec01-20ab-4529-ad42-d839540c3d8e" containerName="extract-utilities"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520161 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf96ec01-20ab-4529-ad42-d839540c3d8e" containerName="extract-utilities"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.520173 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af380bf0-7c0d-4790-8ae4-19697763a37a" containerName="extract-utilities"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520185 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="af380bf0-7c0d-4790-8ae4-19697763a37a" containerName="extract-utilities"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.520199 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf96ec01-20ab-4529-ad42-d839540c3d8e" containerName="extract-content"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520207 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf96ec01-20ab-4529-ad42-d839540c3d8e" containerName="extract-content"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.520216 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf96ec01-20ab-4529-ad42-d839540c3d8e" containerName="registry-server"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520228 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf96ec01-20ab-4529-ad42-d839540c3d8e" containerName="registry-server"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.520239 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cde7a8af-714b-4f14-816d-173bf104fa67" containerName="controller-manager"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520247 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="cde7a8af-714b-4f14-816d-173bf104fa67" containerName="controller-manager"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.520261 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af380bf0-7c0d-4790-8ae4-19697763a37a" containerName="registry-server"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520267 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="af380bf0-7c0d-4790-8ae4-19697763a37a" containerName="registry-server"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.520280 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3c687f3-1259-421f-a8cd-b6cde5d00784" containerName="route-controller-manager"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520286 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3c687f3-1259-421f-a8cd-b6cde5d00784" containerName="route-controller-manager"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.520298 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce12ac02-5b58-44a3-a311-8cdd000ce41b" containerName="extract-utilities"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520304 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce12ac02-5b58-44a3-a311-8cdd000ce41b" containerName="extract-utilities"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520512 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="6848e4e5-1d17-4bf7-8dee-bbbeddedd07d" containerName="registry-server"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520524 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3c687f3-1259-421f-a8cd-b6cde5d00784" containerName="route-controller-manager"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520540 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="f10e2a0a-df92-4a54-98e0-382851137211" containerName="oc"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520552 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce12ac02-5b58-44a3-a311-8cdd000ce41b" containerName="registry-server"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520567 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="af380bf0-7c0d-4790-8ae4-19697763a37a" containerName="registry-server"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520579 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf96ec01-20ab-4529-ad42-d839540c3d8e" containerName="registry-server"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.520592 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="cde7a8af-714b-4f14-816d-173bf104fa67" containerName="controller-manager"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.521108 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.528706 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.528856 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.529495 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.529748 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.529892 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.529954 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.534301 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h"]
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.536593 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.537505 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.538292 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.545037 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.545044 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.546427 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.546643 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.546976 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.549647 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f85f9899-lkzwb"]
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.556657 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h"]
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.608807 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e04b3fba-1427-496a-b880-f61867a2c3ac-config\") pod \"controller-manager-6f85f9899-lkzwb\" (UID: \"e04b3fba-1427-496a-b880-f61867a2c3ac\") " pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.608880 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/451656d5-3bd4-402b-98b9-202b3ac829e1-config\") pod \"route-controller-manager-b56c97db-xrm2h\" (UID: \"451656d5-3bd4-402b-98b9-202b3ac829e1\") " pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.608935 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbktw\" (UniqueName: \"kubernetes.io/projected/e04b3fba-1427-496a-b880-f61867a2c3ac-kube-api-access-cbktw\") pod \"controller-manager-6f85f9899-lkzwb\" (UID: \"e04b3fba-1427-496a-b880-f61867a2c3ac\") " pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.608972 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/451656d5-3bd4-402b-98b9-202b3ac829e1-serving-cert\") pod \"route-controller-manager-b56c97db-xrm2h\" (UID: \"451656d5-3bd4-402b-98b9-202b3ac829e1\") " pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.609014 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5zd5\" (UniqueName: \"kubernetes.io/projected/451656d5-3bd4-402b-98b9-202b3ac829e1-kube-api-access-n5zd5\") pod \"route-controller-manager-b56c97db-xrm2h\" (UID: \"451656d5-3bd4-402b-98b9-202b3ac829e1\") " pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.609041 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/451656d5-3bd4-402b-98b9-202b3ac829e1-client-ca\") pod \"route-controller-manager-b56c97db-xrm2h\" (UID: \"451656d5-3bd4-402b-98b9-202b3ac829e1\") " pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.609066 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e04b3fba-1427-496a-b880-f61867a2c3ac-serving-cert\") pod \"controller-manager-6f85f9899-lkzwb\" (UID: \"e04b3fba-1427-496a-b880-f61867a2c3ac\") " pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.609092 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e04b3fba-1427-496a-b880-f61867a2c3ac-proxy-ca-bundles\") pod \"controller-manager-6f85f9899-lkzwb\" (UID: \"e04b3fba-1427-496a-b880-f61867a2c3ac\") " pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.609137 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e04b3fba-1427-496a-b880-f61867a2c3ac-client-ca\") pod \"controller-manager-6f85f9899-lkzwb\" (UID: \"e04b3fba-1427-496a-b880-f61867a2c3ac\") " pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.703034 4973 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.704151 4973 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.704321 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.704620 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d" gracePeriod=15
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.704696 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168" gracePeriod=15
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.704786 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70" gracePeriod=15
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.704759 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b" gracePeriod=15
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.704936 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a" gracePeriod=15
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.705730 4973 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"]
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.706107 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.706142 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.706161 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.706173 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.706191 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.706205 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.706233 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.706246 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.706262 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.706275 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.706296 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.706307 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.706323 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.706341 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.706412 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.706426 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.706445 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.706457 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.706616 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.706635 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.706654 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.706667 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.706684 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.706697 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.706711 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.706729 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.706929 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.706946 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.707117 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.710022 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/451656d5-3bd4-402b-98b9-202b3ac829e1-client-ca\") pod \"route-controller-manager-b56c97db-xrm2h\" (UID: \"451656d5-3bd4-402b-98b9-202b3ac829e1\") " pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.710071 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e04b3fba-1427-496a-b880-f61867a2c3ac-serving-cert\") pod \"controller-manager-6f85f9899-lkzwb\" (UID: \"e04b3fba-1427-496a-b880-f61867a2c3ac\") " pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.710103 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e04b3fba-1427-496a-b880-f61867a2c3ac-proxy-ca-bundles\") pod \"controller-manager-6f85f9899-lkzwb\" (UID: \"e04b3fba-1427-496a-b880-f61867a2c3ac\") " pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.710149 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e04b3fba-1427-496a-b880-f61867a2c3ac-client-ca\") pod \"controller-manager-6f85f9899-lkzwb\" (UID: \"e04b3fba-1427-496a-b880-f61867a2c3ac\") " pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.710185 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e04b3fba-1427-496a-b880-f61867a2c3ac-config\") pod \"controller-manager-6f85f9899-lkzwb\" (UID: \"e04b3fba-1427-496a-b880-f61867a2c3ac\") " pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.710227 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/451656d5-3bd4-402b-98b9-202b3ac829e1-config\") pod \"route-controller-manager-b56c97db-xrm2h\" (UID: \"451656d5-3bd4-402b-98b9-202b3ac829e1\") " pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.710261 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbktw\" (UniqueName: \"kubernetes.io/projected/e04b3fba-1427-496a-b880-f61867a2c3ac-kube-api-access-cbktw\") pod \"controller-manager-6f85f9899-lkzwb\" (UID: \"e04b3fba-1427-496a-b880-f61867a2c3ac\") " pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.710290 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/451656d5-3bd4-402b-98b9-202b3ac829e1-serving-cert\") pod \"route-controller-manager-b56c97db-xrm2h\" (UID: \"451656d5-3bd4-402b-98b9-202b3ac829e1\") " pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.710317 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5zd5\" (UniqueName: \"kubernetes.io/projected/451656d5-3bd4-402b-98b9-202b3ac829e1-kube-api-access-n5zd5\") pod \"route-controller-manager-b56c97db-xrm2h\" (UID: \"451656d5-3bd4-402b-98b9-202b3ac829e1\") " pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.711307 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e04b3fba-1427-496a-b880-f61867a2c3ac-client-ca\") pod \"controller-manager-6f85f9899-lkzwb\" (UID: \"e04b3fba-1427-496a-b880-f61867a2c3ac\") " pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.711689 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e04b3fba-1427-496a-b880-f61867a2c3ac-proxy-ca-bundles\") pod \"controller-manager-6f85f9899-lkzwb\" (UID: \"e04b3fba-1427-496a-b880-f61867a2c3ac\") " pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.712545 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/451656d5-3bd4-402b-98b9-202b3ac829e1-client-ca\") pod \"route-controller-manager-b56c97db-xrm2h\" (UID: \"451656d5-3bd4-402b-98b9-202b3ac829e1\") " pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.718693 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e04b3fba-1427-496a-b880-f61867a2c3ac-config\") pod \"controller-manager-6f85f9899-lkzwb\" (UID: \"e04b3fba-1427-496a-b880-f61867a2c3ac\") " pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.724285 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e04b3fba-1427-496a-b880-f61867a2c3ac-serving-cert\") pod \"controller-manager-6f85f9899-lkzwb\" (UID: \"e04b3fba-1427-496a-b880-f61867a2c3ac\") " pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.725942 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/451656d5-3bd4-402b-98b9-202b3ac829e1-config\") pod \"route-controller-manager-b56c97db-xrm2h\" (UID: \"451656d5-3bd4-402b-98b9-202b3ac829e1\") " pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.726922 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/451656d5-3bd4-402b-98b9-202b3ac829e1-serving-cert\") pod \"route-controller-manager-b56c97db-xrm2h\" (UID: \"451656d5-3bd4-402b-98b9-202b3ac829e1\") " pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.730606 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbktw\" (UniqueName: \"kubernetes.io/projected/e04b3fba-1427-496a-b880-f61867a2c3ac-kube-api-access-cbktw\") pod \"controller-manager-6f85f9899-lkzwb\" (UID: \"e04b3fba-1427-496a-b880-f61867a2c3ac\") " pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.730967 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5zd5\" (UniqueName: \"kubernetes.io/projected/451656d5-3bd4-402b-98b9-202b3ac829e1-kube-api-access-n5zd5\") pod \"route-controller-manager-b56c97db-xrm2h\" (UID: \"451656d5-3bd4-402b-98b9-202b3ac829e1\") " pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h"
Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.772930 4973 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.75:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.799424 4973 generic.go:334] "Generic (PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="abcd771297c72ebf201123700b92083f01a58f420bf04337fc21af73f83a8fbc" exitCode=0
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.799472 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"abcd771297c72ebf201123700b92083f01a58f420bf04337fc21af73f83a8fbc"}
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.814713 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.814774 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.814802 4973 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.814861 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.814889 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.814914 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.814939 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:26:13 crc 
kubenswrapper[4973]: I0320 13:26:13.814980 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.856531 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.873094 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.915678 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.915740 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.915762 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: 
I0320 13:26:13.915779 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.915824 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.915862 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.915857 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.915893 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.915931 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.915973 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.915949 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.916015 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.916079 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.916129 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.916147 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.916184 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.921310 4973 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.922053 4973 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.922775 4973 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.923197 4973 controller.go:195] "Failed to 
update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.923508 4973 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.923542 4973 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 13:26:13 crc kubenswrapper[4973]: E0320 13:26:13.923866 4973 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="200ms" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.962272 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3c687f3-1259-421f-a8cd-b6cde5d00784" path="/var/lib/kubelet/pods/c3c687f3-1259-421f-a8cd-b6cde5d00784/volumes" Mar 20 13:26:13 crc kubenswrapper[4973]: I0320 13:26:13.964686 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cde7a8af-714b-4f14-816d-173bf104fa67" path="/var/lib/kubelet/pods/cde7a8af-714b-4f14-816d-173bf104fa67/volumes" Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.073782 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:26:14 crc kubenswrapper[4973]: E0320 13:26:14.085801 4973 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events/machine-config-daemon-qlztx.189e8f641acc2b6a\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{machine-config-daemon-qlztx.189e8f641acc2b6a openshift-machine-config-operator 26787 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:machine-config-daemon-qlztx,UID:70745a45-4eff-4e56-b9ab-efa4a7c83306,APIVersion:v1,ResourceVersion:26667,FieldPath:spec.containers{machine-config-daemon},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:22:43 +0000 UTC,LastTimestamp:2026-03-20 13:26:14.085137871 +0000 UTC m=+294.828807615,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:26:14 crc kubenswrapper[4973]: W0320 13:26:14.090278 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-d4192bd5dc1bd7b03b58f1a9b2ab2af7a5048eb0fb098c50500a58e6609c8257 WatchSource:0}: Error finding container d4192bd5dc1bd7b03b58f1a9b2ab2af7a5048eb0fb098c50500a58e6609c8257: Status 404 returned error can't find the container with id d4192bd5dc1bd7b03b58f1a9b2ab2af7a5048eb0fb098c50500a58e6609c8257 Mar 20 13:26:14 crc kubenswrapper[4973]: E0320 13:26:14.124783 4973 controller.go:145] "Failed to ensure lease exists, will 
retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="400ms" Mar 20 13:26:14 crc kubenswrapper[4973]: E0320 13:26:14.449579 4973 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 13:26:14 crc kubenswrapper[4973]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-6f85f9899-lkzwb_openshift-controller-manager_e04b3fba-1427-496a-b880-f61867a2c3ac_0(fd2a12b3ff233175ab126755cee2463822f3311170c11f5efc172f9bea6476b4): error adding pod openshift-controller-manager_controller-manager-6f85f9899-lkzwb to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"fd2a12b3ff233175ab126755cee2463822f3311170c11f5efc172f9bea6476b4" Netns:"/var/run/netns/66bf6572-da5d-4985-a6d2-5e9b80c52227" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-6f85f9899-lkzwb;K8S_POD_INFRA_CONTAINER_ID=fd2a12b3ff233175ab126755cee2463822f3311170c11f5efc172f9bea6476b4;K8S_POD_UID=e04b3fba-1427-496a-b880-f61867a2c3ac" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-6f85f9899-lkzwb] networking: Multus: [openshift-controller-manager/controller-manager-6f85f9899-lkzwb/e04b3fba-1427-496a-b880-f61867a2c3ac]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f85f9899-lkzwb?timeout=1m0s": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:26:14 crc 
kubenswrapper[4973]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:26:14 crc kubenswrapper[4973]: > Mar 20 13:26:14 crc kubenswrapper[4973]: E0320 13:26:14.450013 4973 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 13:26:14 crc kubenswrapper[4973]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-6f85f9899-lkzwb_openshift-controller-manager_e04b3fba-1427-496a-b880-f61867a2c3ac_0(fd2a12b3ff233175ab126755cee2463822f3311170c11f5efc172f9bea6476b4): error adding pod openshift-controller-manager_controller-manager-6f85f9899-lkzwb to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"fd2a12b3ff233175ab126755cee2463822f3311170c11f5efc172f9bea6476b4" Netns:"/var/run/netns/66bf6572-da5d-4985-a6d2-5e9b80c52227" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-6f85f9899-lkzwb;K8S_POD_INFRA_CONTAINER_ID=fd2a12b3ff233175ab126755cee2463822f3311170c11f5efc172f9bea6476b4;K8S_POD_UID=e04b3fba-1427-496a-b880-f61867a2c3ac" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-6f85f9899-lkzwb] networking: Multus: [openshift-controller-manager/controller-manager-6f85f9899-lkzwb/e04b3fba-1427-496a-b880-f61867a2c3ac]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: 
status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f85f9899-lkzwb?timeout=1m0s": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:26:14 crc kubenswrapper[4973]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:26:14 crc kubenswrapper[4973]: > pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" Mar 20 13:26:14 crc kubenswrapper[4973]: E0320 13:26:14.450035 4973 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 20 13:26:14 crc kubenswrapper[4973]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-6f85f9899-lkzwb_openshift-controller-manager_e04b3fba-1427-496a-b880-f61867a2c3ac_0(fd2a12b3ff233175ab126755cee2463822f3311170c11f5efc172f9bea6476b4): error adding pod openshift-controller-manager_controller-manager-6f85f9899-lkzwb to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"fd2a12b3ff233175ab126755cee2463822f3311170c11f5efc172f9bea6476b4" Netns:"/var/run/netns/66bf6572-da5d-4985-a6d2-5e9b80c52227" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-6f85f9899-lkzwb;K8S_POD_INFRA_CONTAINER_ID=fd2a12b3ff233175ab126755cee2463822f3311170c11f5efc172f9bea6476b4;K8S_POD_UID=e04b3fba-1427-496a-b880-f61867a2c3ac" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-6f85f9899-lkzwb] networking: Multus: 
[openshift-controller-manager/controller-manager-6f85f9899-lkzwb/e04b3fba-1427-496a-b880-f61867a2c3ac]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f85f9899-lkzwb?timeout=1m0s": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:26:14 crc kubenswrapper[4973]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:26:14 crc kubenswrapper[4973]: > pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" Mar 20 13:26:14 crc kubenswrapper[4973]: E0320 13:26:14.450092 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-6f85f9899-lkzwb_openshift-controller-manager(e04b3fba-1427-496a-b880-f61867a2c3ac)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-6f85f9899-lkzwb_openshift-controller-manager(e04b3fba-1427-496a-b880-f61867a2c3ac)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-6f85f9899-lkzwb_openshift-controller-manager_e04b3fba-1427-496a-b880-f61867a2c3ac_0(fd2a12b3ff233175ab126755cee2463822f3311170c11f5efc172f9bea6476b4): error adding pod openshift-controller-manager_controller-manager-6f85f9899-lkzwb to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed 
(add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"fd2a12b3ff233175ab126755cee2463822f3311170c11f5efc172f9bea6476b4\\\" Netns:\\\"/var/run/netns/66bf6572-da5d-4985-a6d2-5e9b80c52227\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-6f85f9899-lkzwb;K8S_POD_INFRA_CONTAINER_ID=fd2a12b3ff233175ab126755cee2463822f3311170c11f5efc172f9bea6476b4;K8S_POD_UID=e04b3fba-1427-496a-b880-f61867a2c3ac\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-6f85f9899-lkzwb] networking: Multus: [openshift-controller-manager/controller-manager-6f85f9899-lkzwb/e04b3fba-1427-496a-b880-f61867a2c3ac]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f85f9899-lkzwb?timeout=1m0s\\\": dial tcp 38.102.83.75:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" podUID="e04b3fba-1427-496a-b880-f61867a2c3ac" Mar 20 13:26:14 crc kubenswrapper[4973]: E0320 13:26:14.527042 4973 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="800ms" Mar 20 13:26:14 crc kubenswrapper[4973]: E0320 13:26:14.550048 4973 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 13:26:14 crc kubenswrapper[4973]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-b56c97db-xrm2h_openshift-route-controller-manager_451656d5-3bd4-402b-98b9-202b3ac829e1_0(18431feda4a83b786c98ade22f4a48c2250ebebaeeafe5491980a0c29447c70d): error adding pod openshift-route-controller-manager_route-controller-manager-b56c97db-xrm2h to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"18431feda4a83b786c98ade22f4a48c2250ebebaeeafe5491980a0c29447c70d" Netns:"/var/run/netns/d616591e-c791-42b2-ae40-55ac17b63b31" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-b56c97db-xrm2h;K8S_POD_INFRA_CONTAINER_ID=18431feda4a83b786c98ade22f4a48c2250ebebaeeafe5491980a0c29447c70d;K8S_POD_UID=451656d5-3bd4-402b-98b9-202b3ac829e1" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h] networking: Multus: [openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h/451656d5-3bd4-402b-98b9-202b3ac829e1]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-b56c97db-xrm2h in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-b56c97db-xrm2h in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b56c97db-xrm2h?timeout=1m0s": dial tcp 
38.102.83.75:6443: connect: connection refused Mar 20 13:26:14 crc kubenswrapper[4973]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:26:14 crc kubenswrapper[4973]: > Mar 20 13:26:14 crc kubenswrapper[4973]: E0320 13:26:14.550119 4973 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 13:26:14 crc kubenswrapper[4973]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-b56c97db-xrm2h_openshift-route-controller-manager_451656d5-3bd4-402b-98b9-202b3ac829e1_0(18431feda4a83b786c98ade22f4a48c2250ebebaeeafe5491980a0c29447c70d): error adding pod openshift-route-controller-manager_route-controller-manager-b56c97db-xrm2h to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"18431feda4a83b786c98ade22f4a48c2250ebebaeeafe5491980a0c29447c70d" Netns:"/var/run/netns/d616591e-c791-42b2-ae40-55ac17b63b31" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-b56c97db-xrm2h;K8S_POD_INFRA_CONTAINER_ID=18431feda4a83b786c98ade22f4a48c2250ebebaeeafe5491980a0c29447c70d;K8S_POD_UID=451656d5-3bd4-402b-98b9-202b3ac829e1" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h] networking: Multus: [openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h/451656d5-3bd4-402b-98b9-202b3ac829e1]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-b56c97db-xrm2h in 
out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-b56c97db-xrm2h in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b56c97db-xrm2h?timeout=1m0s": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:26:14 crc kubenswrapper[4973]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:26:14 crc kubenswrapper[4973]: > pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" Mar 20 13:26:14 crc kubenswrapper[4973]: E0320 13:26:14.550138 4973 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 20 13:26:14 crc kubenswrapper[4973]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-b56c97db-xrm2h_openshift-route-controller-manager_451656d5-3bd4-402b-98b9-202b3ac829e1_0(18431feda4a83b786c98ade22f4a48c2250ebebaeeafe5491980a0c29447c70d): error adding pod openshift-route-controller-manager_route-controller-manager-b56c97db-xrm2h to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"18431feda4a83b786c98ade22f4a48c2250ebebaeeafe5491980a0c29447c70d" Netns:"/var/run/netns/d616591e-c791-42b2-ae40-55ac17b63b31" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-b56c97db-xrm2h;K8S_POD_INFRA_CONTAINER_ID=18431feda4a83b786c98ade22f4a48c2250ebebaeeafe5491980a0c29447c70d;K8S_POD_UID=451656d5-3bd4-402b-98b9-202b3ac829e1" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h] networking: Multus: [openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h/451656d5-3bd4-402b-98b9-202b3ac829e1]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-b56c97db-xrm2h in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-b56c97db-xrm2h in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b56c97db-xrm2h?timeout=1m0s": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:26:14 crc kubenswrapper[4973]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:26:14 crc kubenswrapper[4973]: > pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" Mar 20 13:26:14 crc kubenswrapper[4973]: E0320 13:26:14.550195 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-b56c97db-xrm2h_openshift-route-controller-manager(451656d5-3bd4-402b-98b9-202b3ac829e1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"route-controller-manager-b56c97db-xrm2h_openshift-route-controller-manager(451656d5-3bd4-402b-98b9-202b3ac829e1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-b56c97db-xrm2h_openshift-route-controller-manager_451656d5-3bd4-402b-98b9-202b3ac829e1_0(18431feda4a83b786c98ade22f4a48c2250ebebaeeafe5491980a0c29447c70d): error adding pod openshift-route-controller-manager_route-controller-manager-b56c97db-xrm2h to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"18431feda4a83b786c98ade22f4a48c2250ebebaeeafe5491980a0c29447c70d\\\" Netns:\\\"/var/run/netns/d616591e-c791-42b2-ae40-55ac17b63b31\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-b56c97db-xrm2h;K8S_POD_INFRA_CONTAINER_ID=18431feda4a83b786c98ade22f4a48c2250ebebaeeafe5491980a0c29447c70d;K8S_POD_UID=451656d5-3bd4-402b-98b9-202b3ac829e1\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h] networking: Multus: [openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h/451656d5-3bd4-402b-98b9-202b3ac829e1]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-b56c97db-xrm2h in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-b56c97db-xrm2h in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b56c97db-xrm2h?timeout=1m0s\\\": dial tcp 38.102.83.75:6443: connect: connection refused\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" podUID="451656d5-3bd4-402b-98b9-202b3ac829e1" Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.807465 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"4879829748d04247919d4584e9d504332cd27c4463067eb66efb07235043065c"} Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.807524 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d4192bd5dc1bd7b03b58f1a9b2ab2af7a5048eb0fb098c50500a58e6609c8257"} Mar 20 13:26:14 crc kubenswrapper[4973]: E0320 13:26:14.808224 4973 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.75:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.810927 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.812314 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.813001 4973 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70" exitCode=0 Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.813035 4973 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a" exitCode=0 Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.813044 4973 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168" exitCode=0 Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.813054 4973 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b" exitCode=2 Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.813098 4973 scope.go:117] "RemoveContainer" containerID="923e6573aaa51d68d755becd8e952c0e0a42b28c31552b3035c421a17c2c0365" Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.815600 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"52d5fd2368a231dd70e0a2d1cd4e97da6e1bd0ca50431edd3ab2ddcc1bd88dec"} Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.816229 4973 status_manager.go:851] "Failed to get status for pod" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-qlztx\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.817515 4973 generic.go:334] "Generic (PLEG): container finished" podID="48634fee-db9f-4111-abac-74b05132eaa9" containerID="4b272b7dc2a5fe12dbeca86072cdef8eb91b88b9fa5d8ea961223e81c1498768" exitCode=0 Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.817572 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.817591 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"48634fee-db9f-4111-abac-74b05132eaa9","Type":"ContainerDied","Data":"4b272b7dc2a5fe12dbeca86072cdef8eb91b88b9fa5d8ea961223e81c1498768"} Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.817634 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.817887 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.817950 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.818145 4973 status_manager.go:851] "Failed to get status for pod" podUID="48634fee-db9f-4111-abac-74b05132eaa9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:14 crc kubenswrapper[4973]: I0320 13:26:14.818394 4973 status_manager.go:851] "Failed to get status for pod" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-qlztx\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:14 crc kubenswrapper[4973]: E0320 13:26:14.939069 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:14Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:14Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:14Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:14Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4869c69128f74d9c3b178ea6c8c8d38df169b6bce05eb821a65f0aaf514c563a\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:f3c2ad90e251062165f8d6623ca4994c0b3e28324e4b5b17fd588b162ec97766\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746912226},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1a4192be260d654b8925c9da948e9e46b5d16700f8c0110fbac99cde2728f126\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a56fc50b2a9b02dc273af5c247943f865ae57ce5e3fb338e1f48ea5a3732cec5\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252700376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"
sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:29f12af0e8b18530a7709224088624d4544d83dc113d7305c831632505312453\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8b88035291b6bc6b899463fae9a0cd90aee8057575fb645dc719184dece70a0b\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223676638},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},
{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d
8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":
[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:14 crc kubenswrapper[4973]: E0320 13:26:14.939929 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:14 crc kubenswrapper[4973]: E0320 13:26:14.940217 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:14 crc kubenswrapper[4973]: E0320 13:26:14.940434 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:14 crc kubenswrapper[4973]: E0320 13:26:14.940609 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:14 crc kubenswrapper[4973]: E0320 13:26:14.940626 4973 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:26:15 
crc kubenswrapper[4973]: E0320 13:26:15.328165 4973 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="1.6s" Mar 20 13:26:15 crc kubenswrapper[4973]: E0320 13:26:15.630840 4973 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 13:26:15 crc kubenswrapper[4973]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-6f85f9899-lkzwb_openshift-controller-manager_e04b3fba-1427-496a-b880-f61867a2c3ac_0(7788b797ca32b45ff4303038b049f6f56d2be03f317f01695b8f695b7779cfaf): error adding pod openshift-controller-manager_controller-manager-6f85f9899-lkzwb to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7788b797ca32b45ff4303038b049f6f56d2be03f317f01695b8f695b7779cfaf" Netns:"/var/run/netns/348cfc46-7f71-47c7-bb64-3c2ea6e75138" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-6f85f9899-lkzwb;K8S_POD_INFRA_CONTAINER_ID=7788b797ca32b45ff4303038b049f6f56d2be03f317f01695b8f695b7779cfaf;K8S_POD_UID=e04b3fba-1427-496a-b880-f61867a2c3ac" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-6f85f9899-lkzwb] networking: Multus: [openshift-controller-manager/controller-manager-6f85f9899-lkzwb/e04b3fba-1427-496a-b880-f61867a2c3ac]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: status update failed for pod /: Get 
"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f85f9899-lkzwb?timeout=1m0s": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:26:15 crc kubenswrapper[4973]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:26:15 crc kubenswrapper[4973]: > Mar 20 13:26:15 crc kubenswrapper[4973]: E0320 13:26:15.630909 4973 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 13:26:15 crc kubenswrapper[4973]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-6f85f9899-lkzwb_openshift-controller-manager_e04b3fba-1427-496a-b880-f61867a2c3ac_0(7788b797ca32b45ff4303038b049f6f56d2be03f317f01695b8f695b7779cfaf): error adding pod openshift-controller-manager_controller-manager-6f85f9899-lkzwb to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7788b797ca32b45ff4303038b049f6f56d2be03f317f01695b8f695b7779cfaf" Netns:"/var/run/netns/348cfc46-7f71-47c7-bb64-3c2ea6e75138" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-6f85f9899-lkzwb;K8S_POD_INFRA_CONTAINER_ID=7788b797ca32b45ff4303038b049f6f56d2be03f317f01695b8f695b7779cfaf;K8S_POD_UID=e04b3fba-1427-496a-b880-f61867a2c3ac" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-6f85f9899-lkzwb] networking: Multus: [openshift-controller-manager/controller-manager-6f85f9899-lkzwb/e04b3fba-1427-496a-b880-f61867a2c3ac]: error setting the networks status: 
SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f85f9899-lkzwb?timeout=1m0s": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:26:15 crc kubenswrapper[4973]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:26:15 crc kubenswrapper[4973]: > pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" Mar 20 13:26:15 crc kubenswrapper[4973]: E0320 13:26:15.630934 4973 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 20 13:26:15 crc kubenswrapper[4973]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-6f85f9899-lkzwb_openshift-controller-manager_e04b3fba-1427-496a-b880-f61867a2c3ac_0(7788b797ca32b45ff4303038b049f6f56d2be03f317f01695b8f695b7779cfaf): error adding pod openshift-controller-manager_controller-manager-6f85f9899-lkzwb to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"7788b797ca32b45ff4303038b049f6f56d2be03f317f01695b8f695b7779cfaf" Netns:"/var/run/netns/348cfc46-7f71-47c7-bb64-3c2ea6e75138" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-6f85f9899-lkzwb;K8S_POD_INFRA_CONTAINER_ID=7788b797ca32b45ff4303038b049f6f56d2be03f317f01695b8f695b7779cfaf;K8S_POD_UID=e04b3fba-1427-496a-b880-f61867a2c3ac" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-6f85f9899-lkzwb] networking: Multus: [openshift-controller-manager/controller-manager-6f85f9899-lkzwb/e04b3fba-1427-496a-b880-f61867a2c3ac]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f85f9899-lkzwb?timeout=1m0s": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:26:15 crc kubenswrapper[4973]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:26:15 crc kubenswrapper[4973]: > pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" Mar 20 13:26:15 crc kubenswrapper[4973]: E0320 13:26:15.631011 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"controller-manager-6f85f9899-lkzwb_openshift-controller-manager(e04b3fba-1427-496a-b880-f61867a2c3ac)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-6f85f9899-lkzwb_openshift-controller-manager(e04b3fba-1427-496a-b880-f61867a2c3ac)\\\": rpc error: code = Unknown desc = failed to create pod 
network sandbox k8s_controller-manager-6f85f9899-lkzwb_openshift-controller-manager_e04b3fba-1427-496a-b880-f61867a2c3ac_0(7788b797ca32b45ff4303038b049f6f56d2be03f317f01695b8f695b7779cfaf): error adding pod openshift-controller-manager_controller-manager-6f85f9899-lkzwb to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"7788b797ca32b45ff4303038b049f6f56d2be03f317f01695b8f695b7779cfaf\\\" Netns:\\\"/var/run/netns/348cfc46-7f71-47c7-bb64-3c2ea6e75138\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-6f85f9899-lkzwb;K8S_POD_INFRA_CONTAINER_ID=7788b797ca32b45ff4303038b049f6f56d2be03f317f01695b8f695b7779cfaf;K8S_POD_UID=e04b3fba-1427-496a-b880-f61867a2c3ac\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-6f85f9899-lkzwb] networking: Multus: [openshift-controller-manager/controller-manager-6f85f9899-lkzwb/e04b3fba-1427-496a-b880-f61867a2c3ac]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f85f9899-lkzwb?timeout=1m0s\\\": dial tcp 38.102.83.75:6443: connect: connection refused\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" podUID="e04b3fba-1427-496a-b880-f61867a2c3ac" Mar 20 13:26:15 crc kubenswrapper[4973]: E0320 13:26:15.694883 4973 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 13:26:15 crc kubenswrapper[4973]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-b56c97db-xrm2h_openshift-route-controller-manager_451656d5-3bd4-402b-98b9-202b3ac829e1_0(eadd1a5a42823f6470d1cb9a10a59f1d3ac21b6162eb12da7667dd78be796d56): error adding pod openshift-route-controller-manager_route-controller-manager-b56c97db-xrm2h to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"eadd1a5a42823f6470d1cb9a10a59f1d3ac21b6162eb12da7667dd78be796d56" Netns:"/var/run/netns/48fd0a70-ee38-477a-a8b8-69f20e6b9f98" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-b56c97db-xrm2h;K8S_POD_INFRA_CONTAINER_ID=eadd1a5a42823f6470d1cb9a10a59f1d3ac21b6162eb12da7667dd78be796d56;K8S_POD_UID=451656d5-3bd4-402b-98b9-202b3ac829e1" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h] networking: Multus: [openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h/451656d5-3bd4-402b-98b9-202b3ac829e1]: error setting the networks status: 
SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-b56c97db-xrm2h in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-b56c97db-xrm2h in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b56c97db-xrm2h?timeout=1m0s": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:26:15 crc kubenswrapper[4973]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:26:15 crc kubenswrapper[4973]: > Mar 20 13:26:15 crc kubenswrapper[4973]: E0320 13:26:15.694946 4973 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 13:26:15 crc kubenswrapper[4973]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-b56c97db-xrm2h_openshift-route-controller-manager_451656d5-3bd4-402b-98b9-202b3ac829e1_0(eadd1a5a42823f6470d1cb9a10a59f1d3ac21b6162eb12da7667dd78be796d56): error adding pod openshift-route-controller-manager_route-controller-manager-b56c97db-xrm2h to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"eadd1a5a42823f6470d1cb9a10a59f1d3ac21b6162eb12da7667dd78be796d56" Netns:"/var/run/netns/48fd0a70-ee38-477a-a8b8-69f20e6b9f98" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-b56c97db-xrm2h;K8S_POD_INFRA_CONTAINER_ID=eadd1a5a42823f6470d1cb9a10a59f1d3ac21b6162eb12da7667dd78be796d56;K8S_POD_UID=451656d5-3bd4-402b-98b9-202b3ac829e1" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h] networking: Multus: [openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h/451656d5-3bd4-402b-98b9-202b3ac829e1]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-b56c97db-xrm2h in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-b56c97db-xrm2h in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b56c97db-xrm2h?timeout=1m0s": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:26:15 crc kubenswrapper[4973]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:26:15 crc kubenswrapper[4973]: > pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" Mar 20 13:26:15 crc kubenswrapper[4973]: E0320 13:26:15.694979 4973 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 20 13:26:15 crc kubenswrapper[4973]: rpc error: code = Unknown desc = failed to create pod network sandbox 
k8s_route-controller-manager-b56c97db-xrm2h_openshift-route-controller-manager_451656d5-3bd4-402b-98b9-202b3ac829e1_0(eadd1a5a42823f6470d1cb9a10a59f1d3ac21b6162eb12da7667dd78be796d56): error adding pod openshift-route-controller-manager_route-controller-manager-b56c97db-xrm2h to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"eadd1a5a42823f6470d1cb9a10a59f1d3ac21b6162eb12da7667dd78be796d56" Netns:"/var/run/netns/48fd0a70-ee38-477a-a8b8-69f20e6b9f98" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-b56c97db-xrm2h;K8S_POD_INFRA_CONTAINER_ID=eadd1a5a42823f6470d1cb9a10a59f1d3ac21b6162eb12da7667dd78be796d56;K8S_POD_UID=451656d5-3bd4-402b-98b9-202b3ac829e1" Path:"" ERRORED: error configuring pod [openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h] networking: Multus: [openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h/451656d5-3bd4-402b-98b9-202b3ac829e1]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-b56c97db-xrm2h in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-b56c97db-xrm2h in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b56c97db-xrm2h?timeout=1m0s": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:26:15 crc kubenswrapper[4973]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:26:15 crc kubenswrapper[4973]: > pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" Mar 20 13:26:15 crc kubenswrapper[4973]: E0320 13:26:15.695032 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"route-controller-manager-b56c97db-xrm2h_openshift-route-controller-manager(451656d5-3bd4-402b-98b9-202b3ac829e1)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"route-controller-manager-b56c97db-xrm2h_openshift-route-controller-manager(451656d5-3bd4-402b-98b9-202b3ac829e1)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_route-controller-manager-b56c97db-xrm2h_openshift-route-controller-manager_451656d5-3bd4-402b-98b9-202b3ac829e1_0(eadd1a5a42823f6470d1cb9a10a59f1d3ac21b6162eb12da7667dd78be796d56): error adding pod openshift-route-controller-manager_route-controller-manager-b56c97db-xrm2h to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"eadd1a5a42823f6470d1cb9a10a59f1d3ac21b6162eb12da7667dd78be796d56\\\" Netns:\\\"/var/run/netns/48fd0a70-ee38-477a-a8b8-69f20e6b9f98\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-route-controller-manager;K8S_POD_NAME=route-controller-manager-b56c97db-xrm2h;K8S_POD_INFRA_CONTAINER_ID=eadd1a5a42823f6470d1cb9a10a59f1d3ac21b6162eb12da7667dd78be796d56;K8S_POD_UID=451656d5-3bd4-402b-98b9-202b3ac829e1\\\" Path:\\\"\\\" ERRORED: error configuring pod 
[openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h] networking: Multus: [openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h/451656d5-3bd4-402b-98b9-202b3ac829e1]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod route-controller-manager-b56c97db-xrm2h in out of cluster comm: SetNetworkStatus: failed to update the pod route-controller-manager-b56c97db-xrm2h in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-route-controller-manager/pods/route-controller-manager-b56c97db-xrm2h?timeout=1m0s\\\": dial tcp 38.102.83.75:6443: connect: connection refused\\n': StdinData: {\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" podUID="451656d5-3bd4-402b-98b9-202b3ac829e1" Mar 20 13:26:15 crc kubenswrapper[4973]: I0320 13:26:15.835375 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.027802 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.028608 4973 status_manager.go:851] "Failed to get status for pod" podUID="48634fee-db9f-4111-abac-74b05132eaa9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.029005 4973 status_manager.go:851] "Failed to get status for pod" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-qlztx\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.044728 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.045361 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.045857 4973 status_manager.go:851] "Failed to get status for pod" podUID="48634fee-db9f-4111-abac-74b05132eaa9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.046158 4973 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.046755 4973 status_manager.go:851] "Failed to get status for pod" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-qlztx\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.143096 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.143181 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 
13:26:16.143214 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.143223 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48634fee-db9f-4111-abac-74b05132eaa9-kube-api-access\") pod \"48634fee-db9f-4111-abac-74b05132eaa9\" (UID: \"48634fee-db9f-4111-abac-74b05132eaa9\") " Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.143284 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48634fee-db9f-4111-abac-74b05132eaa9-var-lock\") pod \"48634fee-db9f-4111-abac-74b05132eaa9\" (UID: \"48634fee-db9f-4111-abac-74b05132eaa9\") " Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.143309 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.143331 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48634fee-db9f-4111-abac-74b05132eaa9-kubelet-dir\") pod \"48634fee-db9f-4111-abac-74b05132eaa9\" (UID: \"48634fee-db9f-4111-abac-74b05132eaa9\") " Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.143321 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod 
"f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.143399 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48634fee-db9f-4111-abac-74b05132eaa9-var-lock" (OuterVolumeSpecName: "var-lock") pod "48634fee-db9f-4111-abac-74b05132eaa9" (UID: "48634fee-db9f-4111-abac-74b05132eaa9"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.143416 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.143501 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48634fee-db9f-4111-abac-74b05132eaa9-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "48634fee-db9f-4111-abac-74b05132eaa9" (UID: "48634fee-db9f-4111-abac-74b05132eaa9"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.143682 4973 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/48634fee-db9f-4111-abac-74b05132eaa9-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.143696 4973 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.143705 4973 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/48634fee-db9f-4111-abac-74b05132eaa9-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.143713 4973 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.143721 4973 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.148060 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48634fee-db9f-4111-abac-74b05132eaa9-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "48634fee-db9f-4111-abac-74b05132eaa9" (UID: "48634fee-db9f-4111-abac-74b05132eaa9"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.245761 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/48634fee-db9f-4111-abac-74b05132eaa9-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.851896 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.852807 4973 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d" exitCode=0 Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.852882 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.852914 4973 scope.go:117] "RemoveContainer" containerID="ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.854974 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"48634fee-db9f-4111-abac-74b05132eaa9","Type":"ContainerDied","Data":"f31f0d371f7831c07cef575f03946fb60b6adb68740c2029873ec3d640369f6a"} Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.855007 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f31f0d371f7831c07cef575f03946fb60b6adb68740c2029873ec3d640369f6a" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.855043 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.867844 4973 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.868212 4973 status_manager.go:851] "Failed to get status for pod" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-qlztx\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.868538 4973 status_manager.go:851] "Failed to get status for pod" podUID="48634fee-db9f-4111-abac-74b05132eaa9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.870383 4973 scope.go:117] "RemoveContainer" containerID="543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.873674 4973 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.874008 4973 status_manager.go:851] "Failed to get status for pod" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" 
pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-qlztx\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.874307 4973 status_manager.go:851] "Failed to get status for pod" podUID="48634fee-db9f-4111-abac-74b05132eaa9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.888771 4973 scope.go:117] "RemoveContainer" containerID="381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.904813 4973 scope.go:117] "RemoveContainer" containerID="7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.919054 4973 scope.go:117] "RemoveContainer" containerID="539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d" Mar 20 13:26:16 crc kubenswrapper[4973]: E0320 13:26:16.928700 4973 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="3.2s" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.934648 4973 scope.go:117] "RemoveContainer" containerID="fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.952578 4973 scope.go:117] "RemoveContainer" containerID="ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70" Mar 20 13:26:16 crc kubenswrapper[4973]: E0320 13:26:16.955031 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70\": container with ID starting with ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70 not found: ID does not exist" containerID="ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.955081 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70"} err="failed to get container status \"ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70\": rpc error: code = NotFound desc = could not find container \"ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70\": container with ID starting with ae0f13493cd2874f6e19f0ced0646d2902d6cc172ca6b1560a28949984bebc70 not found: ID does not exist" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.955104 4973 scope.go:117] "RemoveContainer" containerID="543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a" Mar 20 13:26:16 crc kubenswrapper[4973]: E0320 13:26:16.955425 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\": container with ID starting with 543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a not found: ID does not exist" containerID="543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.955484 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a"} err="failed to get container status \"543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\": rpc error: code = NotFound desc = could not find container 
\"543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a\": container with ID starting with 543cca8f03e35eb213a0c969a0df00e1ff45d6f28d9e7aa7cd22e1b3b540231a not found: ID does not exist" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.955520 4973 scope.go:117] "RemoveContainer" containerID="381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168" Mar 20 13:26:16 crc kubenswrapper[4973]: E0320 13:26:16.955820 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\": container with ID starting with 381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168 not found: ID does not exist" containerID="381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.955840 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168"} err="failed to get container status \"381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\": rpc error: code = NotFound desc = could not find container \"381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168\": container with ID starting with 381219d44482fe4484b4ddd405493ef644b2390053f4e257b04b7b2db684a168 not found: ID does not exist" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.955852 4973 scope.go:117] "RemoveContainer" containerID="7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b" Mar 20 13:26:16 crc kubenswrapper[4973]: E0320 13:26:16.956085 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\": container with ID starting with 7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b not found: ID does not exist" 
containerID="7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.956125 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b"} err="failed to get container status \"7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\": rpc error: code = NotFound desc = could not find container \"7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b\": container with ID starting with 7637d8b582008060944ed2a0693cd508c23a7ff2ee5863e44b77271027f0098b not found: ID does not exist" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.956140 4973 scope.go:117] "RemoveContainer" containerID="539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d" Mar 20 13:26:16 crc kubenswrapper[4973]: E0320 13:26:16.956412 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\": container with ID starting with 539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d not found: ID does not exist" containerID="539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.956455 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d"} err="failed to get container status \"539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\": rpc error: code = NotFound desc = could not find container \"539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d\": container with ID starting with 539aa65e288989b46ad2b0fdf0d77b6de8a97e0d97c4b6c7d87a04ea442b6b4d not found: ID does not exist" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.956477 4973 scope.go:117] 
"RemoveContainer" containerID="fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989" Mar 20 13:26:16 crc kubenswrapper[4973]: E0320 13:26:16.956725 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\": container with ID starting with fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989 not found: ID does not exist" containerID="fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989" Mar 20 13:26:16 crc kubenswrapper[4973]: I0320 13:26:16.956751 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989"} err="failed to get container status \"fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\": rpc error: code = NotFound desc = could not find container \"fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989\": container with ID starting with fe303d2f53978ed1f6cff9a3e12ae1ae4bcfba0c2a682cdf887b441d4842e989 not found: ID does not exist" Mar 20 13:26:17 crc kubenswrapper[4973]: E0320 13:26:17.033176 4973 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.75:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" volumeName="registry-storage" Mar 20 13:26:17 crc kubenswrapper[4973]: I0320 13:26:17.963625 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 13:26:19 crc kubenswrapper[4973]: I0320 13:26:19.956830 
4973 status_manager.go:851] "Failed to get status for pod" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-qlztx\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:19 crc kubenswrapper[4973]: I0320 13:26:19.957674 4973 status_manager.go:851] "Failed to get status for pod" podUID="48634fee-db9f-4111-abac-74b05132eaa9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:20 crc kubenswrapper[4973]: E0320 13:26:20.129410 4973 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="6.4s" Mar 20 13:26:20 crc kubenswrapper[4973]: E0320 13:26:20.790208 4973 event.go:368] "Unable to write event (may retry after sleeping)" err="Patch \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/events/machine-config-daemon-qlztx.189e8f641acc2b6a\": dial tcp 38.102.83.75:6443: connect: connection refused" event="&Event{ObjectMeta:{machine-config-daemon-qlztx.189e8f641acc2b6a openshift-machine-config-operator 26787 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:machine-config-daemon-qlztx,UID:70745a45-4eff-4e56-b9ab-efa4a7c83306,APIVersion:v1,ResourceVersion:26667,FieldPath:spec.containers{machine-config-daemon},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\" already 
present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 13:22:43 +0000 UTC,LastTimestamp:2026-03-20 13:26:14.085137871 +0000 UTC m=+294.828807615,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 13:26:24 crc kubenswrapper[4973]: E0320 13:26:24.994140 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:24Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:24Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:24Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T13:26:24Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4869c69128f74d9c3b178ea6c8c8d38df169b6bce05eb821a65f0aaf514c563a\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:f3c2ad90e251062165f8d6623ca4994c0b3e28324e4b5b17fd588b162ec97766\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1746912226},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"registry.r
edhat.io/redhat/certified-operator-index@sha256:1a4192be260d654b8925c9da948e9e46b5d16700f8c0110fbac99cde2728f126\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:a56fc50b2a9b02dc273af5c247943f865ae57ce5e3fb338e1f48ea5a3732cec5\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1252700376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:29f12af0e8b18530a7709224088624d4544d83dc113d7305c831632505312453\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:8b88035291b6bc6b899463fae9a0cd90aee8057575fb645dc719184dece70a0b\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1223676638},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-cli@sha256:69762925e16053d77685ff3a08b3b45dd2bfa5d68277851bc6969b368bbd0cb9\\\",\\\"registry.redhat.io/openshift4/ose-cli@sha256:ef83967297f619f45075e7fd1428a1eb981622a6c174c46fb53b158ed24bed85\\\",\\\"registry.redhat.io/openshift4/ose-cli:latest\\\"],\\\"sizeBytes\\\":584351326},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a20925
8d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a
8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Patch \"https://api-int.crc.testing:6443/api/v1/nodes/crc/status?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:24 crc kubenswrapper[4973]: E0320 13:26:24.995900 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:24 crc kubenswrapper[4973]: E0320 13:26:24.996300 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get 
\"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:24 crc kubenswrapper[4973]: E0320 13:26:24.996808 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:24 crc kubenswrapper[4973]: E0320 13:26:24.997231 4973 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": Get \"https://api-int.crc.testing:6443/api/v1/nodes/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:24 crc kubenswrapper[4973]: E0320 13:26:24.997303 4973 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 13:26:26 crc kubenswrapper[4973]: E0320 13:26:26.531105 4973 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.75:6443: connect: connection refused" interval="7s" Mar 20 13:26:26 crc kubenswrapper[4973]: I0320 13:26:26.930278 4973 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Readiness probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 13:26:26 crc kubenswrapper[4973]: I0320 13:26:26.930363 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 
20 13:26:26 crc kubenswrapper[4973]: I0320 13:26:26.930748 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 13:26:26 crc kubenswrapper[4973]: I0320 13:26:26.931413 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 13:26:26 crc kubenswrapper[4973]: I0320 13:26:26.931464 4973 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="787de60574f16b2026ae38a99d60b46b6329107ce49531d094fb400a8b010e67" exitCode=1 Mar 20 13:26:26 crc kubenswrapper[4973]: I0320 13:26:26.931499 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"787de60574f16b2026ae38a99d60b46b6329107ce49531d094fb400a8b010e67"} Mar 20 13:26:26 crc kubenswrapper[4973]: I0320 13:26:26.931986 4973 scope.go:117] "RemoveContainer" containerID="787de60574f16b2026ae38a99d60b46b6329107ce49531d094fb400a8b010e67" Mar 20 13:26:26 crc kubenswrapper[4973]: I0320 13:26:26.932302 4973 status_manager.go:851] "Failed to get status for pod" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-qlztx\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:26 crc kubenswrapper[4973]: I0320 13:26:26.932760 4973 status_manager.go:851] "Failed to get status for pod" podUID="48634fee-db9f-4111-abac-74b05132eaa9" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:26 crc kubenswrapper[4973]: I0320 13:26:26.933878 4973 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:27 crc kubenswrapper[4973]: I0320 13:26:27.939736 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 13:26:27 crc kubenswrapper[4973]: I0320 13:26:27.941246 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 13:26:27 crc kubenswrapper[4973]: I0320 13:26:27.941307 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"826d4b6392bd62656909815947ff1009f394e2d1882a602f9da8658e1b2650a6"} Mar 20 13:26:27 crc kubenswrapper[4973]: I0320 13:26:27.942157 4973 status_manager.go:851] "Failed to get status for pod" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-qlztx\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:27 crc kubenswrapper[4973]: I0320 13:26:27.942684 4973 status_manager.go:851] "Failed to get status for pod" 
podUID="48634fee-db9f-4111-abac-74b05132eaa9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:27 crc kubenswrapper[4973]: I0320 13:26:27.943092 4973 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:27 crc kubenswrapper[4973]: I0320 13:26:27.950543 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" Mar 20 13:26:27 crc kubenswrapper[4973]: I0320 13:26:27.950566 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:26:27 crc kubenswrapper[4973]: I0320 13:26:27.950968 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" Mar 20 13:26:27 crc kubenswrapper[4973]: I0320 13:26:27.951184 4973 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:27 crc kubenswrapper[4973]: I0320 13:26:27.951558 4973 status_manager.go:851] "Failed to get status for pod" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-qlztx\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:27 crc kubenswrapper[4973]: I0320 13:26:27.951846 4973 status_manager.go:851] "Failed to get status for pod" podUID="48634fee-db9f-4111-abac-74b05132eaa9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" Mar 20 13:26:27 crc kubenswrapper[4973]: I0320 13:26:27.971110 4973 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d18fe6e1-563f-476a-8193-275b6f92839b" Mar 20 13:26:27 crc kubenswrapper[4973]: I0320 13:26:27.971164 4973 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d18fe6e1-563f-476a-8193-275b6f92839b" Mar 20 13:26:27 crc kubenswrapper[4973]: E0320 13:26:27.971890 4973 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.75:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:26:27 crc kubenswrapper[4973]: I0320 13:26:27.972674 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:26:27 crc kubenswrapper[4973]: W0320 13:26:27.989622 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-b6612b1a5ea1bea660ab638e0be88c0ef9972fa7d68ee33d35ac9316d983625e WatchSource:0}: Error finding container b6612b1a5ea1bea660ab638e0be88c0ef9972fa7d68ee33d35ac9316d983625e: Status 404 returned error can't find the container with id b6612b1a5ea1bea660ab638e0be88c0ef9972fa7d68ee33d35ac9316d983625e Mar 20 13:26:28 crc kubenswrapper[4973]: E0320 13:26:28.567732 4973 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 13:26:28 crc kubenswrapper[4973]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-6f85f9899-lkzwb_openshift-controller-manager_e04b3fba-1427-496a-b880-f61867a2c3ac_0(b9ab234dfa31ab0cf3572573384d459815cc7c9dde88b34716602807e37a8fda): error adding pod openshift-controller-manager_controller-manager-6f85f9899-lkzwb to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b9ab234dfa31ab0cf3572573384d459815cc7c9dde88b34716602807e37a8fda" Netns:"/var/run/netns/f11e11a5-62e7-4ba0-9e8e-645c193961ac" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-6f85f9899-lkzwb;K8S_POD_INFRA_CONTAINER_ID=b9ab234dfa31ab0cf3572573384d459815cc7c9dde88b34716602807e37a8fda;K8S_POD_UID=e04b3fba-1427-496a-b880-f61867a2c3ac" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-6f85f9899-lkzwb] 
networking: Multus: [openshift-controller-manager/controller-manager-6f85f9899-lkzwb/e04b3fba-1427-496a-b880-f61867a2c3ac]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f85f9899-lkzwb?timeout=1m0s": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:26:28 crc kubenswrapper[4973]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:26:28 crc kubenswrapper[4973]: > Mar 20 13:26:28 crc kubenswrapper[4973]: E0320 13:26:28.567980 4973 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 13:26:28 crc kubenswrapper[4973]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-6f85f9899-lkzwb_openshift-controller-manager_e04b3fba-1427-496a-b880-f61867a2c3ac_0(b9ab234dfa31ab0cf3572573384d459815cc7c9dde88b34716602807e37a8fda): error adding pod openshift-controller-manager_controller-manager-6f85f9899-lkzwb to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b9ab234dfa31ab0cf3572573384d459815cc7c9dde88b34716602807e37a8fda" Netns:"/var/run/netns/f11e11a5-62e7-4ba0-9e8e-645c193961ac" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-6f85f9899-lkzwb;K8S_POD_INFRA_CONTAINER_ID=b9ab234dfa31ab0cf3572573384d459815cc7c9dde88b34716602807e37a8fda;K8S_POD_UID=e04b3fba-1427-496a-b880-f61867a2c3ac" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-6f85f9899-lkzwb] networking: Multus: [openshift-controller-manager/controller-manager-6f85f9899-lkzwb/e04b3fba-1427-496a-b880-f61867a2c3ac]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f85f9899-lkzwb?timeout=1m0s": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:26:28 crc kubenswrapper[4973]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:26:28 crc kubenswrapper[4973]: > pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" Mar 20 13:26:28 crc kubenswrapper[4973]: E0320 13:26:28.567998 4973 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 20 13:26:28 crc kubenswrapper[4973]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-6f85f9899-lkzwb_openshift-controller-manager_e04b3fba-1427-496a-b880-f61867a2c3ac_0(b9ab234dfa31ab0cf3572573384d459815cc7c9dde88b34716602807e37a8fda): error adding pod 
openshift-controller-manager_controller-manager-6f85f9899-lkzwb to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"b9ab234dfa31ab0cf3572573384d459815cc7c9dde88b34716602807e37a8fda" Netns:"/var/run/netns/f11e11a5-62e7-4ba0-9e8e-645c193961ac" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-6f85f9899-lkzwb;K8S_POD_INFRA_CONTAINER_ID=b9ab234dfa31ab0cf3572573384d459815cc7c9dde88b34716602807e37a8fda;K8S_POD_UID=e04b3fba-1427-496a-b880-f61867a2c3ac" Path:"" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-6f85f9899-lkzwb] networking: Multus: [openshift-controller-manager/controller-manager-6f85f9899-lkzwb/e04b3fba-1427-496a-b880-f61867a2c3ac]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: status update failed for pod /: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f85f9899-lkzwb?timeout=1m0s": dial tcp 38.102.83.75:6443: connect: connection refused Mar 20 13:26:28 crc kubenswrapper[4973]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:26:28 crc kubenswrapper[4973]: > pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" Mar 20 13:26:28 crc kubenswrapper[4973]: E0320 13:26:28.568058 4973 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"CreatePodSandbox\" for \"controller-manager-6f85f9899-lkzwb_openshift-controller-manager(e04b3fba-1427-496a-b880-f61867a2c3ac)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"controller-manager-6f85f9899-lkzwb_openshift-controller-manager(e04b3fba-1427-496a-b880-f61867a2c3ac)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_controller-manager-6f85f9899-lkzwb_openshift-controller-manager_e04b3fba-1427-496a-b880-f61867a2c3ac_0(b9ab234dfa31ab0cf3572573384d459815cc7c9dde88b34716602807e37a8fda): error adding pod openshift-controller-manager_controller-manager-6f85f9899-lkzwb to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"b9ab234dfa31ab0cf3572573384d459815cc7c9dde88b34716602807e37a8fda\\\" Netns:\\\"/var/run/netns/f11e11a5-62e7-4ba0-9e8e-645c193961ac\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-controller-manager;K8S_POD_NAME=controller-manager-6f85f9899-lkzwb;K8S_POD_INFRA_CONTAINER_ID=b9ab234dfa31ab0cf3572573384d459815cc7c9dde88b34716602807e37a8fda;K8S_POD_UID=e04b3fba-1427-496a-b880-f61867a2c3ac\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-controller-manager/controller-manager-6f85f9899-lkzwb] networking: Multus: [openshift-controller-manager/controller-manager-6f85f9899-lkzwb/e04b3fba-1427-496a-b880-f61867a2c3ac]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: SetNetworkStatus: failed to update the pod controller-manager-6f85f9899-lkzwb in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager/pods/controller-manager-6f85f9899-lkzwb?timeout=1m0s\\\": dial tcp 38.102.83.75:6443: connect: connection refused\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" podUID="e04b3fba-1427-496a-b880-f61867a2c3ac"
Mar 20 13:26:28 crc kubenswrapper[4973]: I0320 13:26:28.949604 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h"
Mar 20 13:26:28 crc kubenswrapper[4973]: I0320 13:26:28.950491 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h"
Mar 20 13:26:28 crc kubenswrapper[4973]: I0320 13:26:28.951831 4973 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="acf802b76de7c17748e1b47c3a6f17967419d6b777aec7b3e32dc98fe0d447b7" exitCode=0
Mar 20 13:26:28 crc kubenswrapper[4973]: I0320 13:26:28.951904 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"acf802b76de7c17748e1b47c3a6f17967419d6b777aec7b3e32dc98fe0d447b7"}
Mar 20 13:26:28 crc kubenswrapper[4973]: I0320 13:26:28.951947 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b6612b1a5ea1bea660ab638e0be88c0ef9972fa7d68ee33d35ac9316d983625e"}
Mar 20 13:26:28 crc kubenswrapper[4973]: I0320 13:26:28.952410 4973 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d18fe6e1-563f-476a-8193-275b6f92839b"
Mar 20 13:26:28 crc kubenswrapper[4973]: I0320 13:26:28.952443 4973 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d18fe6e1-563f-476a-8193-275b6f92839b"
Mar 20 13:26:28 crc kubenswrapper[4973]: I0320 13:26:28.953143 4973 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.75:6443: connect: connection refused"
Mar 20 13:26:28 crc kubenswrapper[4973]: E0320 13:26:28.953159 4973 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.75:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:26:28 crc kubenswrapper[4973]: I0320 13:26:28.953643 4973 status_manager.go:851] "Failed to get status for pod" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-qlztx\": dial tcp 38.102.83.75:6443: connect: connection refused"
Mar 20 13:26:28 crc kubenswrapper[4973]: I0320 13:26:28.954043 4973 status_manager.go:851] "Failed to get status for pod" podUID="48634fee-db9f-4111-abac-74b05132eaa9" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.75:6443: connect: connection refused"
Mar 20 13:26:29 crc kubenswrapper[4973]: I0320 13:26:29.918770 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:26:29 crc kubenswrapper[4973]: I0320 13:26:29.929064 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:26:29 crc kubenswrapper[4973]: I0320 13:26:29.965069 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2d7afae06f5b649c55eb9cae9e69ad7f509603ef3f5eefe002387a7cc31c93a0"}
Mar 20 13:26:29 crc kubenswrapper[4973]: I0320 13:26:29.965118 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a8470da8205bf37c182994c30e0136e0f7fdef7664e74105408fbe54a446ce83"}
Mar 20 13:26:29 crc kubenswrapper[4973]: I0320 13:26:29.965144 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"bd5df7570f488898067465e278ec8b517002c4138048f70cbc4254749c344514"}
Mar 20 13:26:29 crc kubenswrapper[4973]: I0320 13:26:29.965156 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d01a7341d02d0c365f12f5750213961244c72c49bd121b7b86354d551b1be754"}
Mar 20 13:26:29 crc kubenswrapper[4973]: I0320 13:26:29.965256 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:26:30 crc kubenswrapper[4973]: I0320 13:26:30.973877 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7df0b979c968f549ef88fb713b3aa8037115f370b39a8521bfec817317b5f2cf"}
Mar 20 13:26:30 crc kubenswrapper[4973]: I0320 13:26:30.974312 4973 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d18fe6e1-563f-476a-8193-275b6f92839b"
Mar 20 13:26:30 crc kubenswrapper[4973]: I0320 13:26:30.974327 4973 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d18fe6e1-563f-476a-8193-275b6f92839b"
Mar 20 13:26:32 crc kubenswrapper[4973]: I0320 13:26:32.973472 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:26:32 crc kubenswrapper[4973]: I0320 13:26:32.974218 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:26:32 crc kubenswrapper[4973]: I0320 13:26:32.982233 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:26:35 crc kubenswrapper[4973]: W0320 13:26:35.528636 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod451656d5_3bd4_402b_98b9_202b3ac829e1.slice/crio-f6c1cac402e01fac9c170a7a6852db89f769e95a056dcc06adefb6c68269b74a WatchSource:0}: Error finding container f6c1cac402e01fac9c170a7a6852db89f769e95a056dcc06adefb6c68269b74a: Status 404 returned error can't find the container with id f6c1cac402e01fac9c170a7a6852db89f769e95a056dcc06adefb6c68269b74a
Mar 20 13:26:35 crc kubenswrapper[4973]: I0320 13:26:35.983157 4973 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:26:36 crc kubenswrapper[4973]: I0320 13:26:36.013491 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" event={"ID":"451656d5-3bd4-402b-98b9-202b3ac829e1","Type":"ContainerStarted","Data":"7b4f93c40e93978ae627975c8200800131a30150e565070b84ea9dcdc3d8c7ba"}
Mar 20 13:26:36 crc kubenswrapper[4973]: I0320 13:26:36.013564 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" event={"ID":"451656d5-3bd4-402b-98b9-202b3ac829e1","Type":"ContainerStarted","Data":"f6c1cac402e01fac9c170a7a6852db89f769e95a056dcc06adefb6c68269b74a"}
Mar 20 13:26:36 crc kubenswrapper[4973]: I0320 13:26:36.013719 4973 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d18fe6e1-563f-476a-8193-275b6f92839b"
Mar 20 13:26:36 crc kubenswrapper[4973]: I0320 13:26:36.013735 4973 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d18fe6e1-563f-476a-8193-275b6f92839b"
Mar 20 13:26:36 crc kubenswrapper[4973]: I0320 13:26:36.013908 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:26:36 crc kubenswrapper[4973]: I0320 13:26:36.019557 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 13:26:36 crc kubenswrapper[4973]: I0320 13:26:36.032195 4973 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ac703675-5718-414e-b565-3e2553804084"
Mar 20 13:26:37 crc kubenswrapper[4973]: I0320 13:26:37.019643 4973 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d18fe6e1-563f-476a-8193-275b6f92839b"
Mar 20 13:26:37 crc kubenswrapper[4973]: I0320 13:26:37.019697 4973 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d18fe6e1-563f-476a-8193-275b6f92839b"
Mar 20 13:26:38 crc kubenswrapper[4973]: I0320 13:26:38.024946 4973 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d18fe6e1-563f-476a-8193-275b6f92839b"
Mar 20 13:26:38 crc kubenswrapper[4973]: I0320 13:26:38.025285 4973 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="d18fe6e1-563f-476a-8193-275b6f92839b"
Mar 20 13:26:39 crc kubenswrapper[4973]: I0320 13:26:39.950687 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:39 crc kubenswrapper[4973]: I0320 13:26:39.959418 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:39 crc kubenswrapper[4973]: I0320 13:26:39.966898 4973 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="ac703675-5718-414e-b565-3e2553804084"
Mar 20 13:26:40 crc kubenswrapper[4973]: W0320 13:26:40.353434 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode04b3fba_1427_496a_b880_f61867a2c3ac.slice/crio-fc36c236a00037c840df3115ca70bcdb9af9a748e140f3465a6508823107c950 WatchSource:0}: Error finding container fc36c236a00037c840df3115ca70bcdb9af9a748e140f3465a6508823107c950: Status 404 returned error can't find the container with id fc36c236a00037c840df3115ca70bcdb9af9a748e140f3465a6508823107c950
Mar 20 13:26:41 crc kubenswrapper[4973]: I0320 13:26:41.041547 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" event={"ID":"e04b3fba-1427-496a-b880-f61867a2c3ac","Type":"ContainerStarted","Data":"a2af3cfb413dc551a04ce48e0d255e82580989074871c84af53032536a4b6eaf"}
Mar 20 13:26:41 crc kubenswrapper[4973]: I0320 13:26:41.041592 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" event={"ID":"e04b3fba-1427-496a-b880-f61867a2c3ac","Type":"ContainerStarted","Data":"fc36c236a00037c840df3115ca70bcdb9af9a748e140f3465a6508823107c950"}
Mar 20 13:26:41 crc kubenswrapper[4973]: I0320 13:26:41.042630 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:41 crc kubenswrapper[4973]: I0320 13:26:41.048832 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb"
Mar 20 13:26:43 crc kubenswrapper[4973]: I0320 13:26:43.873750 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h"
Mar 20 13:26:44 crc kubenswrapper[4973]: I0320 13:26:44.874111 4973 patch_prober.go:28] interesting pod/route-controller-manager-b56c97db-xrm2h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 13:26:44 crc kubenswrapper[4973]: I0320 13:26:44.874181 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" podUID="451656d5-3bd4-402b-98b9-202b3ac829e1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 13:26:45 crc kubenswrapper[4973]: I0320 13:26:45.294593 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 20 13:26:45 crc kubenswrapper[4973]: I0320 13:26:45.476864 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 20 13:26:45 crc kubenswrapper[4973]: I0320 13:26:45.487082 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 20 13:26:45 crc kubenswrapper[4973]: I0320 13:26:45.718334 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 20 13:26:45 crc kubenswrapper[4973]: I0320 13:26:45.875274 4973 patch_prober.go:28] interesting pod/route-controller-manager-b56c97db-xrm2h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 13:26:45 crc kubenswrapper[4973]: I0320 13:26:45.875332 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" podUID="451656d5-3bd4-402b-98b9-202b3ac829e1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 20 13:26:46 crc kubenswrapper[4973]: I0320 13:26:46.050829 4973 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 20 13:26:46 crc kubenswrapper[4973]: I0320 13:26:46.272919 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 13:26:46 crc kubenswrapper[4973]: I0320 13:26:46.290029 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 20 13:26:46 crc kubenswrapper[4973]: I0320 13:26:46.382435 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 20 13:26:46 crc kubenswrapper[4973]: I0320 13:26:46.544072 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 20 13:26:46 crc kubenswrapper[4973]: I0320 13:26:46.579497 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7"
Mar 20 13:26:46 crc kubenswrapper[4973]: I0320 13:26:46.936592 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 13:26:47 crc kubenswrapper[4973]: I0320 13:26:47.289715 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 20 13:26:47 crc kubenswrapper[4973]: I0320 13:26:47.388940 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 20 13:26:47 crc kubenswrapper[4973]: I0320 13:26:47.442638 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 20 13:26:47 crc kubenswrapper[4973]: I0320 13:26:47.456770 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 20 13:26:47 crc kubenswrapper[4973]: I0320 13:26:47.581433 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 20 13:26:47 crc kubenswrapper[4973]: I0320 13:26:47.648033 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 20 13:26:47 crc kubenswrapper[4973]: I0320 13:26:47.881639 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 20 13:26:48 crc kubenswrapper[4973]: I0320 13:26:48.237153 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 20 13:26:48 crc kubenswrapper[4973]: I0320 13:26:48.294386 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 20 13:26:48 crc kubenswrapper[4973]: I0320 13:26:48.504777 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 20 13:26:48 crc kubenswrapper[4973]: I0320 13:26:48.607076 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 20 13:26:48 crc kubenswrapper[4973]: I0320 13:26:48.690174 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 20 13:26:48 crc kubenswrapper[4973]: I0320 13:26:48.737726 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 20 13:26:48 crc kubenswrapper[4973]: I0320 13:26:48.758192 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 20 13:26:48 crc kubenswrapper[4973]: I0320 13:26:48.789947 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 20 13:26:48 crc kubenswrapper[4973]: I0320 13:26:48.794293 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 20 13:26:48 crc kubenswrapper[4973]: I0320 13:26:48.828118 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 20 13:26:48 crc kubenswrapper[4973]: I0320 13:26:48.982443 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 20 13:26:49 crc kubenswrapper[4973]: I0320 13:26:49.012929 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 20 13:26:49 crc kubenswrapper[4973]: I0320 13:26:49.182132 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 20 13:26:49 crc kubenswrapper[4973]: I0320 13:26:49.198470 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 20 13:26:49 crc kubenswrapper[4973]: I0320 13:26:49.341949 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 20 13:26:49 crc kubenswrapper[4973]: I0320 13:26:49.369322 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 20 13:26:49 crc kubenswrapper[4973]: I0320 13:26:49.438078 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data"
Mar 20 13:26:49 crc kubenswrapper[4973]: I0320 13:26:49.535556 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 20 13:26:49 crc kubenswrapper[4973]: I0320 13:26:49.575868 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 20 13:26:49 crc kubenswrapper[4973]: I0320 13:26:49.594000 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 20 13:26:49 crc kubenswrapper[4973]: I0320 13:26:49.630449 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 13:26:49 crc kubenswrapper[4973]: I0320 13:26:49.656375 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 20 13:26:49 crc kubenswrapper[4973]: I0320 13:26:49.668189 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 20 13:26:49 crc kubenswrapper[4973]: I0320 13:26:49.689240 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 20 13:26:49 crc kubenswrapper[4973]: I0320 13:26:49.825643 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 20 13:26:49 crc kubenswrapper[4973]: I0320 13:26:49.862936 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 20 13:26:49 crc kubenswrapper[4973]: I0320 13:26:49.980710 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 20 13:26:50 crc kubenswrapper[4973]: I0320 13:26:50.009695 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 20 13:26:50 crc kubenswrapper[4973]: I0320 13:26:50.023079 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 20 13:26:50 crc kubenswrapper[4973]: I0320 13:26:50.088166 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 20 13:26:50 crc kubenswrapper[4973]: I0320 13:26:50.126600 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 20 13:26:50 crc kubenswrapper[4973]: I0320 13:26:50.153993 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 20 13:26:50 crc kubenswrapper[4973]: I0320 13:26:50.155163 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 20 13:26:50 crc kubenswrapper[4973]: I0320 13:26:50.181015 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 20 13:26:50 crc kubenswrapper[4973]: I0320 13:26:50.181706 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 20 13:26:50 crc kubenswrapper[4973]: I0320 13:26:50.371687 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd"
Mar 20 13:26:50 crc kubenswrapper[4973]: I0320 13:26:50.463925 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 20 13:26:50 crc kubenswrapper[4973]: I0320 13:26:50.592759 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 13:26:50 crc kubenswrapper[4973]: I0320 13:26:50.641651 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 20 13:26:50 crc kubenswrapper[4973]: I0320 13:26:50.721751 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 20 13:26:50 crc kubenswrapper[4973]: I0320 13:26:50.872963 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 20 13:26:50 crc kubenswrapper[4973]: I0320 13:26:50.898192 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 20 13:26:50 crc kubenswrapper[4973]: I0320 13:26:50.975385 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 20 13:26:51 crc kubenswrapper[4973]: I0320 13:26:51.131118 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 20 13:26:51 crc kubenswrapper[4973]: I0320 13:26:51.175470 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 20 13:26:51 crc kubenswrapper[4973]: I0320 13:26:51.176306 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 20 13:26:51 crc kubenswrapper[4973]: I0320 13:26:51.207023 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 20 13:26:51 crc kubenswrapper[4973]: I0320 13:26:51.298919 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 20 13:26:51 crc kubenswrapper[4973]: I0320 13:26:51.304592 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 20 13:26:51 crc kubenswrapper[4973]: I0320 13:26:51.377108 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 20 13:26:51 crc kubenswrapper[4973]: I0320 13:26:51.414441 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 20 13:26:51 crc kubenswrapper[4973]: I0320 13:26:51.516749 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 20 13:26:51 crc kubenswrapper[4973]: I0320 13:26:51.583035 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 20 13:26:51 crc kubenswrapper[4973]: I0320 13:26:51.747329 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 20 13:26:51 crc kubenswrapper[4973]: I0320 13:26:51.893478 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 20 13:26:51 crc kubenswrapper[4973]: I0320 13:26:51.949894 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 20 13:26:51 crc kubenswrapper[4973]: I0320 13:26:51.978273 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 20 13:26:52 crc kubenswrapper[4973]: I0320 13:26:52.015307 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 20 13:26:52 crc kubenswrapper[4973]: I0320 13:26:52.046370 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 20 13:26:52 crc kubenswrapper[4973]: I0320 13:26:52.204626 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 20 13:26:52 crc kubenswrapper[4973]: I0320 13:26:52.284596 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq"
Mar 20 13:26:52 crc kubenswrapper[4973]: I0320 13:26:52.369578 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 20 13:26:52 crc kubenswrapper[4973]: I0320 13:26:52.382681 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 20 13:26:52 crc kubenswrapper[4973]: I0320 13:26:52.502408 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 20 13:26:52 crc kubenswrapper[4973]: I0320 13:26:52.506677 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 20 13:26:52 crc kubenswrapper[4973]: I0320 13:26:52.533816 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 20 13:26:52 crc kubenswrapper[4973]: I0320 13:26:52.555747 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 20 13:26:52 crc kubenswrapper[4973]: I0320 13:26:52.690785 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 20 13:26:52 crc kubenswrapper[4973]: I0320 13:26:52.715576 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 20 13:26:52 crc kubenswrapper[4973]: I0320 13:26:52.739790 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 13:26:52 crc kubenswrapper[4973]: I0320 13:26:52.763467 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 20 13:26:52 crc kubenswrapper[4973]: I0320 13:26:52.822953 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 20 13:26:52 crc kubenswrapper[4973]: I0320 13:26:52.912596 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 20 13:26:52 crc kubenswrapper[4973]: I0320 13:26:52.937230 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 20 13:26:52 crc kubenswrapper[4973]: I0320 13:26:52.947536 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.032533 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.059464 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.100311 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.104653 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.115270 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.129383 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.190433 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.210601 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.293446 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.348171 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.392891 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.442090 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.442090 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.442235 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.488525 4973 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.495523 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.522550 4973 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.637241 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.652145 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.692416 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.782758 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.795981 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.934410 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 20 13:26:53 crc kubenswrapper[4973]: I0320 13:26:53.991719 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.043502 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.144944 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.150763 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.154412 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.201820 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.243304 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.295409 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.354605 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.536325 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.560483 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.585504 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.609442 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.620809 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.631267 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.682579 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.794047 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff"
Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.815842 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6"
Mar 
20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.825797 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.854743 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.874100 4973 patch_prober.go:28] interesting pod/route-controller-manager-b56c97db-xrm2h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.874180 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" podUID="451656d5-3bd4-402b-98b9-202b3ac829e1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.880002 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 13:26:54 crc kubenswrapper[4973]: I0320 13:26:54.883714 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.006263 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.061464 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 13:26:55 crc 
kubenswrapper[4973]: I0320 13:26:55.204585 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.235544 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.268520 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.279196 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.326589 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.332920 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.436668 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.493067 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.495302 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.527874 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.602738 4973 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.602960 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.614030 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.707235 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.735076 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.745070 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.790201 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.812047 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.813983 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.832048 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.842790 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.868753 4973 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.915161 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.966447 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 13:26:55 crc kubenswrapper[4973]: I0320 13:26:55.990208 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 13:26:56 crc kubenswrapper[4973]: I0320 13:26:56.044417 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 13:26:56 crc kubenswrapper[4973]: I0320 13:26:56.187147 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 13:26:56 crc kubenswrapper[4973]: I0320 13:26:56.280112 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 13:26:56 crc kubenswrapper[4973]: I0320 13:26:56.333265 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 13:26:56 crc kubenswrapper[4973]: I0320 13:26:56.336467 4973 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 13:26:56 crc kubenswrapper[4973]: I0320 13:26:56.388842 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 13:26:56 crc kubenswrapper[4973]: I0320 13:26:56.395219 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 13:26:56 crc kubenswrapper[4973]: I0320 13:26:56.407329 4973 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 13:26:56 crc kubenswrapper[4973]: I0320 13:26:56.507504 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 13:26:56 crc kubenswrapper[4973]: I0320 13:26:56.622737 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 13:26:56 crc kubenswrapper[4973]: I0320 13:26:56.677810 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 13:26:56 crc kubenswrapper[4973]: I0320 13:26:56.792705 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 13:26:56 crc kubenswrapper[4973]: I0320 13:26:56.805718 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 13:26:56 crc kubenswrapper[4973]: I0320 13:26:56.867186 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 13:26:56 crc kubenswrapper[4973]: I0320 13:26:56.882213 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 13:26:56 crc kubenswrapper[4973]: I0320 13:26:56.892771 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 13:26:56 crc kubenswrapper[4973]: I0320 13:26:56.903256 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 13:26:56 crc kubenswrapper[4973]: I0320 13:26:56.905761 4973 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 13:26:56 crc kubenswrapper[4973]: I0320 13:26:56.996549 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 13:26:57 crc kubenswrapper[4973]: I0320 13:26:57.012608 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 13:26:57 crc kubenswrapper[4973]: I0320 13:26:57.054656 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.234281 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.243203 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.243664 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.244056 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.244251 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.244286 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.247519 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 
13:26:58.247763 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.247969 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.248147 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.248830 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.249367 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.249693 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.249717 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.249721 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.249745 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.249991 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.261621 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.261962 4973 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.266125 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.266214 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.266125 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.266410 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.266538 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.280900 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.293764 4973 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.296063 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" podStartSLOduration=46.296042474 podStartE2EDuration="46.296042474s" podCreationTimestamp="2026-03-20 13:26:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:36.030186996 +0000 UTC m=+316.773856740" watchObservedRunningTime="2026-03-20 
13:26:58.296042474 +0000 UTC m=+339.039712218" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.298285 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" podStartSLOduration=47.298267145 podStartE2EDuration="47.298267145s" podCreationTimestamp="2026-03-20 13:26:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:41.056318166 +0000 UTC m=+321.799987910" watchObservedRunningTime="2026-03-20 13:26:58.298267145 +0000 UTC m=+339.041936889" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.300435 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.300484 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.300509 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h","openshift-controller-manager/controller-manager-6f85f9899-lkzwb"] Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.312581 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.339750 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=23.339698421 podStartE2EDuration="23.339698421s" podCreationTimestamp="2026-03-20 13:26:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:26:58.332252526 +0000 UTC m=+339.075922270" watchObservedRunningTime="2026-03-20 13:26:58.339698421 +0000 UTC 
m=+339.083368165" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.505028 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.582966 4973 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.634787 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.636212 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.770112 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.871521 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.911690 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.958225 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.964133 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 13:26:58 crc kubenswrapper[4973]: I0320 13:26:58.997450 4973 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 13:26:59 crc kubenswrapper[4973]: I0320 13:26:59.009546 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 13:26:59 crc kubenswrapper[4973]: I0320 13:26:59.034629 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 13:26:59 crc kubenswrapper[4973]: I0320 13:26:59.081191 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" Mar 20 13:26:59 crc kubenswrapper[4973]: I0320 13:26:59.221605 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 13:26:59 crc kubenswrapper[4973]: I0320 13:26:59.410252 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 13:26:59 crc kubenswrapper[4973]: I0320 13:26:59.452089 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 13:26:59 crc kubenswrapper[4973]: I0320 13:26:59.561461 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 13:26:59 crc kubenswrapper[4973]: I0320 13:26:59.604247 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 13:26:59 crc kubenswrapper[4973]: I0320 13:26:59.786203 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 13:26:59 crc kubenswrapper[4973]: I0320 13:26:59.803251 4973 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 13:26:59 crc kubenswrapper[4973]: I0320 13:26:59.836035 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 13:26:59 crc kubenswrapper[4973]: I0320 13:26:59.838327 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 13:26:59 crc kubenswrapper[4973]: I0320 13:26:59.966739 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 13:27:00 crc kubenswrapper[4973]: I0320 13:27:00.005922 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 13:27:00 crc kubenswrapper[4973]: I0320 13:27:00.033916 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 13:27:00 crc kubenswrapper[4973]: I0320 13:27:00.056796 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 13:27:00 crc kubenswrapper[4973]: I0320 13:27:00.103615 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 13:27:00 crc kubenswrapper[4973]: I0320 13:27:00.168627 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 13:27:00 crc kubenswrapper[4973]: I0320 13:27:00.267032 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 13:27:00 crc kubenswrapper[4973]: I0320 13:27:00.308795 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" 
Mar 20 13:27:00 crc kubenswrapper[4973]: I0320 13:27:00.314208 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 13:27:00 crc kubenswrapper[4973]: I0320 13:27:00.598540 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 13:27:00 crc kubenswrapper[4973]: I0320 13:27:00.950242 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 13:27:00 crc kubenswrapper[4973]: I0320 13:27:00.975408 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 13:27:01 crc kubenswrapper[4973]: I0320 13:27:01.234128 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 13:27:01 crc kubenswrapper[4973]: I0320 13:27:01.277321 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 13:27:01 crc kubenswrapper[4973]: I0320 13:27:01.546524 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 13:27:01 crc kubenswrapper[4973]: I0320 13:27:01.796404 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 13:27:01 crc kubenswrapper[4973]: I0320 13:27:01.827862 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 13:27:02 crc kubenswrapper[4973]: I0320 13:27:02.370110 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 13:27:03 crc kubenswrapper[4973]: I0320 13:27:03.944982 4973 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"node-ca-dockercfg-4777p"
Mar 20 13:27:09 crc kubenswrapper[4973]: I0320 13:27:09.619499 4973 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Mar 20 13:27:09 crc kubenswrapper[4973]: I0320 13:27:09.620085 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://4879829748d04247919d4584e9d504332cd27c4463067eb66efb07235043065c" gracePeriod=5
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.194621 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.195569 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.331194 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.331266 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.331282 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.331303 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.331393 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.331413 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.331488 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.331558 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.331819 4973 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.331840 4973 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.331853 4973 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.331904 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.340124 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.361731 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.362080 4973 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="4879829748d04247919d4584e9d504332cd27c4463067eb66efb07235043065c" exitCode=137
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.362182 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.362156 4973 scope.go:117] "RemoveContainer" containerID="4879829748d04247919d4584e9d504332cd27c4463067eb66efb07235043065c"
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.383135 4973 scope.go:117] "RemoveContainer" containerID="4879829748d04247919d4584e9d504332cd27c4463067eb66efb07235043065c"
Mar 20 13:27:15 crc kubenswrapper[4973]: E0320 13:27:15.383660 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4879829748d04247919d4584e9d504332cd27c4463067eb66efb07235043065c\": container with ID starting with 4879829748d04247919d4584e9d504332cd27c4463067eb66efb07235043065c not found: ID does not exist" containerID="4879829748d04247919d4584e9d504332cd27c4463067eb66efb07235043065c"
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.383708 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4879829748d04247919d4584e9d504332cd27c4463067eb66efb07235043065c"} err="failed to get container status \"4879829748d04247919d4584e9d504332cd27c4463067eb66efb07235043065c\": rpc error: code = NotFound desc = could not find container \"4879829748d04247919d4584e9d504332cd27c4463067eb66efb07235043065c\": container with ID starting with 4879829748d04247919d4584e9d504332cd27c4463067eb66efb07235043065c not found: ID does not exist"
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.432515 4973 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.432546 4973 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Mar 20 13:27:15 crc kubenswrapper[4973]: I0320 13:27:15.957696 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Mar 20 13:27:19 crc kubenswrapper[4973]: I0320 13:27:19.446533 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 13:27:23 crc kubenswrapper[4973]: I0320 13:27:23.404198 4973 generic.go:334] "Generic (PLEG): container finished" podID="b0cbdcce-514f-4b72-8c8c-17029b7217a8" containerID="30bd3c695e315318c275f785e37f16f20888a754072f832bf4604eabf6fbe2cf" exitCode=0
Mar 20 13:27:23 crc kubenswrapper[4973]: I0320 13:27:23.404294 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" event={"ID":"b0cbdcce-514f-4b72-8c8c-17029b7217a8","Type":"ContainerDied","Data":"30bd3c695e315318c275f785e37f16f20888a754072f832bf4604eabf6fbe2cf"}
Mar 20 13:27:23 crc kubenswrapper[4973]: I0320 13:27:23.405311 4973 scope.go:117] "RemoveContainer" containerID="30bd3c695e315318c275f785e37f16f20888a754072f832bf4604eabf6fbe2cf"
Mar 20 13:27:24 crc kubenswrapper[4973]: I0320 13:27:24.410655 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" event={"ID":"b0cbdcce-514f-4b72-8c8c-17029b7217a8","Type":"ContainerStarted","Data":"be1af85491e1da9d562848f760f2f64ec735570af9790acaa9d5ea8b6763d507"}
Mar 20 13:27:24 crc kubenswrapper[4973]: I0320 13:27:24.411864 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-l7crv"
Mar 20 13:27:24 crc kubenswrapper[4973]: I0320 13:27:24.414024 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-l7crv"
Mar 20 13:27:53 crc kubenswrapper[4973]: I0320 13:27:53.046237 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tl6hw"]
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.049027 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dlqh5"]
Mar 20 13:27:58 crc kubenswrapper[4973]: E0320 13:27:58.049642 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48634fee-db9f-4111-abac-74b05132eaa9" containerName="installer"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.049662 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="48634fee-db9f-4111-abac-74b05132eaa9" containerName="installer"
Mar 20 13:27:58 crc kubenswrapper[4973]: E0320 13:27:58.049684 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.049697 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.049843 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="48634fee-db9f-4111-abac-74b05132eaa9" containerName="installer"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.049894 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.050486 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.069219 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dlqh5"]
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.221062 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/71b8a459-9616-4919-a93a-dbc343c5e3b7-registry-certificates\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.221372 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.221405 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/71b8a459-9616-4919-a93a-dbc343c5e3b7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.221425 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/71b8a459-9616-4919-a93a-dbc343c5e3b7-trusted-ca\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.221441 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/71b8a459-9616-4919-a93a-dbc343c5e3b7-registry-tls\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.221471 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42q4b\" (UniqueName: \"kubernetes.io/projected/71b8a459-9616-4919-a93a-dbc343c5e3b7-kube-api-access-42q4b\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.221684 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/71b8a459-9616-4919-a93a-dbc343c5e3b7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.221744 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/71b8a459-9616-4919-a93a-dbc343c5e3b7-bound-sa-token\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.249876 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.323482 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/71b8a459-9616-4919-a93a-dbc343c5e3b7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.323536 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/71b8a459-9616-4919-a93a-dbc343c5e3b7-trusted-ca\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.323560 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/71b8a459-9616-4919-a93a-dbc343c5e3b7-registry-tls\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.323600 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42q4b\" (UniqueName: \"kubernetes.io/projected/71b8a459-9616-4919-a93a-dbc343c5e3b7-kube-api-access-42q4b\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.323642 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/71b8a459-9616-4919-a93a-dbc343c5e3b7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.323668 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/71b8a459-9616-4919-a93a-dbc343c5e3b7-bound-sa-token\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.323703 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/71b8a459-9616-4919-a93a-dbc343c5e3b7-registry-certificates\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.324056 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/71b8a459-9616-4919-a93a-dbc343c5e3b7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.324794 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/71b8a459-9616-4919-a93a-dbc343c5e3b7-registry-certificates\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.324986 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/71b8a459-9616-4919-a93a-dbc343c5e3b7-trusted-ca\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.330078 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/71b8a459-9616-4919-a93a-dbc343c5e3b7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.335962 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/71b8a459-9616-4919-a93a-dbc343c5e3b7-registry-tls\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.339151 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42q4b\" (UniqueName: \"kubernetes.io/projected/71b8a459-9616-4919-a93a-dbc343c5e3b7-kube-api-access-42q4b\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.350442 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/71b8a459-9616-4919-a93a-dbc343c5e3b7-bound-sa-token\") pod \"image-registry-66df7c8f76-dlqh5\" (UID: \"71b8a459-9616-4919-a93a-dbc343c5e3b7\") " pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.366760 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:58 crc kubenswrapper[4973]: I0320 13:27:58.757614 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dlqh5"]
Mar 20 13:27:59 crc kubenswrapper[4973]: I0320 13:27:59.629525 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5" event={"ID":"71b8a459-9616-4919-a93a-dbc343c5e3b7","Type":"ContainerStarted","Data":"730e03fe44c886e176cf0c7e9690f62ddf379fdd40460127cc4be2399d3cc06f"}
Mar 20 13:27:59 crc kubenswrapper[4973]: I0320 13:27:59.629606 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5" event={"ID":"71b8a459-9616-4919-a93a-dbc343c5e3b7","Type":"ContainerStarted","Data":"01cd52e3ec175861afb9c767c9428dd22806c7e26bf50c21f554b4669231027b"}
Mar 20 13:27:59 crc kubenswrapper[4973]: I0320 13:27:59.629628 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:27:59 crc kubenswrapper[4973]: I0320 13:27:59.648703 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5" podStartSLOduration=1.6486806 podStartE2EDuration="1.6486806s" podCreationTimestamp="2026-03-20 13:27:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:27:59.646234893 +0000 UTC m=+400.389904637" watchObservedRunningTime="2026-03-20 13:27:59.6486806 +0000 UTC m=+400.392350344"
Mar 20 13:28:00 crc kubenswrapper[4973]: I0320 13:28:00.175910 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566888-z9k9w"]
Mar 20 13:28:00 crc kubenswrapper[4973]: I0320 13:28:00.176955 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566888-z9k9w"
Mar 20 13:28:00 crc kubenswrapper[4973]: I0320 13:28:00.179771 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc"
Mar 20 13:28:00 crc kubenswrapper[4973]: I0320 13:28:00.179935 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 13:28:00 crc kubenswrapper[4973]: I0320 13:28:00.181068 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 13:28:00 crc kubenswrapper[4973]: I0320 13:28:00.184147 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566888-z9k9w"]
Mar 20 13:28:00 crc kubenswrapper[4973]: I0320 13:28:00.355777 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lcxq\" (UniqueName: \"kubernetes.io/projected/a10661fb-7db3-4aa7-b7f7-ffaaacb3999d-kube-api-access-2lcxq\") pod \"auto-csr-approver-29566888-z9k9w\" (UID: \"a10661fb-7db3-4aa7-b7f7-ffaaacb3999d\") " pod="openshift-infra/auto-csr-approver-29566888-z9k9w"
Mar 20 13:28:00 crc kubenswrapper[4973]: I0320 13:28:00.457486 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lcxq\" (UniqueName: \"kubernetes.io/projected/a10661fb-7db3-4aa7-b7f7-ffaaacb3999d-kube-api-access-2lcxq\") pod \"auto-csr-approver-29566888-z9k9w\" (UID: \"a10661fb-7db3-4aa7-b7f7-ffaaacb3999d\") " pod="openshift-infra/auto-csr-approver-29566888-z9k9w"
Mar 20 13:28:00 crc kubenswrapper[4973]: I0320 13:28:00.478644 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lcxq\" (UniqueName: \"kubernetes.io/projected/a10661fb-7db3-4aa7-b7f7-ffaaacb3999d-kube-api-access-2lcxq\") pod \"auto-csr-approver-29566888-z9k9w\" (UID: \"a10661fb-7db3-4aa7-b7f7-ffaaacb3999d\") " pod="openshift-infra/auto-csr-approver-29566888-z9k9w"
Mar 20 13:28:00 crc kubenswrapper[4973]: I0320 13:28:00.494983 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566888-z9k9w"
Mar 20 13:28:00 crc kubenswrapper[4973]: I0320 13:28:00.938485 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566888-z9k9w"]
Mar 20 13:28:01 crc kubenswrapper[4973]: I0320 13:28:01.649129 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566888-z9k9w" event={"ID":"a10661fb-7db3-4aa7-b7f7-ffaaacb3999d","Type":"ContainerStarted","Data":"c59e84e21ed33dd38eab4cd54f64e51fa1b32d99603c9e47c52784488081131f"}
Mar 20 13:28:02 crc kubenswrapper[4973]: I0320 13:28:02.662824 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566888-z9k9w" event={"ID":"a10661fb-7db3-4aa7-b7f7-ffaaacb3999d","Type":"ContainerStarted","Data":"0c6475e0b0e8b115d461fbcaa22f96cac44bb08baf3a011339b3ff6758eb9f8b"}
Mar 20 13:28:02 crc kubenswrapper[4973]: I0320 13:28:02.694447 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566888-z9k9w" podStartSLOduration=1.264659553 podStartE2EDuration="2.694424083s" podCreationTimestamp="2026-03-20 13:28:00 +0000 UTC" firstStartedPulling="2026-03-20 13:28:00.951817061 +0000 UTC m=+401.695486835" lastFinishedPulling="2026-03-20 13:28:02.381581631 +0000 UTC m=+403.125251365" observedRunningTime="2026-03-20 13:28:02.679654299 +0000 UTC m=+403.423324053" watchObservedRunningTime="2026-03-20 13:28:02.694424083 +0000 UTC m=+403.438093827"
Mar 20 13:28:03 crc kubenswrapper[4973]: I0320 13:28:03.670975 4973 generic.go:334] "Generic (PLEG): container finished" podID="a10661fb-7db3-4aa7-b7f7-ffaaacb3999d" containerID="0c6475e0b0e8b115d461fbcaa22f96cac44bb08baf3a011339b3ff6758eb9f8b" exitCode=0
Mar 20 13:28:03 crc kubenswrapper[4973]: I0320 13:28:03.671027 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566888-z9k9w" event={"ID":"a10661fb-7db3-4aa7-b7f7-ffaaacb3999d","Type":"ContainerDied","Data":"0c6475e0b0e8b115d461fbcaa22f96cac44bb08baf3a011339b3ff6758eb9f8b"}
Mar 20 13:28:04 crc kubenswrapper[4973]: I0320 13:28:04.971234 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566888-z9k9w"
Mar 20 13:28:05 crc kubenswrapper[4973]: I0320 13:28:05.124307 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lcxq\" (UniqueName: \"kubernetes.io/projected/a10661fb-7db3-4aa7-b7f7-ffaaacb3999d-kube-api-access-2lcxq\") pod \"a10661fb-7db3-4aa7-b7f7-ffaaacb3999d\" (UID: \"a10661fb-7db3-4aa7-b7f7-ffaaacb3999d\") "
Mar 20 13:28:05 crc kubenswrapper[4973]: I0320 13:28:05.129644 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a10661fb-7db3-4aa7-b7f7-ffaaacb3999d-kube-api-access-2lcxq" (OuterVolumeSpecName: "kube-api-access-2lcxq") pod "a10661fb-7db3-4aa7-b7f7-ffaaacb3999d" (UID: "a10661fb-7db3-4aa7-b7f7-ffaaacb3999d"). InnerVolumeSpecName "kube-api-access-2lcxq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:28:05 crc kubenswrapper[4973]: I0320 13:28:05.226267 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lcxq\" (UniqueName: \"kubernetes.io/projected/a10661fb-7db3-4aa7-b7f7-ffaaacb3999d-kube-api-access-2lcxq\") on node \"crc\" DevicePath \"\""
Mar 20 13:28:05 crc kubenswrapper[4973]: I0320 13:28:05.683064 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566888-z9k9w" event={"ID":"a10661fb-7db3-4aa7-b7f7-ffaaacb3999d","Type":"ContainerDied","Data":"c59e84e21ed33dd38eab4cd54f64e51fa1b32d99603c9e47c52784488081131f"}
Mar 20 13:28:05 crc kubenswrapper[4973]: I0320 13:28:05.683118 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c59e84e21ed33dd38eab4cd54f64e51fa1b32d99603c9e47c52784488081131f"
Mar 20 13:28:05 crc kubenswrapper[4973]: I0320 13:28:05.683154 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566888-z9k9w"
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.077872 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" podUID="b1fe291e-3490-49c0-9443-e5b0f03db19c" containerName="oauth-openshift" containerID="cri-o://8fc3ae1a34dca9e6b8bce0b1ad1a59852acffafd2e607e0f20d3875dfcaa1051" gracePeriod=15
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.372687 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-dlqh5"
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.427027 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r57q7"]
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.501513 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw"
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.535517 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-567cd76c58-zvtsd"]
Mar 20 13:28:18 crc kubenswrapper[4973]: E0320 13:28:18.535743 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a10661fb-7db3-4aa7-b7f7-ffaaacb3999d" containerName="oc"
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.535755 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a10661fb-7db3-4aa7-b7f7-ffaaacb3999d" containerName="oc"
Mar 20 13:28:18 crc kubenswrapper[4973]: E0320 13:28:18.535770 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1fe291e-3490-49c0-9443-e5b0f03db19c" containerName="oauth-openshift"
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.535775 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1fe291e-3490-49c0-9443-e5b0f03db19c" containerName="oauth-openshift"
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.535857 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="a10661fb-7db3-4aa7-b7f7-ffaaacb3999d" containerName="oc"
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.535873 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1fe291e-3490-49c0-9443-e5b0f03db19c" containerName="oauth-openshift"
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.536245 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd"
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.549752 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-567cd76c58-zvtsd"]
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.626478 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-router-certs\") pod \"b1fe291e-3490-49c0-9443-e5b0f03db19c\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") "
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.626560 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b1fe291e-3490-49c0-9443-e5b0f03db19c-audit-dir\") pod \"b1fe291e-3490-49c0-9443-e5b0f03db19c\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") "
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.626582 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-audit-policies\") pod \"b1fe291e-3490-49c0-9443-e5b0f03db19c\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") "
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.626601 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-session\") pod \"b1fe291e-3490-49c0-9443-e5b0f03db19c\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") "
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.626627 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr5bs\" (UniqueName: \"kubernetes.io/projected/b1fe291e-3490-49c0-9443-e5b0f03db19c-kube-api-access-jr5bs\") pod \"b1fe291e-3490-49c0-9443-e5b0f03db19c\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") "
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.626645 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-template-provider-selection\") pod \"b1fe291e-3490-49c0-9443-e5b0f03db19c\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") "
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.626646 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b1fe291e-3490-49c0-9443-e5b0f03db19c-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "b1fe291e-3490-49c0-9443-e5b0f03db19c" (UID: "b1fe291e-3490-49c0-9443-e5b0f03db19c"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.626663 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-template-error\") pod \"b1fe291e-3490-49c0-9443-e5b0f03db19c\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") "
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.626733 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-trusted-ca-bundle\") pod \"b1fe291e-3490-49c0-9443-e5b0f03db19c\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") "
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.626799 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-serving-cert\") pod \"b1fe291e-3490-49c0-9443-e5b0f03db19c\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") "
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.626827 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-template-login\") pod \"b1fe291e-3490-49c0-9443-e5b0f03db19c\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") "
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.626856 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-cliconfig\") pod \"b1fe291e-3490-49c0-9443-e5b0f03db19c\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") "
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.626899 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-idp-0-file-data\") pod \"b1fe291e-3490-49c0-9443-e5b0f03db19c\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") "
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.626935 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-service-ca\") pod \"b1fe291e-3490-49c0-9443-e5b0f03db19c\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") "
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.626969 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-ocp-branding-template\") pod \"b1fe291e-3490-49c0-9443-e5b0f03db19c\" (UID: \"b1fe291e-3490-49c0-9443-e5b0f03db19c\") "
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.627211 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fcfe2a5b-dd17-425e-8024-655606a1c470-audit-dir\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd"
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.627254 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd"
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.627292 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-user-template-login\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd"
Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.627359 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-serving-cert\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") "
pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.627401 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-service-ca\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.627441 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.627484 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.627526 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-cliconfig\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 
13:28:18.627569 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-session\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.627599 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-router-certs\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.627623 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-user-template-error\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.627676 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8pqrn\" (UniqueName: \"kubernetes.io/projected/fcfe2a5b-dd17-425e-8024-655606a1c470-kube-api-access-8pqrn\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.627701 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/fcfe2a5b-dd17-425e-8024-655606a1c470-audit-policies\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.627715 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "b1fe291e-3490-49c0-9443-e5b0f03db19c" (UID: "b1fe291e-3490-49c0-9443-e5b0f03db19c"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.627737 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "b1fe291e-3490-49c0-9443-e5b0f03db19c" (UID: "b1fe291e-3490-49c0-9443-e5b0f03db19c"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.627741 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.627915 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.627936 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.627951 4973 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b1fe291e-3490-49c0-9443-e5b0f03db19c-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.628206 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "b1fe291e-3490-49c0-9443-e5b0f03db19c" (UID: "b1fe291e-3490-49c0-9443-e5b0f03db19c"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.628251 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "b1fe291e-3490-49c0-9443-e5b0f03db19c" (UID: "b1fe291e-3490-49c0-9443-e5b0f03db19c"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.631951 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "b1fe291e-3490-49c0-9443-e5b0f03db19c" (UID: "b1fe291e-3490-49c0-9443-e5b0f03db19c"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.632232 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "b1fe291e-3490-49c0-9443-e5b0f03db19c" (UID: "b1fe291e-3490-49c0-9443-e5b0f03db19c"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.632658 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "b1fe291e-3490-49c0-9443-e5b0f03db19c" (UID: "b1fe291e-3490-49c0-9443-e5b0f03db19c"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.632700 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "b1fe291e-3490-49c0-9443-e5b0f03db19c" (UID: "b1fe291e-3490-49c0-9443-e5b0f03db19c"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.638506 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "b1fe291e-3490-49c0-9443-e5b0f03db19c" (UID: "b1fe291e-3490-49c0-9443-e5b0f03db19c"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.638586 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1fe291e-3490-49c0-9443-e5b0f03db19c-kube-api-access-jr5bs" (OuterVolumeSpecName: "kube-api-access-jr5bs") pod "b1fe291e-3490-49c0-9443-e5b0f03db19c" (UID: "b1fe291e-3490-49c0-9443-e5b0f03db19c"). InnerVolumeSpecName "kube-api-access-jr5bs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.638849 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "b1fe291e-3490-49c0-9443-e5b0f03db19c" (UID: "b1fe291e-3490-49c0-9443-e5b0f03db19c"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.639076 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "b1fe291e-3490-49c0-9443-e5b0f03db19c" (UID: "b1fe291e-3490-49c0-9443-e5b0f03db19c"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.639323 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "b1fe291e-3490-49c0-9443-e5b0f03db19c" (UID: "b1fe291e-3490-49c0-9443-e5b0f03db19c"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.729159 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-service-ca\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.729237 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.729275 4973 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.729305 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-cliconfig\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.729349 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-session\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.729374 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-router-certs\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.729396 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-user-template-error\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.729441 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8pqrn\" (UniqueName: \"kubernetes.io/projected/fcfe2a5b-dd17-425e-8024-655606a1c470-kube-api-access-8pqrn\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.729614 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fcfe2a5b-dd17-425e-8024-655606a1c470-audit-policies\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.729723 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.729921 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fcfe2a5b-dd17-425e-8024-655606a1c470-audit-dir\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 
13:28:18.730190 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-service-ca\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.729750 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/fcfe2a5b-dd17-425e-8024-655606a1c470-audit-dir\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.730276 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.730304 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-user-template-login\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.730365 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.730454 4973 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.730467 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.730482 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr5bs\" (UniqueName: \"kubernetes.io/projected/b1fe291e-3490-49c0-9443-e5b0f03db19c-kube-api-access-jr5bs\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.730494 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.730504 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.730516 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.730525 4973 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.730537 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.730551 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.730562 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.730574 4973 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b1fe291e-3490-49c0-9443-e5b0f03db19c-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.730718 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/fcfe2a5b-dd17-425e-8024-655606a1c470-audit-policies\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.731320 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-cliconfig\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.731874 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.733529 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-router-certs\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.733533 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.734453 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: 
\"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.735049 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-user-template-login\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.735064 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-serving-cert\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.735129 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.735751 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-user-template-error\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.736313 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/fcfe2a5b-dd17-425e-8024-655606a1c470-v4-0-config-system-session\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.747912 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pqrn\" (UniqueName: \"kubernetes.io/projected/fcfe2a5b-dd17-425e-8024-655606a1c470-kube-api-access-8pqrn\") pod \"oauth-openshift-567cd76c58-zvtsd\" (UID: \"fcfe2a5b-dd17-425e-8024-655606a1c470\") " pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.764826 4973 generic.go:334] "Generic (PLEG): container finished" podID="b1fe291e-3490-49c0-9443-e5b0f03db19c" containerID="8fc3ae1a34dca9e6b8bce0b1ad1a59852acffafd2e607e0f20d3875dfcaa1051" exitCode=0 Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.764877 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" event={"ID":"b1fe291e-3490-49c0-9443-e5b0f03db19c","Type":"ContainerDied","Data":"8fc3ae1a34dca9e6b8bce0b1ad1a59852acffafd2e607e0f20d3875dfcaa1051"} Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.764898 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.764919 4973 scope.go:117] "RemoveContainer" containerID="8fc3ae1a34dca9e6b8bce0b1ad1a59852acffafd2e607e0f20d3875dfcaa1051" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.764907 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-tl6hw" event={"ID":"b1fe291e-3490-49c0-9443-e5b0f03db19c","Type":"ContainerDied","Data":"74e0a04a31c0c8b76fe37fed06f1b575b045e1f31557417141c74f81a9d51547"} Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.781393 4973 scope.go:117] "RemoveContainer" containerID="8fc3ae1a34dca9e6b8bce0b1ad1a59852acffafd2e607e0f20d3875dfcaa1051" Mar 20 13:28:18 crc kubenswrapper[4973]: E0320 13:28:18.781934 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fc3ae1a34dca9e6b8bce0b1ad1a59852acffafd2e607e0f20d3875dfcaa1051\": container with ID starting with 8fc3ae1a34dca9e6b8bce0b1ad1a59852acffafd2e607e0f20d3875dfcaa1051 not found: ID does not exist" containerID="8fc3ae1a34dca9e6b8bce0b1ad1a59852acffafd2e607e0f20d3875dfcaa1051" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.781967 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fc3ae1a34dca9e6b8bce0b1ad1a59852acffafd2e607e0f20d3875dfcaa1051"} err="failed to get container status \"8fc3ae1a34dca9e6b8bce0b1ad1a59852acffafd2e607e0f20d3875dfcaa1051\": rpc error: code = NotFound desc = could not find container \"8fc3ae1a34dca9e6b8bce0b1ad1a59852acffafd2e607e0f20d3875dfcaa1051\": container with ID starting with 8fc3ae1a34dca9e6b8bce0b1ad1a59852acffafd2e607e0f20d3875dfcaa1051 not found: ID does not exist" Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.804751 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-tl6hw"] Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.813815 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-tl6hw"] Mar 20 13:28:18 crc kubenswrapper[4973]: I0320 13:28:18.852004 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:19 crc kubenswrapper[4973]: I0320 13:28:19.279429 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-567cd76c58-zvtsd"] Mar 20 13:28:19 crc kubenswrapper[4973]: I0320 13:28:19.772969 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" event={"ID":"fcfe2a5b-dd17-425e-8024-655606a1c470","Type":"ContainerStarted","Data":"50b08c96da284276ec3922bbcaf36d8ae5d0756a56d050f5fad7267d3bd5db7d"} Mar 20 13:28:19 crc kubenswrapper[4973]: I0320 13:28:19.773411 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" event={"ID":"fcfe2a5b-dd17-425e-8024-655606a1c470","Type":"ContainerStarted","Data":"e1582b9a643729d032b3aedb309c4bc9b3ad0b774b81a524e3dd2b421f29e67c"} Mar 20 13:28:19 crc kubenswrapper[4973]: I0320 13:28:19.776450 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:19 crc kubenswrapper[4973]: I0320 13:28:19.794808 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" podStartSLOduration=26.794786967 podStartE2EDuration="26.794786967s" podCreationTimestamp="2026-03-20 13:27:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:28:19.790226613 +0000 UTC 
m=+420.533896357" watchObservedRunningTime="2026-03-20 13:28:19.794786967 +0000 UTC m=+420.538456711" Mar 20 13:28:19 crc kubenswrapper[4973]: I0320 13:28:19.936565 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" Mar 20 13:28:19 crc kubenswrapper[4973]: I0320 13:28:19.959455 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1fe291e-3490-49c0-9443-e5b0f03db19c" path="/var/lib/kubelet/pods/b1fe291e-3490-49c0-9443-e5b0f03db19c/volumes" Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.321062 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.323582 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.472297 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" podUID="a4e363de-fd5c-4f76-8943-ae3c56f3765b" containerName="registry" containerID="cri-o://8c0b5b254693bc98c893506550a9f2a364c6013e1f0d8c980544958121375ac9" gracePeriod=30 Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.892214 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.921319 4973 generic.go:334] "Generic (PLEG): container finished" podID="a4e363de-fd5c-4f76-8943-ae3c56f3765b" containerID="8c0b5b254693bc98c893506550a9f2a364c6013e1f0d8c980544958121375ac9" exitCode=0 Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.921378 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" event={"ID":"a4e363de-fd5c-4f76-8943-ae3c56f3765b","Type":"ContainerDied","Data":"8c0b5b254693bc98c893506550a9f2a364c6013e1f0d8c980544958121375ac9"} Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.921401 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" event={"ID":"a4e363de-fd5c-4f76-8943-ae3c56f3765b","Type":"ContainerDied","Data":"65ec8861ba4d159f50d1cd266216b5e762a531b244f8930afaa5447c5954c07a"} Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.921421 4973 scope.go:117] "RemoveContainer" containerID="8c0b5b254693bc98c893506550a9f2a364c6013e1f0d8c980544958121375ac9" Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.921454 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-r57q7" Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.944714 4973 scope.go:117] "RemoveContainer" containerID="8c0b5b254693bc98c893506550a9f2a364c6013e1f0d8c980544958121375ac9" Mar 20 13:28:43 crc kubenswrapper[4973]: E0320 13:28:43.945202 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c0b5b254693bc98c893506550a9f2a364c6013e1f0d8c980544958121375ac9\": container with ID starting with 8c0b5b254693bc98c893506550a9f2a364c6013e1f0d8c980544958121375ac9 not found: ID does not exist" containerID="8c0b5b254693bc98c893506550a9f2a364c6013e1f0d8c980544958121375ac9" Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.945331 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c0b5b254693bc98c893506550a9f2a364c6013e1f0d8c980544958121375ac9"} err="failed to get container status \"8c0b5b254693bc98c893506550a9f2a364c6013e1f0d8c980544958121375ac9\": rpc error: code = NotFound desc = could not find container \"8c0b5b254693bc98c893506550a9f2a364c6013e1f0d8c980544958121375ac9\": container with ID starting with 8c0b5b254693bc98c893506550a9f2a364c6013e1f0d8c980544958121375ac9 not found: ID does not exist" Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.990827 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a4e363de-fd5c-4f76-8943-ae3c56f3765b-ca-trust-extracted\") pod \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.990907 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a4e363de-fd5c-4f76-8943-ae3c56f3765b-installation-pull-secrets\") pod 
\"a4e363de-fd5c-4f76-8943-ae3c56f3765b\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.990931 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a4e363de-fd5c-4f76-8943-ae3c56f3765b-registry-certificates\") pod \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.990980 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz7fv\" (UniqueName: \"kubernetes.io/projected/a4e363de-fd5c-4f76-8943-ae3c56f3765b-kube-api-access-xz7fv\") pod \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.991004 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4e363de-fd5c-4f76-8943-ae3c56f3765b-registry-tls\") pod \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.991188 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.991231 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4e363de-fd5c-4f76-8943-ae3c56f3765b-trusted-ca\") pod \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.991249 4973 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4e363de-fd5c-4f76-8943-ae3c56f3765b-bound-sa-token\") pod \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\" (UID: \"a4e363de-fd5c-4f76-8943-ae3c56f3765b\") " Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.991932 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4e363de-fd5c-4f76-8943-ae3c56f3765b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a4e363de-fd5c-4f76-8943-ae3c56f3765b" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.993275 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4e363de-fd5c-4f76-8943-ae3c56f3765b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a4e363de-fd5c-4f76-8943-ae3c56f3765b" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.998054 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4e363de-fd5c-4f76-8943-ae3c56f3765b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a4e363de-fd5c-4f76-8943-ae3c56f3765b" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:43 crc kubenswrapper[4973]: I0320 13:28:43.998710 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a4e363de-fd5c-4f76-8943-ae3c56f3765b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a4e363de-fd5c-4f76-8943-ae3c56f3765b" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b"). 
InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:44 crc kubenswrapper[4973]: I0320 13:28:44.001237 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4e363de-fd5c-4f76-8943-ae3c56f3765b-kube-api-access-xz7fv" (OuterVolumeSpecName: "kube-api-access-xz7fv") pod "a4e363de-fd5c-4f76-8943-ae3c56f3765b" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b"). InnerVolumeSpecName "kube-api-access-xz7fv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:44 crc kubenswrapper[4973]: I0320 13:28:44.002453 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4e363de-fd5c-4f76-8943-ae3c56f3765b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a4e363de-fd5c-4f76-8943-ae3c56f3765b" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:44 crc kubenswrapper[4973]: I0320 13:28:44.006201 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a4e363de-fd5c-4f76-8943-ae3c56f3765b" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 13:28:44 crc kubenswrapper[4973]: I0320 13:28:44.006829 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4e363de-fd5c-4f76-8943-ae3c56f3765b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a4e363de-fd5c-4f76-8943-ae3c56f3765b" (UID: "a4e363de-fd5c-4f76-8943-ae3c56f3765b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:28:44 crc kubenswrapper[4973]: I0320 13:28:44.092576 4973 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a4e363de-fd5c-4f76-8943-ae3c56f3765b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:44 crc kubenswrapper[4973]: I0320 13:28:44.092992 4973 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a4e363de-fd5c-4f76-8943-ae3c56f3765b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:44 crc kubenswrapper[4973]: I0320 13:28:44.093015 4973 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a4e363de-fd5c-4f76-8943-ae3c56f3765b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:44 crc kubenswrapper[4973]: I0320 13:28:44.093030 4973 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a4e363de-fd5c-4f76-8943-ae3c56f3765b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:44 crc kubenswrapper[4973]: I0320 13:28:44.093044 4973 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a4e363de-fd5c-4f76-8943-ae3c56f3765b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:44 crc kubenswrapper[4973]: I0320 13:28:44.093078 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz7fv\" (UniqueName: \"kubernetes.io/projected/a4e363de-fd5c-4f76-8943-ae3c56f3765b-kube-api-access-xz7fv\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:44 crc kubenswrapper[4973]: I0320 13:28:44.093103 4973 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a4e363de-fd5c-4f76-8943-ae3c56f3765b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:44 crc 
kubenswrapper[4973]: I0320 13:28:44.267986 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r57q7"] Mar 20 13:28:44 crc kubenswrapper[4973]: I0320 13:28:44.272071 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-r57q7"] Mar 20 13:28:45 crc kubenswrapper[4973]: I0320 13:28:45.959302 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4e363de-fd5c-4f76-8943-ae3c56f3765b" path="/var/lib/kubelet/pods/a4e363de-fd5c-4f76-8943-ae3c56f3765b/volumes" Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.592437 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5jzd6"] Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.593321 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-5jzd6" podUID="2115631d-0f02-4cb4-bfee-e18dd87a0462" containerName="registry-server" containerID="cri-o://0395673bbf4bceba4bfbfacd5ed9da9de3375db709c57b139dce03e8d570c3e0" gracePeriod=30 Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.610507 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-plc2f"] Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.610817 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-plc2f" podUID="8f429634-2787-4daa-a443-e4ab84f2e6b7" containerName="registry-server" containerID="cri-o://18d92bd3a9ca7915dd4ae5d8f43927e1ae8256cc06c76bd242c200dfe3b90b44" gracePeriod=30 Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.621098 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l7crv"] Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.621328 4973 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" podUID="b0cbdcce-514f-4b72-8c8c-17029b7217a8" containerName="marketplace-operator" containerID="cri-o://be1af85491e1da9d562848f760f2f64ec735570af9790acaa9d5ea8b6763d507" gracePeriod=30 Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.627353 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c7jw4"] Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.627649 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c7jw4" podUID="0993b0a3-f604-4447-bce2-01636b061230" containerName="registry-server" containerID="cri-o://e25d1c90ec9a04104b2de18694da6c5b694a1131ea348f7eebf9bc63d2b6795f" gracePeriod=30 Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.641661 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6p25c"] Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.642076 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6p25c" podUID="097e9042-52e2-4a7e-b567-5b97f34242d6" containerName="registry-server" containerID="cri-o://96aa2c4308d9871bde168cb5796161dfe73221250faa8c2c6a47bfb8cd322984" gracePeriod=30 Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.648285 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kfft6"] Mar 20 13:28:54 crc kubenswrapper[4973]: E0320 13:28:54.650048 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4e363de-fd5c-4f76-8943-ae3c56f3765b" containerName="registry" Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.650075 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4e363de-fd5c-4f76-8943-ae3c56f3765b" containerName="registry" Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.650250 4973 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a4e363de-fd5c-4f76-8943-ae3c56f3765b" containerName="registry" Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.650729 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.668297 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kfft6"] Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.808701 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2843ad35-cfc0-4922-8b96-cebb15694c99-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kfft6\" (UID: \"2843ad35-cfc0-4922-8b96-cebb15694c99\") " pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.808773 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2843ad35-cfc0-4922-8b96-cebb15694c99-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kfft6\" (UID: \"2843ad35-cfc0-4922-8b96-cebb15694c99\") " pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.808792 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcc74\" (UniqueName: \"kubernetes.io/projected/2843ad35-cfc0-4922-8b96-cebb15694c99-kube-api-access-pcc74\") pod \"marketplace-operator-79b997595-kfft6\" (UID: \"2843ad35-cfc0-4922-8b96-cebb15694c99\") " pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.909514 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2843ad35-cfc0-4922-8b96-cebb15694c99-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kfft6\" (UID: \"2843ad35-cfc0-4922-8b96-cebb15694c99\") " pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.909560 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcc74\" (UniqueName: \"kubernetes.io/projected/2843ad35-cfc0-4922-8b96-cebb15694c99-kube-api-access-pcc74\") pod \"marketplace-operator-79b997595-kfft6\" (UID: \"2843ad35-cfc0-4922-8b96-cebb15694c99\") " pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.909611 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2843ad35-cfc0-4922-8b96-cebb15694c99-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kfft6\" (UID: \"2843ad35-cfc0-4922-8b96-cebb15694c99\") " pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.911094 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/2843ad35-cfc0-4922-8b96-cebb15694c99-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kfft6\" (UID: \"2843ad35-cfc0-4922-8b96-cebb15694c99\") " pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.915394 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/2843ad35-cfc0-4922-8b96-cebb15694c99-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kfft6\" (UID: \"2843ad35-cfc0-4922-8b96-cebb15694c99\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" Mar 20 13:28:54 crc kubenswrapper[4973]: I0320 13:28:54.927238 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcc74\" (UniqueName: \"kubernetes.io/projected/2843ad35-cfc0-4922-8b96-cebb15694c99-kube-api-access-pcc74\") pod \"marketplace-operator-79b997595-kfft6\" (UID: \"2843ad35-cfc0-4922-8b96-cebb15694c99\") " pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.008618 4973 generic.go:334] "Generic (PLEG): container finished" podID="2115631d-0f02-4cb4-bfee-e18dd87a0462" containerID="0395673bbf4bceba4bfbfacd5ed9da9de3375db709c57b139dce03e8d570c3e0" exitCode=0 Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.008686 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jzd6" event={"ID":"2115631d-0f02-4cb4-bfee-e18dd87a0462","Type":"ContainerDied","Data":"0395673bbf4bceba4bfbfacd5ed9da9de3375db709c57b139dce03e8d570c3e0"} Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.010955 4973 generic.go:334] "Generic (PLEG): container finished" podID="0993b0a3-f604-4447-bce2-01636b061230" containerID="e25d1c90ec9a04104b2de18694da6c5b694a1131ea348f7eebf9bc63d2b6795f" exitCode=0 Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.011018 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c7jw4" event={"ID":"0993b0a3-f604-4447-bce2-01636b061230","Type":"ContainerDied","Data":"e25d1c90ec9a04104b2de18694da6c5b694a1131ea348f7eebf9bc63d2b6795f"} Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.015828 4973 generic.go:334] "Generic (PLEG): container finished" podID="b0cbdcce-514f-4b72-8c8c-17029b7217a8" containerID="be1af85491e1da9d562848f760f2f64ec735570af9790acaa9d5ea8b6763d507" exitCode=0 Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.015924 4973 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" event={"ID":"b0cbdcce-514f-4b72-8c8c-17029b7217a8","Type":"ContainerDied","Data":"be1af85491e1da9d562848f760f2f64ec735570af9790acaa9d5ea8b6763d507"} Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.015999 4973 scope.go:117] "RemoveContainer" containerID="30bd3c695e315318c275f785e37f16f20888a754072f832bf4604eabf6fbe2cf" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.018857 4973 generic.go:334] "Generic (PLEG): container finished" podID="097e9042-52e2-4a7e-b567-5b97f34242d6" containerID="96aa2c4308d9871bde168cb5796161dfe73221250faa8c2c6a47bfb8cd322984" exitCode=0 Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.018926 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p25c" event={"ID":"097e9042-52e2-4a7e-b567-5b97f34242d6","Type":"ContainerDied","Data":"96aa2c4308d9871bde168cb5796161dfe73221250faa8c2c6a47bfb8cd322984"} Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.021089 4973 generic.go:334] "Generic (PLEG): container finished" podID="8f429634-2787-4daa-a443-e4ab84f2e6b7" containerID="18d92bd3a9ca7915dd4ae5d8f43927e1ae8256cc06c76bd242c200dfe3b90b44" exitCode=0 Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.021116 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plc2f" event={"ID":"8f429634-2787-4daa-a443-e4ab84f2e6b7","Type":"ContainerDied","Data":"18d92bd3a9ca7915dd4ae5d8f43927e1ae8256cc06c76bd242c200dfe3b90b44"} Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.021138 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-plc2f" event={"ID":"8f429634-2787-4daa-a443-e4ab84f2e6b7","Type":"ContainerDied","Data":"2bfbadf6b0db245fff8160e3566710c30f4a4f50fcd5dcdf5a01db3dda0a749b"} Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.021162 4973 pod_container_deletor.go:80] "Container not found 
in pod's containers" containerID="2bfbadf6b0db245fff8160e3566710c30f4a4f50fcd5dcdf5a01db3dda0a749b" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.064161 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.066613 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-plc2f" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.074469 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5jzd6" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.118148 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6p25c" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.145153 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.155122 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c7jw4" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.212639 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2115631d-0f02-4cb4-bfee-e18dd87a0462-catalog-content\") pod \"2115631d-0f02-4cb4-bfee-e18dd87a0462\" (UID: \"2115631d-0f02-4cb4-bfee-e18dd87a0462\") " Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.212741 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f429634-2787-4daa-a443-e4ab84f2e6b7-utilities\") pod \"8f429634-2787-4daa-a443-e4ab84f2e6b7\" (UID: \"8f429634-2787-4daa-a443-e4ab84f2e6b7\") " Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.212784 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-glj8x\" (UniqueName: \"kubernetes.io/projected/2115631d-0f02-4cb4-bfee-e18dd87a0462-kube-api-access-glj8x\") pod \"2115631d-0f02-4cb4-bfee-e18dd87a0462\" (UID: \"2115631d-0f02-4cb4-bfee-e18dd87a0462\") " Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.212849 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8sms\" (UniqueName: \"kubernetes.io/projected/8f429634-2787-4daa-a443-e4ab84f2e6b7-kube-api-access-f8sms\") pod \"8f429634-2787-4daa-a443-e4ab84f2e6b7\" (UID: \"8f429634-2787-4daa-a443-e4ab84f2e6b7\") " Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.212880 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2115631d-0f02-4cb4-bfee-e18dd87a0462-utilities\") pod \"2115631d-0f02-4cb4-bfee-e18dd87a0462\" (UID: \"2115631d-0f02-4cb4-bfee-e18dd87a0462\") " Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.212921 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f429634-2787-4daa-a443-e4ab84f2e6b7-catalog-content\") pod \"8f429634-2787-4daa-a443-e4ab84f2e6b7\" (UID: \"8f429634-2787-4daa-a443-e4ab84f2e6b7\") " Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.217403 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2115631d-0f02-4cb4-bfee-e18dd87a0462-utilities" (OuterVolumeSpecName: "utilities") pod "2115631d-0f02-4cb4-bfee-e18dd87a0462" (UID: "2115631d-0f02-4cb4-bfee-e18dd87a0462"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.217990 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f429634-2787-4daa-a443-e4ab84f2e6b7-utilities" (OuterVolumeSpecName: "utilities") pod "8f429634-2787-4daa-a443-e4ab84f2e6b7" (UID: "8f429634-2787-4daa-a443-e4ab84f2e6b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.227410 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2115631d-0f02-4cb4-bfee-e18dd87a0462-kube-api-access-glj8x" (OuterVolumeSpecName: "kube-api-access-glj8x") pod "2115631d-0f02-4cb4-bfee-e18dd87a0462" (UID: "2115631d-0f02-4cb4-bfee-e18dd87a0462"). InnerVolumeSpecName "kube-api-access-glj8x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.229015 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f429634-2787-4daa-a443-e4ab84f2e6b7-kube-api-access-f8sms" (OuterVolumeSpecName: "kube-api-access-f8sms") pod "8f429634-2787-4daa-a443-e4ab84f2e6b7" (UID: "8f429634-2787-4daa-a443-e4ab84f2e6b7"). InnerVolumeSpecName "kube-api-access-f8sms". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.284047 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2115631d-0f02-4cb4-bfee-e18dd87a0462-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2115631d-0f02-4cb4-bfee-e18dd87a0462" (UID: "2115631d-0f02-4cb4-bfee-e18dd87a0462"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.289971 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f429634-2787-4daa-a443-e4ab84f2e6b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f429634-2787-4daa-a443-e4ab84f2e6b7" (UID: "8f429634-2787-4daa-a443-e4ab84f2e6b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.314554 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0cbdcce-514f-4b72-8c8c-17029b7217a8-marketplace-trusted-ca\") pod \"b0cbdcce-514f-4b72-8c8c-17029b7217a8\" (UID: \"b0cbdcce-514f-4b72-8c8c-17029b7217a8\") " Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.314609 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b0cbdcce-514f-4b72-8c8c-17029b7217a8-marketplace-operator-metrics\") pod \"b0cbdcce-514f-4b72-8c8c-17029b7217a8\" (UID: \"b0cbdcce-514f-4b72-8c8c-17029b7217a8\") " Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.314666 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-64hqq\" (UniqueName: \"kubernetes.io/projected/097e9042-52e2-4a7e-b567-5b97f34242d6-kube-api-access-64hqq\") pod 
\"097e9042-52e2-4a7e-b567-5b97f34242d6\" (UID: \"097e9042-52e2-4a7e-b567-5b97f34242d6\") " Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.314720 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0993b0a3-f604-4447-bce2-01636b061230-utilities\") pod \"0993b0a3-f604-4447-bce2-01636b061230\" (UID: \"0993b0a3-f604-4447-bce2-01636b061230\") " Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.314743 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/097e9042-52e2-4a7e-b567-5b97f34242d6-utilities\") pod \"097e9042-52e2-4a7e-b567-5b97f34242d6\" (UID: \"097e9042-52e2-4a7e-b567-5b97f34242d6\") " Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.314760 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0993b0a3-f604-4447-bce2-01636b061230-catalog-content\") pod \"0993b0a3-f604-4447-bce2-01636b061230\" (UID: \"0993b0a3-f604-4447-bce2-01636b061230\") " Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.314799 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mzghc\" (UniqueName: \"kubernetes.io/projected/0993b0a3-f604-4447-bce2-01636b061230-kube-api-access-mzghc\") pod \"0993b0a3-f604-4447-bce2-01636b061230\" (UID: \"0993b0a3-f604-4447-bce2-01636b061230\") " Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.314826 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh8zf\" (UniqueName: \"kubernetes.io/projected/b0cbdcce-514f-4b72-8c8c-17029b7217a8-kube-api-access-vh8zf\") pod \"b0cbdcce-514f-4b72-8c8c-17029b7217a8\" (UID: \"b0cbdcce-514f-4b72-8c8c-17029b7217a8\") " Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.314848 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/097e9042-52e2-4a7e-b567-5b97f34242d6-catalog-content\") pod \"097e9042-52e2-4a7e-b567-5b97f34242d6\" (UID: \"097e9042-52e2-4a7e-b567-5b97f34242d6\") " Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.315091 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f429634-2787-4daa-a443-e4ab84f2e6b7-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.315118 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-glj8x\" (UniqueName: \"kubernetes.io/projected/2115631d-0f02-4cb4-bfee-e18dd87a0462-kube-api-access-glj8x\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.315132 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8sms\" (UniqueName: \"kubernetes.io/projected/8f429634-2787-4daa-a443-e4ab84f2e6b7-kube-api-access-f8sms\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.315146 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2115631d-0f02-4cb4-bfee-e18dd87a0462-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.315157 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f429634-2787-4daa-a443-e4ab84f2e6b7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.315169 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2115631d-0f02-4cb4-bfee-e18dd87a0462-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.315447 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/b0cbdcce-514f-4b72-8c8c-17029b7217a8-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b0cbdcce-514f-4b72-8c8c-17029b7217a8" (UID: "b0cbdcce-514f-4b72-8c8c-17029b7217a8"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.315567 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0993b0a3-f604-4447-bce2-01636b061230-utilities" (OuterVolumeSpecName: "utilities") pod "0993b0a3-f604-4447-bce2-01636b061230" (UID: "0993b0a3-f604-4447-bce2-01636b061230"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.315634 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/097e9042-52e2-4a7e-b567-5b97f34242d6-utilities" (OuterVolumeSpecName: "utilities") pod "097e9042-52e2-4a7e-b567-5b97f34242d6" (UID: "097e9042-52e2-4a7e-b567-5b97f34242d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.317805 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0cbdcce-514f-4b72-8c8c-17029b7217a8-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b0cbdcce-514f-4b72-8c8c-17029b7217a8" (UID: "b0cbdcce-514f-4b72-8c8c-17029b7217a8"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.317886 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/097e9042-52e2-4a7e-b567-5b97f34242d6-kube-api-access-64hqq" (OuterVolumeSpecName: "kube-api-access-64hqq") pod "097e9042-52e2-4a7e-b567-5b97f34242d6" (UID: "097e9042-52e2-4a7e-b567-5b97f34242d6"). InnerVolumeSpecName "kube-api-access-64hqq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.318397 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0cbdcce-514f-4b72-8c8c-17029b7217a8-kube-api-access-vh8zf" (OuterVolumeSpecName: "kube-api-access-vh8zf") pod "b0cbdcce-514f-4b72-8c8c-17029b7217a8" (UID: "b0cbdcce-514f-4b72-8c8c-17029b7217a8"). InnerVolumeSpecName "kube-api-access-vh8zf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.318843 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0993b0a3-f604-4447-bce2-01636b061230-kube-api-access-mzghc" (OuterVolumeSpecName: "kube-api-access-mzghc") pod "0993b0a3-f604-4447-bce2-01636b061230" (UID: "0993b0a3-f604-4447-bce2-01636b061230"). InnerVolumeSpecName "kube-api-access-mzghc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.356495 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0993b0a3-f604-4447-bce2-01636b061230-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0993b0a3-f604-4447-bce2-01636b061230" (UID: "0993b0a3-f604-4447-bce2-01636b061230"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.416554 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-64hqq\" (UniqueName: \"kubernetes.io/projected/097e9042-52e2-4a7e-b567-5b97f34242d6-kube-api-access-64hqq\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.416590 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0993b0a3-f604-4447-bce2-01636b061230-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.416600 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/097e9042-52e2-4a7e-b567-5b97f34242d6-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.416609 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0993b0a3-f604-4447-bce2-01636b061230-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.416618 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mzghc\" (UniqueName: \"kubernetes.io/projected/0993b0a3-f604-4447-bce2-01636b061230-kube-api-access-mzghc\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.416626 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh8zf\" (UniqueName: \"kubernetes.io/projected/b0cbdcce-514f-4b72-8c8c-17029b7217a8-kube-api-access-vh8zf\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.416635 4973 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b0cbdcce-514f-4b72-8c8c-17029b7217a8-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:55 crc 
kubenswrapper[4973]: I0320 13:28:55.416644 4973 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b0cbdcce-514f-4b72-8c8c-17029b7217a8-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.458878 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/097e9042-52e2-4a7e-b567-5b97f34242d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "097e9042-52e2-4a7e-b567-5b97f34242d6" (UID: "097e9042-52e2-4a7e-b567-5b97f34242d6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.515180 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kfft6"] Mar 20 13:28:55 crc kubenswrapper[4973]: I0320 13:28:55.517962 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/097e9042-52e2-4a7e-b567-5b97f34242d6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.028081 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6p25c" event={"ID":"097e9042-52e2-4a7e-b567-5b97f34242d6","Type":"ContainerDied","Data":"f75e40b7ab9e44acb7ae2421d86c232f4943ea4b43a0a500023ca0abec0dcf24"} Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.028383 4973 scope.go:117] "RemoveContainer" containerID="96aa2c4308d9871bde168cb5796161dfe73221250faa8c2c6a47bfb8cd322984" Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.028312 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6p25c" Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.029770 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" event={"ID":"2843ad35-cfc0-4922-8b96-cebb15694c99","Type":"ContainerStarted","Data":"bac1e650f3acfe5c677991a57c3b3a377e86a275ad38496e98672aadab3304af"} Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.029797 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" event={"ID":"2843ad35-cfc0-4922-8b96-cebb15694c99","Type":"ContainerStarted","Data":"41f6f4003fc7bc26e8a028fcce4942b9b5bd20c16882dfec3ca2d21f3660aeee"} Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.030920 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.032188 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-5jzd6" Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.032677 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-5jzd6" event={"ID":"2115631d-0f02-4cb4-bfee-e18dd87a0462","Type":"ContainerDied","Data":"703def46dbd3bf420b463ea2358eb7effd3cea44dbd17f56cb4a1cbcf1e74e6a"} Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.046690 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c7jw4" event={"ID":"0993b0a3-f604-4447-bce2-01636b061230","Type":"ContainerDied","Data":"b9cf01bd82dffd5ae33b8f9ca5ce597ada7b90d54b3d4a62d3126a51db83792a"} Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.046769 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c7jw4" Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.048594 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-plc2f" Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.048795 4973 scope.go:117] "RemoveContainer" containerID="3eb631ef9f8977b0cae6df746fb756b7f261e3f3c1b0ae8eb9666a2d6ba08e95" Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.048905 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.049319 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-l7crv" event={"ID":"b0cbdcce-514f-4b72-8c8c-17029b7217a8","Type":"ContainerDied","Data":"1c9e41c5ed9170e422bc0a5dcef516c793a78b237e475c0c2f24ca2222b43c6b"} Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.074421 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" podStartSLOduration=2.074404286 podStartE2EDuration="2.074404286s" podCreationTimestamp="2026-03-20 13:28:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:28:56.067689892 +0000 UTC m=+456.811359636" watchObservedRunningTime="2026-03-20 13:28:56.074404286 +0000 UTC m=+456.818074030" Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.075577 4973 scope.go:117] "RemoveContainer" containerID="4b456b688e7f33f27332351c95d84b4a52455670e07e3318d97c7b157d2e50b2" Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.084145 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-5jzd6"] Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 
13:28:56.101944 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-5jzd6"] Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.102183 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.121056 4973 scope.go:117] "RemoveContainer" containerID="0395673bbf4bceba4bfbfacd5ed9da9de3375db709c57b139dce03e8d570c3e0" Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.122665 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6p25c"] Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.126317 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6p25c"] Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.136801 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c7jw4"] Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.142617 4973 scope.go:117] "RemoveContainer" containerID="dc6526e06ba5337105e9f0338a9e0a154a735b4c08f5cd80eebb85de1efe3952" Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.144641 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c7jw4"] Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.153509 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-plc2f"] Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.160821 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-plc2f"] Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.161329 4973 scope.go:117] "RemoveContainer" containerID="0acdc2d07d6b35d8a2476747b85cd85c583270203eb80a7f9d7a3c7f78fa44e5" Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.164971 4973 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l7crv"] Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.171822 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-l7crv"] Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.176648 4973 scope.go:117] "RemoveContainer" containerID="e25d1c90ec9a04104b2de18694da6c5b694a1131ea348f7eebf9bc63d2b6795f" Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.194049 4973 scope.go:117] "RemoveContainer" containerID="d5edb5515ca20a4ddc54247ad2599cece16abc6a5966dde3b7d31f262aa665cc" Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.234838 4973 scope.go:117] "RemoveContainer" containerID="40ff8a0068a1f16d6b0aadcc93ee6ca4ff61cbe773fe9cca174fa5284a1f3c41" Mar 20 13:28:56 crc kubenswrapper[4973]: I0320 13:28:56.249977 4973 scope.go:117] "RemoveContainer" containerID="be1af85491e1da9d562848f760f2f64ec735570af9790acaa9d5ea8b6763d507" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.410775 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gmpxj"] Mar 20 13:28:57 crc kubenswrapper[4973]: E0320 13:28:57.411014 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cbdcce-514f-4b72-8c8c-17029b7217a8" containerName="marketplace-operator" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411029 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cbdcce-514f-4b72-8c8c-17029b7217a8" containerName="marketplace-operator" Mar 20 13:28:57 crc kubenswrapper[4973]: E0320 13:28:57.411043 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2115631d-0f02-4cb4-bfee-e18dd87a0462" containerName="extract-content" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411050 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="2115631d-0f02-4cb4-bfee-e18dd87a0462" containerName="extract-content" Mar 20 13:28:57 crc kubenswrapper[4973]: 
E0320 13:28:57.411062 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="097e9042-52e2-4a7e-b567-5b97f34242d6" containerName="registry-server" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411071 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="097e9042-52e2-4a7e-b567-5b97f34242d6" containerName="registry-server" Mar 20 13:28:57 crc kubenswrapper[4973]: E0320 13:28:57.411082 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0993b0a3-f604-4447-bce2-01636b061230" containerName="registry-server" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411089 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0993b0a3-f604-4447-bce2-01636b061230" containerName="registry-server" Mar 20 13:28:57 crc kubenswrapper[4973]: E0320 13:28:57.411099 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0993b0a3-f604-4447-bce2-01636b061230" containerName="extract-content" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411107 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0993b0a3-f604-4447-bce2-01636b061230" containerName="extract-content" Mar 20 13:28:57 crc kubenswrapper[4973]: E0320 13:28:57.411121 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0993b0a3-f604-4447-bce2-01636b061230" containerName="extract-utilities" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411129 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0993b0a3-f604-4447-bce2-01636b061230" containerName="extract-utilities" Mar 20 13:28:57 crc kubenswrapper[4973]: E0320 13:28:57.411138 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="097e9042-52e2-4a7e-b567-5b97f34242d6" containerName="extract-content" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411144 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="097e9042-52e2-4a7e-b567-5b97f34242d6" containerName="extract-content" Mar 20 13:28:57 crc kubenswrapper[4973]: E0320 
13:28:57.411159 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2115631d-0f02-4cb4-bfee-e18dd87a0462" containerName="registry-server" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411165 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="2115631d-0f02-4cb4-bfee-e18dd87a0462" containerName="registry-server" Mar 20 13:28:57 crc kubenswrapper[4973]: E0320 13:28:57.411178 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="097e9042-52e2-4a7e-b567-5b97f34242d6" containerName="extract-utilities" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411185 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="097e9042-52e2-4a7e-b567-5b97f34242d6" containerName="extract-utilities" Mar 20 13:28:57 crc kubenswrapper[4973]: E0320 13:28:57.411195 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0cbdcce-514f-4b72-8c8c-17029b7217a8" containerName="marketplace-operator" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411202 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0cbdcce-514f-4b72-8c8c-17029b7217a8" containerName="marketplace-operator" Mar 20 13:28:57 crc kubenswrapper[4973]: E0320 13:28:57.411212 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f429634-2787-4daa-a443-e4ab84f2e6b7" containerName="extract-content" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411220 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f429634-2787-4daa-a443-e4ab84f2e6b7" containerName="extract-content" Mar 20 13:28:57 crc kubenswrapper[4973]: E0320 13:28:57.411232 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f429634-2787-4daa-a443-e4ab84f2e6b7" containerName="extract-utilities" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411239 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f429634-2787-4daa-a443-e4ab84f2e6b7" containerName="extract-utilities" Mar 20 13:28:57 crc kubenswrapper[4973]: 
E0320 13:28:57.411250 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2115631d-0f02-4cb4-bfee-e18dd87a0462" containerName="extract-utilities" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411257 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="2115631d-0f02-4cb4-bfee-e18dd87a0462" containerName="extract-utilities" Mar 20 13:28:57 crc kubenswrapper[4973]: E0320 13:28:57.411270 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f429634-2787-4daa-a443-e4ab84f2e6b7" containerName="registry-server" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411277 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f429634-2787-4daa-a443-e4ab84f2e6b7" containerName="registry-server" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411398 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cbdcce-514f-4b72-8c8c-17029b7217a8" containerName="marketplace-operator" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411415 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="2115631d-0f02-4cb4-bfee-e18dd87a0462" containerName="registry-server" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411426 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0cbdcce-514f-4b72-8c8c-17029b7217a8" containerName="marketplace-operator" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411438 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="097e9042-52e2-4a7e-b567-5b97f34242d6" containerName="registry-server" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411450 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="0993b0a3-f604-4447-bce2-01636b061230" containerName="registry-server" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.411460 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f429634-2787-4daa-a443-e4ab84f2e6b7" containerName="registry-server" Mar 20 
13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.412488 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gmpxj" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.416840 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.421616 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gmpxj"] Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.542402 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23fa52a0-b5df-4056-bd9e-4f42e8e9893c-utilities\") pod \"certified-operators-gmpxj\" (UID: \"23fa52a0-b5df-4056-bd9e-4f42e8e9893c\") " pod="openshift-marketplace/certified-operators-gmpxj" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.542444 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23fa52a0-b5df-4056-bd9e-4f42e8e9893c-catalog-content\") pod \"certified-operators-gmpxj\" (UID: \"23fa52a0-b5df-4056-bd9e-4f42e8e9893c\") " pod="openshift-marketplace/certified-operators-gmpxj" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.542489 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2prcc\" (UniqueName: \"kubernetes.io/projected/23fa52a0-b5df-4056-bd9e-4f42e8e9893c-kube-api-access-2prcc\") pod \"certified-operators-gmpxj\" (UID: \"23fa52a0-b5df-4056-bd9e-4f42e8e9893c\") " pod="openshift-marketplace/certified-operators-gmpxj" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.644202 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/23fa52a0-b5df-4056-bd9e-4f42e8e9893c-catalog-content\") pod \"certified-operators-gmpxj\" (UID: \"23fa52a0-b5df-4056-bd9e-4f42e8e9893c\") " pod="openshift-marketplace/certified-operators-gmpxj" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.644249 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23fa52a0-b5df-4056-bd9e-4f42e8e9893c-utilities\") pod \"certified-operators-gmpxj\" (UID: \"23fa52a0-b5df-4056-bd9e-4f42e8e9893c\") " pod="openshift-marketplace/certified-operators-gmpxj" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.644282 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2prcc\" (UniqueName: \"kubernetes.io/projected/23fa52a0-b5df-4056-bd9e-4f42e8e9893c-kube-api-access-2prcc\") pod \"certified-operators-gmpxj\" (UID: \"23fa52a0-b5df-4056-bd9e-4f42e8e9893c\") " pod="openshift-marketplace/certified-operators-gmpxj" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.645009 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23fa52a0-b5df-4056-bd9e-4f42e8e9893c-catalog-content\") pod \"certified-operators-gmpxj\" (UID: \"23fa52a0-b5df-4056-bd9e-4f42e8e9893c\") " pod="openshift-marketplace/certified-operators-gmpxj" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.645125 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23fa52a0-b5df-4056-bd9e-4f42e8e9893c-utilities\") pod \"certified-operators-gmpxj\" (UID: \"23fa52a0-b5df-4056-bd9e-4f42e8e9893c\") " pod="openshift-marketplace/certified-operators-gmpxj" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.668514 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2prcc\" (UniqueName: 
\"kubernetes.io/projected/23fa52a0-b5df-4056-bd9e-4f42e8e9893c-kube-api-access-2prcc\") pod \"certified-operators-gmpxj\" (UID: \"23fa52a0-b5df-4056-bd9e-4f42e8e9893c\") " pod="openshift-marketplace/certified-operators-gmpxj" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.786110 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gmpxj" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.960000 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="097e9042-52e2-4a7e-b567-5b97f34242d6" path="/var/lib/kubelet/pods/097e9042-52e2-4a7e-b567-5b97f34242d6/volumes" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.961132 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0993b0a3-f604-4447-bce2-01636b061230" path="/var/lib/kubelet/pods/0993b0a3-f604-4447-bce2-01636b061230/volumes" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.961761 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2115631d-0f02-4cb4-bfee-e18dd87a0462" path="/var/lib/kubelet/pods/2115631d-0f02-4cb4-bfee-e18dd87a0462/volumes" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.962862 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f429634-2787-4daa-a443-e4ab84f2e6b7" path="/var/lib/kubelet/pods/8f429634-2787-4daa-a443-e4ab84f2e6b7/volumes" Mar 20 13:28:57 crc kubenswrapper[4973]: I0320 13:28:57.963473 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0cbdcce-514f-4b72-8c8c-17029b7217a8" path="/var/lib/kubelet/pods/b0cbdcce-514f-4b72-8c8c-17029b7217a8/volumes" Mar 20 13:28:58 crc kubenswrapper[4973]: I0320 13:28:58.222972 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gmpxj"] Mar 20 13:28:58 crc kubenswrapper[4973]: W0320 13:28:58.229008 4973 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23fa52a0_b5df_4056_bd9e_4f42e8e9893c.slice/crio-1268d233be1df167395850251a3e75516770c833b565925cd1f518def416dcae WatchSource:0}: Error finding container 1268d233be1df167395850251a3e75516770c833b565925cd1f518def416dcae: Status 404 returned error can't find the container with id 1268d233be1df167395850251a3e75516770c833b565925cd1f518def416dcae Mar 20 13:28:58 crc kubenswrapper[4973]: I0320 13:28:58.417884 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h5hz4"] Mar 20 13:28:58 crc kubenswrapper[4973]: I0320 13:28:58.419506 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h5hz4" Mar 20 13:28:58 crc kubenswrapper[4973]: I0320 13:28:58.421672 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 13:28:58 crc kubenswrapper[4973]: I0320 13:28:58.428391 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h5hz4"] Mar 20 13:28:58 crc kubenswrapper[4973]: I0320 13:28:58.559275 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b450ebc6-3181-4e0d-b546-b10ac89e0481-utilities\") pod \"redhat-operators-h5hz4\" (UID: \"b450ebc6-3181-4e0d-b546-b10ac89e0481\") " pod="openshift-marketplace/redhat-operators-h5hz4" Mar 20 13:28:58 crc kubenswrapper[4973]: I0320 13:28:58.559375 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b450ebc6-3181-4e0d-b546-b10ac89e0481-catalog-content\") pod \"redhat-operators-h5hz4\" (UID: \"b450ebc6-3181-4e0d-b546-b10ac89e0481\") " pod="openshift-marketplace/redhat-operators-h5hz4" Mar 20 13:28:58 crc kubenswrapper[4973]: I0320 13:28:58.559665 
4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bvwk\" (UniqueName: \"kubernetes.io/projected/b450ebc6-3181-4e0d-b546-b10ac89e0481-kube-api-access-9bvwk\") pod \"redhat-operators-h5hz4\" (UID: \"b450ebc6-3181-4e0d-b546-b10ac89e0481\") " pod="openshift-marketplace/redhat-operators-h5hz4" Mar 20 13:28:58 crc kubenswrapper[4973]: I0320 13:28:58.661467 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b450ebc6-3181-4e0d-b546-b10ac89e0481-utilities\") pod \"redhat-operators-h5hz4\" (UID: \"b450ebc6-3181-4e0d-b546-b10ac89e0481\") " pod="openshift-marketplace/redhat-operators-h5hz4" Mar 20 13:28:58 crc kubenswrapper[4973]: I0320 13:28:58.661594 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b450ebc6-3181-4e0d-b546-b10ac89e0481-catalog-content\") pod \"redhat-operators-h5hz4\" (UID: \"b450ebc6-3181-4e0d-b546-b10ac89e0481\") " pod="openshift-marketplace/redhat-operators-h5hz4" Mar 20 13:28:58 crc kubenswrapper[4973]: I0320 13:28:58.661654 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bvwk\" (UniqueName: \"kubernetes.io/projected/b450ebc6-3181-4e0d-b546-b10ac89e0481-kube-api-access-9bvwk\") pod \"redhat-operators-h5hz4\" (UID: \"b450ebc6-3181-4e0d-b546-b10ac89e0481\") " pod="openshift-marketplace/redhat-operators-h5hz4" Mar 20 13:28:58 crc kubenswrapper[4973]: I0320 13:28:58.662495 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b450ebc6-3181-4e0d-b546-b10ac89e0481-catalog-content\") pod \"redhat-operators-h5hz4\" (UID: \"b450ebc6-3181-4e0d-b546-b10ac89e0481\") " pod="openshift-marketplace/redhat-operators-h5hz4" Mar 20 13:28:58 crc kubenswrapper[4973]: I0320 13:28:58.662549 4973 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b450ebc6-3181-4e0d-b546-b10ac89e0481-utilities\") pod \"redhat-operators-h5hz4\" (UID: \"b450ebc6-3181-4e0d-b546-b10ac89e0481\") " pod="openshift-marketplace/redhat-operators-h5hz4" Mar 20 13:28:58 crc kubenswrapper[4973]: I0320 13:28:58.693648 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bvwk\" (UniqueName: \"kubernetes.io/projected/b450ebc6-3181-4e0d-b546-b10ac89e0481-kube-api-access-9bvwk\") pod \"redhat-operators-h5hz4\" (UID: \"b450ebc6-3181-4e0d-b546-b10ac89e0481\") " pod="openshift-marketplace/redhat-operators-h5hz4" Mar 20 13:28:58 crc kubenswrapper[4973]: I0320 13:28:58.760227 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h5hz4" Mar 20 13:28:59 crc kubenswrapper[4973]: I0320 13:28:59.068155 4973 generic.go:334] "Generic (PLEG): container finished" podID="23fa52a0-b5df-4056-bd9e-4f42e8e9893c" containerID="6a9655491cb133886fe94e4ee47ee5de6400fcd5d3e9c098c3effbc46501c942" exitCode=0 Mar 20 13:28:59 crc kubenswrapper[4973]: I0320 13:28:59.068254 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gmpxj" event={"ID":"23fa52a0-b5df-4056-bd9e-4f42e8e9893c","Type":"ContainerDied","Data":"6a9655491cb133886fe94e4ee47ee5de6400fcd5d3e9c098c3effbc46501c942"} Mar 20 13:28:59 crc kubenswrapper[4973]: I0320 13:28:59.068587 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gmpxj" event={"ID":"23fa52a0-b5df-4056-bd9e-4f42e8e9893c","Type":"ContainerStarted","Data":"1268d233be1df167395850251a3e75516770c833b565925cd1f518def416dcae"} Mar 20 13:28:59 crc kubenswrapper[4973]: I0320 13:28:59.814472 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fpwkr"] Mar 20 13:28:59 crc kubenswrapper[4973]: 
I0320 13:28:59.815721 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fpwkr" Mar 20 13:28:59 crc kubenswrapper[4973]: I0320 13:28:59.817619 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 13:28:59 crc kubenswrapper[4973]: I0320 13:28:59.836799 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fpwkr"] Mar 20 13:28:59 crc kubenswrapper[4973]: I0320 13:28:59.859530 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h5hz4"] Mar 20 13:28:59 crc kubenswrapper[4973]: W0320 13:28:59.862330 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb450ebc6_3181_4e0d_b546_b10ac89e0481.slice/crio-56eb13129ae083c746acb047e39c69477065459e81cc9d39306ee6072ea0290c WatchSource:0}: Error finding container 56eb13129ae083c746acb047e39c69477065459e81cc9d39306ee6072ea0290c: Status 404 returned error can't find the container with id 56eb13129ae083c746acb047e39c69477065459e81cc9d39306ee6072ea0290c Mar 20 13:28:59 crc kubenswrapper[4973]: I0320 13:28:59.978564 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57731a76-c496-43ff-afea-a5685864a2f3-catalog-content\") pod \"community-operators-fpwkr\" (UID: \"57731a76-c496-43ff-afea-a5685864a2f3\") " pod="openshift-marketplace/community-operators-fpwkr" Mar 20 13:28:59 crc kubenswrapper[4973]: I0320 13:28:59.979022 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcnj5\" (UniqueName: \"kubernetes.io/projected/57731a76-c496-43ff-afea-a5685864a2f3-kube-api-access-jcnj5\") pod \"community-operators-fpwkr\" (UID: 
\"57731a76-c496-43ff-afea-a5685864a2f3\") " pod="openshift-marketplace/community-operators-fpwkr" Mar 20 13:28:59 crc kubenswrapper[4973]: I0320 13:28:59.979123 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57731a76-c496-43ff-afea-a5685864a2f3-utilities\") pod \"community-operators-fpwkr\" (UID: \"57731a76-c496-43ff-afea-a5685864a2f3\") " pod="openshift-marketplace/community-operators-fpwkr" Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 13:29:00.076724 4973 generic.go:334] "Generic (PLEG): container finished" podID="23fa52a0-b5df-4056-bd9e-4f42e8e9893c" containerID="9c159be98e56a648697890f346ca405a98b5515e7f47bdbee25c1eb0fdc5d592" exitCode=0 Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 13:29:00.076805 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gmpxj" event={"ID":"23fa52a0-b5df-4056-bd9e-4f42e8e9893c","Type":"ContainerDied","Data":"9c159be98e56a648697890f346ca405a98b5515e7f47bdbee25c1eb0fdc5d592"} Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 13:29:00.079940 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57731a76-c496-43ff-afea-a5685864a2f3-catalog-content\") pod \"community-operators-fpwkr\" (UID: \"57731a76-c496-43ff-afea-a5685864a2f3\") " pod="openshift-marketplace/community-operators-fpwkr" Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 13:29:00.080009 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcnj5\" (UniqueName: \"kubernetes.io/projected/57731a76-c496-43ff-afea-a5685864a2f3-kube-api-access-jcnj5\") pod \"community-operators-fpwkr\" (UID: \"57731a76-c496-43ff-afea-a5685864a2f3\") " pod="openshift-marketplace/community-operators-fpwkr" Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 13:29:00.080239 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57731a76-c496-43ff-afea-a5685864a2f3-utilities\") pod \"community-operators-fpwkr\" (UID: \"57731a76-c496-43ff-afea-a5685864a2f3\") " pod="openshift-marketplace/community-operators-fpwkr" Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 13:29:00.080549 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57731a76-c496-43ff-afea-a5685864a2f3-catalog-content\") pod \"community-operators-fpwkr\" (UID: \"57731a76-c496-43ff-afea-a5685864a2f3\") " pod="openshift-marketplace/community-operators-fpwkr" Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 13:29:00.080779 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57731a76-c496-43ff-afea-a5685864a2f3-utilities\") pod \"community-operators-fpwkr\" (UID: \"57731a76-c496-43ff-afea-a5685864a2f3\") " pod="openshift-marketplace/community-operators-fpwkr" Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 13:29:00.084130 4973 generic.go:334] "Generic (PLEG): container finished" podID="b450ebc6-3181-4e0d-b546-b10ac89e0481" containerID="2454db895cb39622b27f1324d8e9b0616417dfc310dbd5ebd942e9c7971c2b77" exitCode=0 Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 13:29:00.084205 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5hz4" event={"ID":"b450ebc6-3181-4e0d-b546-b10ac89e0481","Type":"ContainerDied","Data":"2454db895cb39622b27f1324d8e9b0616417dfc310dbd5ebd942e9c7971c2b77"} Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 13:29:00.084243 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5hz4" event={"ID":"b450ebc6-3181-4e0d-b546-b10ac89e0481","Type":"ContainerStarted","Data":"56eb13129ae083c746acb047e39c69477065459e81cc9d39306ee6072ea0290c"} Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 
13:29:00.104290 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcnj5\" (UniqueName: \"kubernetes.io/projected/57731a76-c496-43ff-afea-a5685864a2f3-kube-api-access-jcnj5\") pod \"community-operators-fpwkr\" (UID: \"57731a76-c496-43ff-afea-a5685864a2f3\") " pod="openshift-marketplace/community-operators-fpwkr" Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 13:29:00.133960 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fpwkr" Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 13:29:00.534213 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fpwkr"] Mar 20 13:29:00 crc kubenswrapper[4973]: W0320 13:29:00.538063 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57731a76_c496_43ff_afea_a5685864a2f3.slice/crio-2c648a8b67b3640d4578bada620cb73829ee2c52ebebc0c33c778bb67857ab5a WatchSource:0}: Error finding container 2c648a8b67b3640d4578bada620cb73829ee2c52ebebc0c33c778bb67857ab5a: Status 404 returned error can't find the container with id 2c648a8b67b3640d4578bada620cb73829ee2c52ebebc0c33c778bb67857ab5a Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 13:29:00.808272 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-6hwhw"] Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 13:29:00.809457 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6hwhw" Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 13:29:00.811021 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 13:29:00.827135 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6hwhw"] Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 13:29:00.992594 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdmg2\" (UniqueName: \"kubernetes.io/projected/31ec1f01-54bd-417e-867e-91484e966352-kube-api-access-jdmg2\") pod \"redhat-marketplace-6hwhw\" (UID: \"31ec1f01-54bd-417e-867e-91484e966352\") " pod="openshift-marketplace/redhat-marketplace-6hwhw" Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 13:29:00.992671 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31ec1f01-54bd-417e-867e-91484e966352-utilities\") pod \"redhat-marketplace-6hwhw\" (UID: \"31ec1f01-54bd-417e-867e-91484e966352\") " pod="openshift-marketplace/redhat-marketplace-6hwhw" Mar 20 13:29:00 crc kubenswrapper[4973]: I0320 13:29:00.992882 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31ec1f01-54bd-417e-867e-91484e966352-catalog-content\") pod \"redhat-marketplace-6hwhw\" (UID: \"31ec1f01-54bd-417e-867e-91484e966352\") " pod="openshift-marketplace/redhat-marketplace-6hwhw" Mar 20 13:29:01 crc kubenswrapper[4973]: I0320 13:29:01.090654 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gmpxj" 
event={"ID":"23fa52a0-b5df-4056-bd9e-4f42e8e9893c","Type":"ContainerStarted","Data":"1d42cde7d7c77acf9457af58faffda33fadd57f8e7ffbc97bc752cffe7eec0cb"} Mar 20 13:29:01 crc kubenswrapper[4973]: I0320 13:29:01.092443 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5hz4" event={"ID":"b450ebc6-3181-4e0d-b546-b10ac89e0481","Type":"ContainerStarted","Data":"4e1f14137a44930ccdd063996282e8566301cd5d235979f41f714860ad133bcc"} Mar 20 13:29:01 crc kubenswrapper[4973]: I0320 13:29:01.093581 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31ec1f01-54bd-417e-867e-91484e966352-catalog-content\") pod \"redhat-marketplace-6hwhw\" (UID: \"31ec1f01-54bd-417e-867e-91484e966352\") " pod="openshift-marketplace/redhat-marketplace-6hwhw" Mar 20 13:29:01 crc kubenswrapper[4973]: I0320 13:29:01.093646 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdmg2\" (UniqueName: \"kubernetes.io/projected/31ec1f01-54bd-417e-867e-91484e966352-kube-api-access-jdmg2\") pod \"redhat-marketplace-6hwhw\" (UID: \"31ec1f01-54bd-417e-867e-91484e966352\") " pod="openshift-marketplace/redhat-marketplace-6hwhw" Mar 20 13:29:01 crc kubenswrapper[4973]: I0320 13:29:01.093677 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31ec1f01-54bd-417e-867e-91484e966352-utilities\") pod \"redhat-marketplace-6hwhw\" (UID: \"31ec1f01-54bd-417e-867e-91484e966352\") " pod="openshift-marketplace/redhat-marketplace-6hwhw" Mar 20 13:29:01 crc kubenswrapper[4973]: I0320 13:29:01.093987 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/31ec1f01-54bd-417e-867e-91484e966352-catalog-content\") pod \"redhat-marketplace-6hwhw\" (UID: \"31ec1f01-54bd-417e-867e-91484e966352\") " 
pod="openshift-marketplace/redhat-marketplace-6hwhw" Mar 20 13:29:01 crc kubenswrapper[4973]: I0320 13:29:01.094006 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/31ec1f01-54bd-417e-867e-91484e966352-utilities\") pod \"redhat-marketplace-6hwhw\" (UID: \"31ec1f01-54bd-417e-867e-91484e966352\") " pod="openshift-marketplace/redhat-marketplace-6hwhw" Mar 20 13:29:01 crc kubenswrapper[4973]: I0320 13:29:01.094467 4973 generic.go:334] "Generic (PLEG): container finished" podID="57731a76-c496-43ff-afea-a5685864a2f3" containerID="1b06a26b301cb74db26ae35c47ef7b1ced0d24d482912e5880e65e4376747bac" exitCode=0 Mar 20 13:29:01 crc kubenswrapper[4973]: I0320 13:29:01.094521 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fpwkr" event={"ID":"57731a76-c496-43ff-afea-a5685864a2f3","Type":"ContainerDied","Data":"1b06a26b301cb74db26ae35c47ef7b1ced0d24d482912e5880e65e4376747bac"} Mar 20 13:29:01 crc kubenswrapper[4973]: I0320 13:29:01.094555 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fpwkr" event={"ID":"57731a76-c496-43ff-afea-a5685864a2f3","Type":"ContainerStarted","Data":"2c648a8b67b3640d4578bada620cb73829ee2c52ebebc0c33c778bb67857ab5a"} Mar 20 13:29:01 crc kubenswrapper[4973]: I0320 13:29:01.113508 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gmpxj" podStartSLOduration=2.591279003 podStartE2EDuration="4.113490354s" podCreationTimestamp="2026-03-20 13:28:57 +0000 UTC" firstStartedPulling="2026-03-20 13:28:59.070761063 +0000 UTC m=+459.814430807" lastFinishedPulling="2026-03-20 13:29:00.592972404 +0000 UTC m=+461.336642158" observedRunningTime="2026-03-20 13:29:01.111183491 +0000 UTC m=+461.854853235" watchObservedRunningTime="2026-03-20 13:29:01.113490354 +0000 UTC m=+461.857160098" Mar 20 13:29:01 crc 
kubenswrapper[4973]: I0320 13:29:01.125732 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdmg2\" (UniqueName: \"kubernetes.io/projected/31ec1f01-54bd-417e-867e-91484e966352-kube-api-access-jdmg2\") pod \"redhat-marketplace-6hwhw\" (UID: \"31ec1f01-54bd-417e-867e-91484e966352\") " pod="openshift-marketplace/redhat-marketplace-6hwhw" Mar 20 13:29:01 crc kubenswrapper[4973]: I0320 13:29:01.171071 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-6hwhw" Mar 20 13:29:01 crc kubenswrapper[4973]: I0320 13:29:01.599032 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-6hwhw"] Mar 20 13:29:01 crc kubenswrapper[4973]: W0320 13:29:01.610444 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31ec1f01_54bd_417e_867e_91484e966352.slice/crio-a94f0cbad2c55363aa8a76d064145ed5a120294079963bb876dd1fd691206245 WatchSource:0}: Error finding container a94f0cbad2c55363aa8a76d064145ed5a120294079963bb876dd1fd691206245: Status 404 returned error can't find the container with id a94f0cbad2c55363aa8a76d064145ed5a120294079963bb876dd1fd691206245 Mar 20 13:29:02 crc kubenswrapper[4973]: I0320 13:29:02.105003 4973 generic.go:334] "Generic (PLEG): container finished" podID="b450ebc6-3181-4e0d-b546-b10ac89e0481" containerID="4e1f14137a44930ccdd063996282e8566301cd5d235979f41f714860ad133bcc" exitCode=0 Mar 20 13:29:02 crc kubenswrapper[4973]: I0320 13:29:02.105077 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5hz4" event={"ID":"b450ebc6-3181-4e0d-b546-b10ac89e0481","Type":"ContainerDied","Data":"4e1f14137a44930ccdd063996282e8566301cd5d235979f41f714860ad133bcc"} Mar 20 13:29:02 crc kubenswrapper[4973]: I0320 13:29:02.108694 4973 generic.go:334] "Generic (PLEG): container finished" 
podID="31ec1f01-54bd-417e-867e-91484e966352" containerID="bc3b359c868dcdbd8c88bca4b762fe87995b88a3f9e6b5033892710b8bb5400d" exitCode=0 Mar 20 13:29:02 crc kubenswrapper[4973]: I0320 13:29:02.108866 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6hwhw" event={"ID":"31ec1f01-54bd-417e-867e-91484e966352","Type":"ContainerDied","Data":"bc3b359c868dcdbd8c88bca4b762fe87995b88a3f9e6b5033892710b8bb5400d"} Mar 20 13:29:02 crc kubenswrapper[4973]: I0320 13:29:02.109178 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6hwhw" event={"ID":"31ec1f01-54bd-417e-867e-91484e966352","Type":"ContainerStarted","Data":"a94f0cbad2c55363aa8a76d064145ed5a120294079963bb876dd1fd691206245"} Mar 20 13:29:03 crc kubenswrapper[4973]: I0320 13:29:03.115843 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5hz4" event={"ID":"b450ebc6-3181-4e0d-b546-b10ac89e0481","Type":"ContainerStarted","Data":"f57e0f4515afd97e275b7d92199045d344693ab84816b0462c87b831b7e9bda1"} Mar 20 13:29:03 crc kubenswrapper[4973]: I0320 13:29:03.120166 4973 generic.go:334] "Generic (PLEG): container finished" podID="31ec1f01-54bd-417e-867e-91484e966352" containerID="8ba1dd4ba02f2bd56fd7edc74327f577b43348ecf5749f584818f16f249d5f09" exitCode=0 Mar 20 13:29:03 crc kubenswrapper[4973]: I0320 13:29:03.120385 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6hwhw" event={"ID":"31ec1f01-54bd-417e-867e-91484e966352","Type":"ContainerDied","Data":"8ba1dd4ba02f2bd56fd7edc74327f577b43348ecf5749f584818f16f249d5f09"} Mar 20 13:29:03 crc kubenswrapper[4973]: I0320 13:29:03.133149 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h5hz4" podStartSLOduration=2.57668913 podStartE2EDuration="5.133133994s" podCreationTimestamp="2026-03-20 13:28:58 +0000 UTC" 
firstStartedPulling="2026-03-20 13:29:00.085711568 +0000 UTC m=+460.829381342" lastFinishedPulling="2026-03-20 13:29:02.642156462 +0000 UTC m=+463.385826206" observedRunningTime="2026-03-20 13:29:03.133008779 +0000 UTC m=+463.876678533" watchObservedRunningTime="2026-03-20 13:29:03.133133994 +0000 UTC m=+463.876803738" Mar 20 13:29:06 crc kubenswrapper[4973]: I0320 13:29:06.140987 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-6hwhw" event={"ID":"31ec1f01-54bd-417e-867e-91484e966352","Type":"ContainerStarted","Data":"483d9c155b347cf885637d56cbba7d4bd349a5c6c18f0b5ce02f890c0f4b4512"} Mar 20 13:29:06 crc kubenswrapper[4973]: I0320 13:29:06.142868 4973 generic.go:334] "Generic (PLEG): container finished" podID="57731a76-c496-43ff-afea-a5685864a2f3" containerID="7b6ee33570e62646d6971da7adbc04568a53c423438abdcc9b888b52a0103ec8" exitCode=0 Mar 20 13:29:06 crc kubenswrapper[4973]: I0320 13:29:06.142913 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fpwkr" event={"ID":"57731a76-c496-43ff-afea-a5685864a2f3","Type":"ContainerDied","Data":"7b6ee33570e62646d6971da7adbc04568a53c423438abdcc9b888b52a0103ec8"} Mar 20 13:29:06 crc kubenswrapper[4973]: I0320 13:29:06.169738 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-6hwhw" podStartSLOduration=2.861190718 podStartE2EDuration="6.169719221s" podCreationTimestamp="2026-03-20 13:29:00 +0000 UTC" firstStartedPulling="2026-03-20 13:29:02.110617201 +0000 UTC m=+462.854286945" lastFinishedPulling="2026-03-20 13:29:05.419145704 +0000 UTC m=+466.162815448" observedRunningTime="2026-03-20 13:29:06.167322046 +0000 UTC m=+466.910991800" watchObservedRunningTime="2026-03-20 13:29:06.169719221 +0000 UTC m=+466.913388975" Mar 20 13:29:07 crc kubenswrapper[4973]: I0320 13:29:07.149702 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-fpwkr" event={"ID":"57731a76-c496-43ff-afea-a5685864a2f3","Type":"ContainerStarted","Data":"07df23c3b96eff2dda412f9198cc400c4978dd51d11200b75b8c76345a68a2c0"} Mar 20 13:29:07 crc kubenswrapper[4973]: I0320 13:29:07.172557 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fpwkr" podStartSLOduration=2.587886211 podStartE2EDuration="8.172535235s" podCreationTimestamp="2026-03-20 13:28:59 +0000 UTC" firstStartedPulling="2026-03-20 13:29:01.095758528 +0000 UTC m=+461.839428272" lastFinishedPulling="2026-03-20 13:29:06.680407552 +0000 UTC m=+467.424077296" observedRunningTime="2026-03-20 13:29:07.166621492 +0000 UTC m=+467.910291236" watchObservedRunningTime="2026-03-20 13:29:07.172535235 +0000 UTC m=+467.916204979" Mar 20 13:29:07 crc kubenswrapper[4973]: I0320 13:29:07.787198 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gmpxj" Mar 20 13:29:07 crc kubenswrapper[4973]: I0320 13:29:07.787793 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gmpxj" Mar 20 13:29:07 crc kubenswrapper[4973]: I0320 13:29:07.839245 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gmpxj" Mar 20 13:29:08 crc kubenswrapper[4973]: I0320 13:29:08.198811 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gmpxj" Mar 20 13:29:08 crc kubenswrapper[4973]: I0320 13:29:08.760904 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h5hz4" Mar 20 13:29:08 crc kubenswrapper[4973]: I0320 13:29:08.760971 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h5hz4" Mar 20 13:29:09 crc 
kubenswrapper[4973]: I0320 13:29:09.795908 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h5hz4" podUID="b450ebc6-3181-4e0d-b546-b10ac89e0481" containerName="registry-server" probeResult="failure" output=< Mar 20 13:29:09 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 13:29:09 crc kubenswrapper[4973]: > Mar 20 13:29:10 crc kubenswrapper[4973]: I0320 13:29:10.134822 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fpwkr" Mar 20 13:29:10 crc kubenswrapper[4973]: I0320 13:29:10.134892 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fpwkr" Mar 20 13:29:10 crc kubenswrapper[4973]: I0320 13:29:10.193653 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fpwkr" Mar 20 13:29:11 crc kubenswrapper[4973]: I0320 13:29:11.172257 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-6hwhw" Mar 20 13:29:11 crc kubenswrapper[4973]: I0320 13:29:11.175427 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-6hwhw" Mar 20 13:29:11 crc kubenswrapper[4973]: I0320 13:29:11.215289 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-6hwhw" Mar 20 13:29:12 crc kubenswrapper[4973]: I0320 13:29:12.213291 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-6hwhw" Mar 20 13:29:13 crc kubenswrapper[4973]: I0320 13:29:13.320873 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:29:13 crc kubenswrapper[4973]: I0320 13:29:13.320942 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:29:18 crc kubenswrapper[4973]: I0320 13:29:18.803814 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h5hz4"
Mar 20 13:29:18 crc kubenswrapper[4973]: I0320 13:29:18.848586 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h5hz4"
Mar 20 13:29:20 crc kubenswrapper[4973]: I0320 13:29:20.174691 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fpwkr"
Mar 20 13:29:43 crc kubenswrapper[4973]: I0320 13:29:43.320677 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:29:43 crc kubenswrapper[4973]: I0320 13:29:43.321556 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:29:43 crc kubenswrapper[4973]: I0320 13:29:43.321685 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx"
Mar 20 13:29:43 crc kubenswrapper[4973]: I0320 13:29:43.323575 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"52d5fd2368a231dd70e0a2d1cd4e97da6e1bd0ca50431edd3ab2ddcc1bd88dec"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 13:29:43 crc kubenswrapper[4973]: I0320 13:29:43.323686 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" containerID="cri-o://52d5fd2368a231dd70e0a2d1cd4e97da6e1bd0ca50431edd3ab2ddcc1bd88dec" gracePeriod=600
Mar 20 13:29:44 crc kubenswrapper[4973]: I0320 13:29:44.363712 4973 generic.go:334] "Generic (PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="52d5fd2368a231dd70e0a2d1cd4e97da6e1bd0ca50431edd3ab2ddcc1bd88dec" exitCode=0
Mar 20 13:29:44 crc kubenswrapper[4973]: I0320 13:29:44.363868 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"52d5fd2368a231dd70e0a2d1cd4e97da6e1bd0ca50431edd3ab2ddcc1bd88dec"}
Mar 20 13:29:44 crc kubenswrapper[4973]: I0320 13:29:44.364628 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"ae703893e0e60a1cb59d74cc5e33372631d6542f4ae1b15e831e0084a728c10a"}
Mar 20 13:29:44 crc kubenswrapper[4973]: I0320 13:29:44.364663 4973 scope.go:117] "RemoveContainer" containerID="abcd771297c72ebf201123700b92083f01a58f420bf04337fc21af73f83a8fbc"
Mar 20 13:29:49 crc kubenswrapper[4973]: I0320 13:29:49.493116 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-jslr6"]
Mar 20 13:29:49 crc kubenswrapper[4973]: I0320 13:29:49.494718 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jslr6"
Mar 20 13:29:49 crc kubenswrapper[4973]: I0320 13:29:49.496973 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 20 13:29:49 crc kubenswrapper[4973]: I0320 13:29:49.498308 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 20 13:29:49 crc kubenswrapper[4973]: I0320 13:29:49.498502 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 20 13:29:49 crc kubenswrapper[4973]: I0320 13:29:49.498566 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-dockercfg-wwt9l"
Mar 20 13:29:49 crc kubenswrapper[4973]: I0320 13:29:49.498579 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 20 13:29:49 crc kubenswrapper[4973]: I0320 13:29:49.510853 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-jslr6"]
Mar 20 13:29:49 crc kubenswrapper[4973]: I0320 13:29:49.542104 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/3d941169-ab7b-4150-a7c8-5bea6b763907-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-jslr6\" (UID: \"3d941169-ab7b-4150-a7c8-5bea6b763907\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jslr6"
Mar 20 13:29:49 crc kubenswrapper[4973]: I0320 13:29:49.542313 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d941169-ab7b-4150-a7c8-5bea6b763907-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-jslr6\" (UID: \"3d941169-ab7b-4150-a7c8-5bea6b763907\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jslr6"
Mar 20 13:29:49 crc kubenswrapper[4973]: I0320 13:29:49.542393 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpjdb\" (UniqueName: \"kubernetes.io/projected/3d941169-ab7b-4150-a7c8-5bea6b763907-kube-api-access-kpjdb\") pod \"cluster-monitoring-operator-6d5b84845-jslr6\" (UID: \"3d941169-ab7b-4150-a7c8-5bea6b763907\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jslr6"
Mar 20 13:29:49 crc kubenswrapper[4973]: I0320 13:29:49.643404 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/3d941169-ab7b-4150-a7c8-5bea6b763907-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-jslr6\" (UID: \"3d941169-ab7b-4150-a7c8-5bea6b763907\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jslr6"
Mar 20 13:29:49 crc kubenswrapper[4973]: I0320 13:29:49.643494 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d941169-ab7b-4150-a7c8-5bea6b763907-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-jslr6\" (UID: \"3d941169-ab7b-4150-a7c8-5bea6b763907\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jslr6"
Mar 20 13:29:49 crc kubenswrapper[4973]: I0320 13:29:49.643526 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpjdb\" (UniqueName: \"kubernetes.io/projected/3d941169-ab7b-4150-a7c8-5bea6b763907-kube-api-access-kpjdb\") pod \"cluster-monitoring-operator-6d5b84845-jslr6\" (UID: \"3d941169-ab7b-4150-a7c8-5bea6b763907\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jslr6"
Mar 20 13:29:49 crc kubenswrapper[4973]: I0320 13:29:49.644295 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/3d941169-ab7b-4150-a7c8-5bea6b763907-telemetry-config\") pod \"cluster-monitoring-operator-6d5b84845-jslr6\" (UID: \"3d941169-ab7b-4150-a7c8-5bea6b763907\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jslr6"
Mar 20 13:29:49 crc kubenswrapper[4973]: I0320 13:29:49.651112 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/3d941169-ab7b-4150-a7c8-5bea6b763907-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6d5b84845-jslr6\" (UID: \"3d941169-ab7b-4150-a7c8-5bea6b763907\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jslr6"
Mar 20 13:29:49 crc kubenswrapper[4973]: I0320 13:29:49.662024 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpjdb\" (UniqueName: \"kubernetes.io/projected/3d941169-ab7b-4150-a7c8-5bea6b763907-kube-api-access-kpjdb\") pod \"cluster-monitoring-operator-6d5b84845-jslr6\" (UID: \"3d941169-ab7b-4150-a7c8-5bea6b763907\") " pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jslr6"
Mar 20 13:29:49 crc kubenswrapper[4973]: I0320 13:29:49.817641 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jslr6"
Mar 20 13:29:50 crc kubenswrapper[4973]: I0320 13:29:50.084141 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6d5b84845-jslr6"]
Mar 20 13:29:50 crc kubenswrapper[4973]: I0320 13:29:50.102904 4973 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 13:29:50 crc kubenswrapper[4973]: I0320 13:29:50.418760 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jslr6" event={"ID":"3d941169-ab7b-4150-a7c8-5bea6b763907","Type":"ContainerStarted","Data":"4d9c181327590b04f675c3d3a5a1d2d5fb2351665124cee03f90c1657d91fe50"}
Mar 20 13:29:52 crc kubenswrapper[4973]: I0320 13:29:52.430889 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jslr6" event={"ID":"3d941169-ab7b-4150-a7c8-5bea6b763907","Type":"ContainerStarted","Data":"b7002d2aaa914bd9e6e0f12f5b010b1dc17609f60d620127a2b8c9dc85c23214"}
Mar 20 13:29:52 crc kubenswrapper[4973]: I0320 13:29:52.451225 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6d5b84845-jslr6" podStartSLOduration=1.471400358 podStartE2EDuration="3.451206506s" podCreationTimestamp="2026-03-20 13:29:49 +0000 UTC" firstStartedPulling="2026-03-20 13:29:50.102557011 +0000 UTC m=+510.846226755" lastFinishedPulling="2026-03-20 13:29:52.082363159 +0000 UTC m=+512.826032903" observedRunningTime="2026-03-20 13:29:52.448078321 +0000 UTC m=+513.191748065" watchObservedRunningTime="2026-03-20 13:29:52.451206506 +0000 UTC m=+513.194876250"
Mar 20 13:29:52 crc kubenswrapper[4973]: I0320 13:29:52.704267 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4v8kv"]
Mar 20 13:29:52 crc kubenswrapper[4973]: I0320 13:29:52.705015 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4v8kv"
Mar 20 13:29:52 crc kubenswrapper[4973]: I0320 13:29:52.707661 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-dockercfg-qtznw"
Mar 20 13:29:52 crc kubenswrapper[4973]: I0320 13:29:52.709448 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 20 13:29:52 crc kubenswrapper[4973]: I0320 13:29:52.713453 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4v8kv"]
Mar 20 13:29:52 crc kubenswrapper[4973]: I0320 13:29:52.888903 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b3ad60e7-f0fc-4ae0-bf47-a92997c32a08-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-4v8kv\" (UID: \"b3ad60e7-f0fc-4ae0-bf47-a92997c32a08\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4v8kv"
Mar 20 13:29:52 crc kubenswrapper[4973]: I0320 13:29:52.990989 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b3ad60e7-f0fc-4ae0-bf47-a92997c32a08-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-4v8kv\" (UID: \"b3ad60e7-f0fc-4ae0-bf47-a92997c32a08\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4v8kv"
Mar 20 13:29:53 crc kubenswrapper[4973]: I0320 13:29:53.001928 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b3ad60e7-f0fc-4ae0-bf47-a92997c32a08-tls-certificates\") pod \"prometheus-operator-admission-webhook-f54c54754-4v8kv\" (UID: \"b3ad60e7-f0fc-4ae0-bf47-a92997c32a08\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4v8kv"
Mar 20 13:29:53 crc kubenswrapper[4973]: I0320 13:29:53.029888 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4v8kv"
Mar 20 13:29:53 crc kubenswrapper[4973]: I0320 13:29:53.219915 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4v8kv"]
Mar 20 13:29:53 crc kubenswrapper[4973]: W0320 13:29:53.228190 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3ad60e7_f0fc_4ae0_bf47_a92997c32a08.slice/crio-e4c51f62e358a60d994f8ec107fb51ce9dd50cce714241e580f708ff6f8cf667 WatchSource:0}: Error finding container e4c51f62e358a60d994f8ec107fb51ce9dd50cce714241e580f708ff6f8cf667: Status 404 returned error can't find the container with id e4c51f62e358a60d994f8ec107fb51ce9dd50cce714241e580f708ff6f8cf667
Mar 20 13:29:53 crc kubenswrapper[4973]: I0320 13:29:53.457699 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4v8kv" event={"ID":"b3ad60e7-f0fc-4ae0-bf47-a92997c32a08","Type":"ContainerStarted","Data":"e4c51f62e358a60d994f8ec107fb51ce9dd50cce714241e580f708ff6f8cf667"}
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.474912 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4v8kv" event={"ID":"b3ad60e7-f0fc-4ae0-bf47-a92997c32a08","Type":"ContainerStarted","Data":"38eea04540b407368fbfa05a7134a7a7bc3e5c78a9b6a98cb5b99fdb30ba055e"}
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.475585 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4v8kv"
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.487867 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4v8kv"
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.502413 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4v8kv" podStartSLOduration=1.818861446 podStartE2EDuration="3.502393745s" podCreationTimestamp="2026-03-20 13:29:52 +0000 UTC" firstStartedPulling="2026-03-20 13:29:53.231368343 +0000 UTC m=+513.975038087" lastFinishedPulling="2026-03-20 13:29:54.914900642 +0000 UTC m=+515.658570386" observedRunningTime="2026-03-20 13:29:55.500719439 +0000 UTC m=+516.244389213" watchObservedRunningTime="2026-03-20 13:29:55.502393745 +0000 UTC m=+516.246063499"
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.777685 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-nbv2s"]
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.778485 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-nbv2s"
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.780712 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.780840 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.781023 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.784267 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-crf65"
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.794678 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-nbv2s"]
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.834556 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvnvg\" (UniqueName: \"kubernetes.io/projected/89e7f0a2-af1f-44ea-878f-3c85b206c5e7-kube-api-access-dvnvg\") pod \"prometheus-operator-db54df47d-nbv2s\" (UID: \"89e7f0a2-af1f-44ea-878f-3c85b206c5e7\") " pod="openshift-monitoring/prometheus-operator-db54df47d-nbv2s"
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.834651 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/89e7f0a2-af1f-44ea-878f-3c85b206c5e7-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-nbv2s\" (UID: \"89e7f0a2-af1f-44ea-878f-3c85b206c5e7\") " pod="openshift-monitoring/prometheus-operator-db54df47d-nbv2s"
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.834710 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89e7f0a2-af1f-44ea-878f-3c85b206c5e7-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-nbv2s\" (UID: \"89e7f0a2-af1f-44ea-878f-3c85b206c5e7\") " pod="openshift-monitoring/prometheus-operator-db54df47d-nbv2s"
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.834747 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89e7f0a2-af1f-44ea-878f-3c85b206c5e7-metrics-client-ca\") pod \"prometheus-operator-db54df47d-nbv2s\" (UID: \"89e7f0a2-af1f-44ea-878f-3c85b206c5e7\") " pod="openshift-monitoring/prometheus-operator-db54df47d-nbv2s"
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.936106 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/89e7f0a2-af1f-44ea-878f-3c85b206c5e7-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-nbv2s\" (UID: \"89e7f0a2-af1f-44ea-878f-3c85b206c5e7\") " pod="openshift-monitoring/prometheus-operator-db54df47d-nbv2s"
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.936158 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89e7f0a2-af1f-44ea-878f-3c85b206c5e7-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-nbv2s\" (UID: \"89e7f0a2-af1f-44ea-878f-3c85b206c5e7\") " pod="openshift-monitoring/prometheus-operator-db54df47d-nbv2s"
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.936190 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89e7f0a2-af1f-44ea-878f-3c85b206c5e7-metrics-client-ca\") pod \"prometheus-operator-db54df47d-nbv2s\" (UID: \"89e7f0a2-af1f-44ea-878f-3c85b206c5e7\") " pod="openshift-monitoring/prometheus-operator-db54df47d-nbv2s"
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.936225 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvnvg\" (UniqueName: \"kubernetes.io/projected/89e7f0a2-af1f-44ea-878f-3c85b206c5e7-kube-api-access-dvnvg\") pod \"prometheus-operator-db54df47d-nbv2s\" (UID: \"89e7f0a2-af1f-44ea-878f-3c85b206c5e7\") " pod="openshift-monitoring/prometheus-operator-db54df47d-nbv2s"
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.937367 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/89e7f0a2-af1f-44ea-878f-3c85b206c5e7-metrics-client-ca\") pod \"prometheus-operator-db54df47d-nbv2s\" (UID: \"89e7f0a2-af1f-44ea-878f-3c85b206c5e7\") " pod="openshift-monitoring/prometheus-operator-db54df47d-nbv2s"
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.945601 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/89e7f0a2-af1f-44ea-878f-3c85b206c5e7-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-db54df47d-nbv2s\" (UID: \"89e7f0a2-af1f-44ea-878f-3c85b206c5e7\") " pod="openshift-monitoring/prometheus-operator-db54df47d-nbv2s"
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.945860 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/89e7f0a2-af1f-44ea-878f-3c85b206c5e7-prometheus-operator-tls\") pod \"prometheus-operator-db54df47d-nbv2s\" (UID: \"89e7f0a2-af1f-44ea-878f-3c85b206c5e7\") " pod="openshift-monitoring/prometheus-operator-db54df47d-nbv2s"
Mar 20 13:29:55 crc kubenswrapper[4973]: I0320 13:29:55.953434 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvnvg\" (UniqueName: \"kubernetes.io/projected/89e7f0a2-af1f-44ea-878f-3c85b206c5e7-kube-api-access-dvnvg\") pod \"prometheus-operator-db54df47d-nbv2s\" (UID: \"89e7f0a2-af1f-44ea-878f-3c85b206c5e7\") " pod="openshift-monitoring/prometheus-operator-db54df47d-nbv2s"
Mar 20 13:29:56 crc kubenswrapper[4973]: I0320 13:29:56.132867 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-db54df47d-nbv2s"
Mar 20 13:29:56 crc kubenswrapper[4973]: I0320 13:29:56.369841 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-db54df47d-nbv2s"]
Mar 20 13:29:56 crc kubenswrapper[4973]: W0320 13:29:56.376673 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89e7f0a2_af1f_44ea_878f_3c85b206c5e7.slice/crio-9301e2af600ae08e1ca133ead59d9b14ee666c3113b9162be2a6659da785f72c WatchSource:0}: Error finding container 9301e2af600ae08e1ca133ead59d9b14ee666c3113b9162be2a6659da785f72c: Status 404 returned error can't find the container with id 9301e2af600ae08e1ca133ead59d9b14ee666c3113b9162be2a6659da785f72c
Mar 20 13:29:56 crc kubenswrapper[4973]: I0320 13:29:56.482002 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-nbv2s" event={"ID":"89e7f0a2-af1f-44ea-878f-3c85b206c5e7","Type":"ContainerStarted","Data":"9301e2af600ae08e1ca133ead59d9b14ee666c3113b9162be2a6659da785f72c"}
Mar 20 13:29:58 crc kubenswrapper[4973]: I0320 13:29:58.494773 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-nbv2s" event={"ID":"89e7f0a2-af1f-44ea-878f-3c85b206c5e7","Type":"ContainerStarted","Data":"937b4c8390e0c44cd519faf3d722f96fd49c0f862a02a22aa6f1145ae9e78be8"}
Mar 20 13:29:58 crc kubenswrapper[4973]: I0320 13:29:58.495162 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-db54df47d-nbv2s" event={"ID":"89e7f0a2-af1f-44ea-878f-3c85b206c5e7","Type":"ContainerStarted","Data":"27cfb0f0febb472263de1f69d46d068921529775efe92aa3f33945c959f911d8"}
Mar 20 13:29:58 crc kubenswrapper[4973]: I0320 13:29:58.510013 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-db54df47d-nbv2s" podStartSLOduration=1.751855449 podStartE2EDuration="3.509997s" podCreationTimestamp="2026-03-20 13:29:55 +0000 UTC" firstStartedPulling="2026-03-20 13:29:56.38039822 +0000 UTC m=+517.124067954" lastFinishedPulling="2026-03-20 13:29:58.138539741 +0000 UTC m=+518.882209505" observedRunningTime="2026-03-20 13:29:58.506924256 +0000 UTC m=+519.250594020" watchObservedRunningTime="2026-03-20 13:29:58.509997 +0000 UTC m=+519.253666744"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.137890 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566890-dchfr"]
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.139960 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566890-dchfr"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.142798 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.143007 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.144648 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94"]
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.145562 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.147045 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-vdkln"]
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.152302 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.152585 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.152713 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.154102 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.175837 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-t5nwh"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.175989 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.178119 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.179311 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566890-dchfr"]
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.182277 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94"]
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.190754 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12c40f14-cc6c-48dd-b2c0-c026cf5cd708-config-volume\") pod \"collect-profiles-29566890-8gj94\" (UID: \"12c40f14-cc6c-48dd-b2c0-c026cf5cd708\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.190796 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d546186-82a6-43a6-835d-98d221360f81-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-vdkln\" (UID: \"6d546186-82a6-43a6-835d-98d221360f81\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.190861 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/12c40f14-cc6c-48dd-b2c0-c026cf5cd708-secret-volume\") pod \"collect-profiles-29566890-8gj94\" (UID: \"12c40f14-cc6c-48dd-b2c0-c026cf5cd708\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.190884 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvn2q\" (UniqueName: \"kubernetes.io/projected/a33dac4d-9a04-4e13-8c3e-fac7025b14ca-kube-api-access-wvn2q\") pod \"auto-csr-approver-29566890-dchfr\" (UID: \"a33dac4d-9a04-4e13-8c3e-fac7025b14ca\") " pod="openshift-infra/auto-csr-approver-29566890-dchfr"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.190957 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txkht\" (UniqueName: \"kubernetes.io/projected/12c40f14-cc6c-48dd-b2c0-c026cf5cd708-kube-api-access-txkht\") pod \"collect-profiles-29566890-8gj94\" (UID: \"12c40f14-cc6c-48dd-b2c0-c026cf5cd708\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.190992 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bmbj\" (UniqueName: \"kubernetes.io/projected/6d546186-82a6-43a6-835d-98d221360f81-kube-api-access-2bmbj\") pod \"openshift-state-metrics-566fddb674-vdkln\" (UID: \"6d546186-82a6-43a6-835d-98d221360f81\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.191020 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6d546186-82a6-43a6-835d-98d221360f81-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-vdkln\" (UID: \"6d546186-82a6-43a6-835d-98d221360f81\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.191041 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d546186-82a6-43a6-835d-98d221360f81-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-vdkln\" (UID: \"6d546186-82a6-43a6-835d-98d221360f81\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.199586 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-vdkln"]
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.209446 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj"]
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.210399 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.214901 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-tqpwr"]
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.216107 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-tqpwr"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.216220 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-ng5qk"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.216523 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.216571 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.216937 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.219467 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-ttwxb"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.219636 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.219768 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.236399 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj"]
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.294600 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b08581e-299b-43b3-aaa0-6e4263e8281e-sys\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.294788 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12c40f14-cc6c-48dd-b2c0-c026cf5cd708-config-volume\") pod \"collect-profiles-29566890-8gj94\" (UID: \"12c40f14-cc6c-48dd-b2c0-c026cf5cd708\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.294828 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d546186-82a6-43a6-835d-98d221360f81-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-vdkln\" (UID: \"6d546186-82a6-43a6-835d-98d221360f81\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.294857 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/36f4567c-3d00-4ecc-b737-30b11776f4b0-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-l5nmj\" (UID: \"36f4567c-3d00-4ecc-b737-30b11776f4b0\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj"
Mar 20 13:30:00 crc kubenswrapper[4973]: E0320 13:30:00.294939 4973 secret.go:188] Couldn't get secret openshift-monitoring/openshift-state-metrics-tls: secret "openshift-state-metrics-tls" not found
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.294936 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/36f4567c-3d00-4ecc-b737-30b11776f4b0-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-l5nmj\" (UID: \"36f4567c-3d00-4ecc-b737-30b11776f4b0\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj"
Mar 20 13:30:00 crc kubenswrapper[4973]: E0320 13:30:00.294994 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d546186-82a6-43a6-835d-98d221360f81-openshift-state-metrics-tls podName:6d546186-82a6-43a6-835d-98d221360f81 nodeName:}" failed. No retries permitted until 2026-03-20 13:30:00.79497818 +0000 UTC m=+521.538647924 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openshift-state-metrics-tls" (UniqueName: "kubernetes.io/secret/6d546186-82a6-43a6-835d-98d221360f81-openshift-state-metrics-tls") pod "openshift-state-metrics-566fddb674-vdkln" (UID: "6d546186-82a6-43a6-835d-98d221360f81") : secret "openshift-state-metrics-tls" not found
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.295213 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6b08581e-299b-43b3-aaa0-6e4263e8281e-node-exporter-wtmp\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.295602 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2j8d\" (UniqueName: \"kubernetes.io/projected/6b08581e-299b-43b3-aaa0-6e4263e8281e-kube-api-access-s2j8d\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.295666 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6b08581e-299b-43b3-aaa0-6e4263e8281e-node-exporter-tls\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.295707 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6b08581e-299b-43b3-aaa0-6e4263e8281e-metrics-client-ca\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.295744 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/12c40f14-cc6c-48dd-b2c0-c026cf5cd708-secret-volume\") pod \"collect-profiles-29566890-8gj94\" (UID: \"12c40f14-cc6c-48dd-b2c0-c026cf5cd708\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.295779 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36f4567c-3d00-4ecc-b737-30b11776f4b0-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-l5nmj\" (UID: \"36f4567c-3d00-4ecc-b737-30b11776f4b0\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj"
Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.295817 4973 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6b08581e-299b-43b3-aaa0-6e4263e8281e-node-exporter-textfile\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.296493 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvn2q\" (UniqueName: \"kubernetes.io/projected/a33dac4d-9a04-4e13-8c3e-fac7025b14ca-kube-api-access-wvn2q\") pod \"auto-csr-approver-29566890-dchfr\" (UID: \"a33dac4d-9a04-4e13-8c3e-fac7025b14ca\") " pod="openshift-infra/auto-csr-approver-29566890-dchfr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.296702 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6b08581e-299b-43b3-aaa0-6e4263e8281e-root\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.296744 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/36f4567c-3d00-4ecc-b737-30b11776f4b0-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-l5nmj\" (UID: \"36f4567c-3d00-4ecc-b737-30b11776f4b0\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.296768 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txkht\" (UniqueName: \"kubernetes.io/projected/12c40f14-cc6c-48dd-b2c0-c026cf5cd708-kube-api-access-txkht\") pod \"collect-profiles-29566890-8gj94\" (UID: \"12c40f14-cc6c-48dd-b2c0-c026cf5cd708\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.296794 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/36f4567c-3d00-4ecc-b737-30b11776f4b0-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-l5nmj\" (UID: \"36f4567c-3d00-4ecc-b737-30b11776f4b0\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.303270 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2d42\" (UniqueName: \"kubernetes.io/projected/36f4567c-3d00-4ecc-b737-30b11776f4b0-kube-api-access-h2d42\") pod \"kube-state-metrics-777cb5bd5d-l5nmj\" (UID: \"36f4567c-3d00-4ecc-b737-30b11776f4b0\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.303301 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bmbj\" (UniqueName: \"kubernetes.io/projected/6d546186-82a6-43a6-835d-98d221360f81-kube-api-access-2bmbj\") pod \"openshift-state-metrics-566fddb674-vdkln\" (UID: \"6d546186-82a6-43a6-835d-98d221360f81\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.303347 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6b08581e-299b-43b3-aaa0-6e4263e8281e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.303367 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6d546186-82a6-43a6-835d-98d221360f81-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-vdkln\" (UID: \"6d546186-82a6-43a6-835d-98d221360f81\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.303390 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d546186-82a6-43a6-835d-98d221360f81-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-vdkln\" (UID: \"6d546186-82a6-43a6-835d-98d221360f81\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.308524 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6d546186-82a6-43a6-835d-98d221360f81-metrics-client-ca\") pod \"openshift-state-metrics-566fddb674-vdkln\" (UID: \"6d546186-82a6-43a6-835d-98d221360f81\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.308627 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/12c40f14-cc6c-48dd-b2c0-c026cf5cd708-secret-volume\") pod \"collect-profiles-29566890-8gj94\" (UID: \"12c40f14-cc6c-48dd-b2c0-c026cf5cd708\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.309063 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12c40f14-cc6c-48dd-b2c0-c026cf5cd708-config-volume\") pod \"collect-profiles-29566890-8gj94\" (UID: \"12c40f14-cc6c-48dd-b2c0-c026cf5cd708\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.309326 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6d546186-82a6-43a6-835d-98d221360f81-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-566fddb674-vdkln\" (UID: \"6d546186-82a6-43a6-835d-98d221360f81\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.332640 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bmbj\" (UniqueName: \"kubernetes.io/projected/6d546186-82a6-43a6-835d-98d221360f81-kube-api-access-2bmbj\") pod \"openshift-state-metrics-566fddb674-vdkln\" (UID: \"6d546186-82a6-43a6-835d-98d221360f81\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.341130 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txkht\" (UniqueName: \"kubernetes.io/projected/12c40f14-cc6c-48dd-b2c0-c026cf5cd708-kube-api-access-txkht\") pod \"collect-profiles-29566890-8gj94\" (UID: \"12c40f14-cc6c-48dd-b2c0-c026cf5cd708\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.346009 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvn2q\" (UniqueName: \"kubernetes.io/projected/a33dac4d-9a04-4e13-8c3e-fac7025b14ca-kube-api-access-wvn2q\") pod \"auto-csr-approver-29566890-dchfr\" (UID: \"a33dac4d-9a04-4e13-8c3e-fac7025b14ca\") " pod="openshift-infra/auto-csr-approver-29566890-dchfr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.404104 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/6b08581e-299b-43b3-aaa0-6e4263e8281e-root\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.404150 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/36f4567c-3d00-4ecc-b737-30b11776f4b0-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-l5nmj\" (UID: \"36f4567c-3d00-4ecc-b737-30b11776f4b0\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.404180 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/36f4567c-3d00-4ecc-b737-30b11776f4b0-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-l5nmj\" (UID: \"36f4567c-3d00-4ecc-b737-30b11776f4b0\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.404199 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2d42\" (UniqueName: \"kubernetes.io/projected/36f4567c-3d00-4ecc-b737-30b11776f4b0-kube-api-access-h2d42\") pod \"kube-state-metrics-777cb5bd5d-l5nmj\" (UID: \"36f4567c-3d00-4ecc-b737-30b11776f4b0\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.404220 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6b08581e-299b-43b3-aaa0-6e4263e8281e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 
13:30:00.404223 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6b08581e-299b-43b3-aaa0-6e4263e8281e-root\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.404242 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b08581e-299b-43b3-aaa0-6e4263e8281e-sys\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.404275 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b08581e-299b-43b3-aaa0-6e4263e8281e-sys\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.404348 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/36f4567c-3d00-4ecc-b737-30b11776f4b0-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-l5nmj\" (UID: \"36f4567c-3d00-4ecc-b737-30b11776f4b0\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.404372 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/36f4567c-3d00-4ecc-b737-30b11776f4b0-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-l5nmj\" (UID: \"36f4567c-3d00-4ecc-b737-30b11776f4b0\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.404401 
4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6b08581e-299b-43b3-aaa0-6e4263e8281e-node-exporter-wtmp\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.404441 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2j8d\" (UniqueName: \"kubernetes.io/projected/6b08581e-299b-43b3-aaa0-6e4263e8281e-kube-api-access-s2j8d\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.404480 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6b08581e-299b-43b3-aaa0-6e4263e8281e-node-exporter-tls\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.404499 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6b08581e-299b-43b3-aaa0-6e4263e8281e-metrics-client-ca\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.404548 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36f4567c-3d00-4ecc-b737-30b11776f4b0-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-l5nmj\" (UID: \"36f4567c-3d00-4ecc-b737-30b11776f4b0\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.404561 4973 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6b08581e-299b-43b3-aaa0-6e4263e8281e-node-exporter-textfile\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.404905 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6b08581e-299b-43b3-aaa0-6e4263e8281e-node-exporter-textfile\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.406023 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36f4567c-3d00-4ecc-b737-30b11776f4b0-metrics-client-ca\") pod \"kube-state-metrics-777cb5bd5d-l5nmj\" (UID: \"36f4567c-3d00-4ecc-b737-30b11776f4b0\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.406189 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6b08581e-299b-43b3-aaa0-6e4263e8281e-node-exporter-wtmp\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.406328 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/36f4567c-3d00-4ecc-b737-30b11776f4b0-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-777cb5bd5d-l5nmj\" (UID: \"36f4567c-3d00-4ecc-b737-30b11776f4b0\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" Mar 20 
13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.406784 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/36f4567c-3d00-4ecc-b737-30b11776f4b0-volume-directive-shadow\") pod \"kube-state-metrics-777cb5bd5d-l5nmj\" (UID: \"36f4567c-3d00-4ecc-b737-30b11776f4b0\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.406799 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6b08581e-299b-43b3-aaa0-6e4263e8281e-metrics-client-ca\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.407453 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/36f4567c-3d00-4ecc-b737-30b11776f4b0-kube-state-metrics-tls\") pod \"kube-state-metrics-777cb5bd5d-l5nmj\" (UID: \"36f4567c-3d00-4ecc-b737-30b11776f4b0\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.409405 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/36f4567c-3d00-4ecc-b737-30b11776f4b0-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-777cb5bd5d-l5nmj\" (UID: \"36f4567c-3d00-4ecc-b737-30b11776f4b0\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.410235 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6b08581e-299b-43b3-aaa0-6e4263e8281e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-tqpwr\" (UID: 
\"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.410254 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6b08581e-299b-43b3-aaa0-6e4263e8281e-node-exporter-tls\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.427993 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2j8d\" (UniqueName: \"kubernetes.io/projected/6b08581e-299b-43b3-aaa0-6e4263e8281e-kube-api-access-s2j8d\") pod \"node-exporter-tqpwr\" (UID: \"6b08581e-299b-43b3-aaa0-6e4263e8281e\") " pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.429025 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2d42\" (UniqueName: \"kubernetes.io/projected/36f4567c-3d00-4ecc-b737-30b11776f4b0-kube-api-access-h2d42\") pod \"kube-state-metrics-777cb5bd5d-l5nmj\" (UID: \"36f4567c-3d00-4ecc-b737-30b11776f4b0\") " pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.468716 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566890-dchfr" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.477106 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.535363 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.548583 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-tqpwr" Mar 20 13:30:00 crc kubenswrapper[4973]: W0320 13:30:00.583480 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b08581e_299b_43b3_aaa0_6e4263e8281e.slice/crio-6b8894a998ee2c4433dd81ab6d8ea21346852923adae7f2cb26aa010e5b7d4a3 WatchSource:0}: Error finding container 6b8894a998ee2c4433dd81ab6d8ea21346852923adae7f2cb26aa010e5b7d4a3: Status 404 returned error can't find the container with id 6b8894a998ee2c4433dd81ab6d8ea21346852923adae7f2cb26aa010e5b7d4a3 Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.661495 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566890-dchfr"] Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.775656 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj"] Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.808692 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d546186-82a6-43a6-835d-98d221360f81-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-vdkln\" (UID: \"6d546186-82a6-43a6-835d-98d221360f81\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.814322 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/6d546186-82a6-43a6-835d-98d221360f81-openshift-state-metrics-tls\") pod \"openshift-state-metrics-566fddb674-vdkln\" (UID: \"6d546186-82a6-43a6-835d-98d221360f81\") " pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln" Mar 20 13:30:00 crc kubenswrapper[4973]: I0320 13:30:00.924025 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94"] Mar 20 13:30:00 crc kubenswrapper[4973]: W0320 13:30:00.930973 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12c40f14_cc6c_48dd_b2c0_c026cf5cd708.slice/crio-a0acf931d95c0163a1ae656023d8141bd95fca2f00c2c3c7999d0d35f98414ae WatchSource:0}: Error finding container a0acf931d95c0163a1ae656023d8141bd95fca2f00c2c3c7999d0d35f98414ae: Status 404 returned error can't find the container with id a0acf931d95c0163a1ae656023d8141bd95fca2f00c2c3c7999d0d35f98414ae Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.091255 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.257635 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.278843 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.282595 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.293842 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.304721 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.305002 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.305077 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.305165 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-j84z7" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.305357 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.304956 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.305653 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.308507 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.319434 4973 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-config-volume\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.319612 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.319692 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.319783 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.319856 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8s5p\" (UniqueName: \"kubernetes.io/projected/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-kube-api-access-c8s5p\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.319925 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.320007 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-config-out\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.320066 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.320128 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.320194 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: 
\"kubernetes.io/secret/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.320261 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.320366 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-web-config\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.361741 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-566fddb674-vdkln"] Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.421839 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8s5p\" (UniqueName: \"kubernetes.io/projected/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-kube-api-access-c8s5p\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.421889 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.421941 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-config-out\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.421963 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.421982 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.422001 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.422023 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-alertmanager-trusted-ca-bundle\") pod 
\"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.422049 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-web-config\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.422091 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-config-volume\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.422112 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.422135 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.422159 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: 
\"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.424432 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.424605 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.424891 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.428111 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-tls-assets\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.428265 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.429667 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-config-out\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.429746 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.429894 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.429979 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.430366 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-web-config\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.439904 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-config-volume\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.447711 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8s5p\" (UniqueName: \"kubernetes.io/projected/d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3-kube-api-access-c8s5p\") pod \"alertmanager-main-0\" (UID: \"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3\") " pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.509206 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.515214 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566890-dchfr" event={"ID":"a33dac4d-9a04-4e13-8c3e-fac7025b14ca","Type":"ContainerStarted","Data":"fff14e9b78039e954dca0f2deaca9b63ccae215ee12549a806191cba99dee7dc"} Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.516953 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tqpwr" event={"ID":"6b08581e-299b-43b3-aaa0-6e4263e8281e","Type":"ContainerStarted","Data":"6b8894a998ee2c4433dd81ab6d8ea21346852923adae7f2cb26aa010e5b7d4a3"} Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.517715 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" event={"ID":"36f4567c-3d00-4ecc-b737-30b11776f4b0","Type":"ContainerStarted","Data":"86bd6a9bde0bc1e2d52d6ea5d0766860f84125c58f705504a30ac48de0874d8e"} Mar 20 13:30:01 crc 
kubenswrapper[4973]: I0320 13:30:01.518889 4973 generic.go:334] "Generic (PLEG): container finished" podID="12c40f14-cc6c-48dd-b2c0-c026cf5cd708" containerID="62f0d2355eeefb7c032b003bb6fef5d2bca8c88776749d6bc24a1068ecd65d2e" exitCode=0 Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.519051 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94" event={"ID":"12c40f14-cc6c-48dd-b2c0-c026cf5cd708","Type":"ContainerDied","Data":"62f0d2355eeefb7c032b003bb6fef5d2bca8c88776749d6bc24a1068ecd65d2e"} Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.519469 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94" event={"ID":"12c40f14-cc6c-48dd-b2c0-c026cf5cd708","Type":"ContainerStarted","Data":"a0acf931d95c0163a1ae656023d8141bd95fca2f00c2c3c7999d0d35f98414ae"} Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.520469 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln" event={"ID":"6d546186-82a6-43a6-835d-98d221360f81","Type":"ContainerStarted","Data":"a24b0ab4a0ad0539aa2adf01176ce3356bf3566fcc9b8f70a62f0f8a92e8c61f"} Mar 20 13:30:01 crc kubenswrapper[4973]: I0320 13:30:01.711197 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 20 13:30:01 crc kubenswrapper[4973]: W0320 13:30:01.716254 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4cf4cf2_30cd_4690_b0b1_b1bd9064bfb3.slice/crio-2d1cb4ca0480f5f560dbcef672f0045b1e943fb1224c4f143892ff80f8fb93bb WatchSource:0}: Error finding container 2d1cb4ca0480f5f560dbcef672f0045b1e943fb1224c4f143892ff80f8fb93bb: Status 404 returned error can't find the container with id 2d1cb4ca0480f5f560dbcef672f0045b1e943fb1224c4f143892ff80f8fb93bb Mar 20 13:30:02 crc 
kubenswrapper[4973]: I0320 13:30:02.236601 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-55bcd89946-hkjh8"] Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.239287 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.240906 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.241112 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.241241 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.241253 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.243682 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-4ww5d" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.244286 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.245063 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-4r76lqflvfvof" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.248367 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-55bcd89946-hkjh8"] Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.334160 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-jg7sl\" (UniqueName: \"kubernetes.io/projected/db81edf4-da9e-422b-b1cc-7842cf2c1183-kube-api-access-jg7sl\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.334205 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/db81edf4-da9e-422b-b1cc-7842cf2c1183-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.334233 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/db81edf4-da9e-422b-b1cc-7842cf2c1183-secret-thanos-querier-tls\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.334259 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/db81edf4-da9e-422b-b1cc-7842cf2c1183-metrics-client-ca\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.334277 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/db81edf4-da9e-422b-b1cc-7842cf2c1183-secret-grpc-tls\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " 
pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.334432 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/db81edf4-da9e-422b-b1cc-7842cf2c1183-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.334557 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/db81edf4-da9e-422b-b1cc-7842cf2c1183-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.334594 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/db81edf4-da9e-422b-b1cc-7842cf2c1183-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.435950 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/db81edf4-da9e-422b-b1cc-7842cf2c1183-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc 
kubenswrapper[4973]: I0320 13:30:02.436021 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/db81edf4-da9e-422b-b1cc-7842cf2c1183-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.436041 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/db81edf4-da9e-422b-b1cc-7842cf2c1183-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.436081 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg7sl\" (UniqueName: \"kubernetes.io/projected/db81edf4-da9e-422b-b1cc-7842cf2c1183-kube-api-access-jg7sl\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.436102 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/db81edf4-da9e-422b-b1cc-7842cf2c1183-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.436132 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: 
\"kubernetes.io/secret/db81edf4-da9e-422b-b1cc-7842cf2c1183-secret-thanos-querier-tls\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.436152 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/db81edf4-da9e-422b-b1cc-7842cf2c1183-metrics-client-ca\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.436168 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/db81edf4-da9e-422b-b1cc-7842cf2c1183-secret-grpc-tls\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.438455 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/db81edf4-da9e-422b-b1cc-7842cf2c1183-metrics-client-ca\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.441736 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/db81edf4-da9e-422b-b1cc-7842cf2c1183-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.441820 4973 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/db81edf4-da9e-422b-b1cc-7842cf2c1183-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.442040 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/db81edf4-da9e-422b-b1cc-7842cf2c1183-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.453091 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/db81edf4-da9e-422b-b1cc-7842cf2c1183-secret-thanos-querier-tls\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.463917 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/db81edf4-da9e-422b-b1cc-7842cf2c1183-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.464009 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg7sl\" (UniqueName: \"kubernetes.io/projected/db81edf4-da9e-422b-b1cc-7842cf2c1183-kube-api-access-jg7sl\") pod \"thanos-querier-55bcd89946-hkjh8\" 
(UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.464134 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/db81edf4-da9e-422b-b1cc-7842cf2c1183-secret-grpc-tls\") pod \"thanos-querier-55bcd89946-hkjh8\" (UID: \"db81edf4-da9e-422b-b1cc-7842cf2c1183\") " pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.526599 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3","Type":"ContainerStarted","Data":"2d1cb4ca0480f5f560dbcef672f0045b1e943fb1224c4f143892ff80f8fb93bb"} Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.528770 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln" event={"ID":"6d546186-82a6-43a6-835d-98d221360f81","Type":"ContainerStarted","Data":"79bac71c499b971c23cabb995dace0cdc3af2d3de5e406f3612fe3191a08caab"} Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.528818 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln" event={"ID":"6d546186-82a6-43a6-835d-98d221360f81","Type":"ContainerStarted","Data":"16f597ba115e65cd129fe4bfc804e3176732ce30b6d54fdd027f9ab81119ac14"} Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.602154 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.741049 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.841278 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12c40f14-cc6c-48dd-b2c0-c026cf5cd708-config-volume\") pod \"12c40f14-cc6c-48dd-b2c0-c026cf5cd708\" (UID: \"12c40f14-cc6c-48dd-b2c0-c026cf5cd708\") " Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.841372 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/12c40f14-cc6c-48dd-b2c0-c026cf5cd708-secret-volume\") pod \"12c40f14-cc6c-48dd-b2c0-c026cf5cd708\" (UID: \"12c40f14-cc6c-48dd-b2c0-c026cf5cd708\") " Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.841414 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txkht\" (UniqueName: \"kubernetes.io/projected/12c40f14-cc6c-48dd-b2c0-c026cf5cd708-kube-api-access-txkht\") pod \"12c40f14-cc6c-48dd-b2c0-c026cf5cd708\" (UID: \"12c40f14-cc6c-48dd-b2c0-c026cf5cd708\") " Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.842304 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12c40f14-cc6c-48dd-b2c0-c026cf5cd708-config-volume" (OuterVolumeSpecName: "config-volume") pod "12c40f14-cc6c-48dd-b2c0-c026cf5cd708" (UID: "12c40f14-cc6c-48dd-b2c0-c026cf5cd708"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.849270 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c40f14-cc6c-48dd-b2c0-c026cf5cd708-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "12c40f14-cc6c-48dd-b2c0-c026cf5cd708" (UID: "12c40f14-cc6c-48dd-b2c0-c026cf5cd708"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.849300 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12c40f14-cc6c-48dd-b2c0-c026cf5cd708-kube-api-access-txkht" (OuterVolumeSpecName: "kube-api-access-txkht") pod "12c40f14-cc6c-48dd-b2c0-c026cf5cd708" (UID: "12c40f14-cc6c-48dd-b2c0-c026cf5cd708"). InnerVolumeSpecName "kube-api-access-txkht". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.942432 4973 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/12c40f14-cc6c-48dd-b2c0-c026cf5cd708-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.942467 4973 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/12c40f14-cc6c-48dd-b2c0-c026cf5cd708-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:02 crc kubenswrapper[4973]: I0320 13:30:02.942479 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txkht\" (UniqueName: \"kubernetes.io/projected/12c40f14-cc6c-48dd-b2c0-c026cf5cd708-kube-api-access-txkht\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:03 crc kubenswrapper[4973]: I0320 13:30:03.106276 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-55bcd89946-hkjh8"] Mar 20 13:30:03 crc kubenswrapper[4973]: I0320 13:30:03.535114 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" event={"ID":"db81edf4-da9e-422b-b1cc-7842cf2c1183","Type":"ContainerStarted","Data":"2238c8ff8f3d319e9754d53c490cc9a8e2fcb54a8c017bf515ca04ee802bd71b"} Mar 20 13:30:03 crc kubenswrapper[4973]: I0320 13:30:03.537844 4973 generic.go:334] "Generic (PLEG): container finished" podID="a33dac4d-9a04-4e13-8c3e-fac7025b14ca" 
containerID="bb9c5df0c614ab19b805a339ea8af161e279182509f169ff90df7a81cf3ba345" exitCode=0 Mar 20 13:30:03 crc kubenswrapper[4973]: I0320 13:30:03.538059 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566890-dchfr" event={"ID":"a33dac4d-9a04-4e13-8c3e-fac7025b14ca","Type":"ContainerDied","Data":"bb9c5df0c614ab19b805a339ea8af161e279182509f169ff90df7a81cf3ba345"} Mar 20 13:30:03 crc kubenswrapper[4973]: I0320 13:30:03.539678 4973 generic.go:334] "Generic (PLEG): container finished" podID="6b08581e-299b-43b3-aaa0-6e4263e8281e" containerID="50ad4d45c8cfb9d89bd44613e57d8c46fa17d3f9ed59c8e9de27db5c8c729f58" exitCode=0 Mar 20 13:30:03 crc kubenswrapper[4973]: I0320 13:30:03.539791 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tqpwr" event={"ID":"6b08581e-299b-43b3-aaa0-6e4263e8281e","Type":"ContainerDied","Data":"50ad4d45c8cfb9d89bd44613e57d8c46fa17d3f9ed59c8e9de27db5c8c729f58"} Mar 20 13:30:03 crc kubenswrapper[4973]: I0320 13:30:03.541926 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" event={"ID":"36f4567c-3d00-4ecc-b737-30b11776f4b0","Type":"ContainerStarted","Data":"c8fc65736a0899415a4e0e1051badaacdff9afbede8e46db006900bd1fd18d05"} Mar 20 13:30:03 crc kubenswrapper[4973]: I0320 13:30:03.541954 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" event={"ID":"36f4567c-3d00-4ecc-b737-30b11776f4b0","Type":"ContainerStarted","Data":"1601322f7e4105de9ea748293b25294c423c83bd6016b7ee133ce2f3080fc594"} Mar 20 13:30:03 crc kubenswrapper[4973]: I0320 13:30:03.541965 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" event={"ID":"36f4567c-3d00-4ecc-b737-30b11776f4b0","Type":"ContainerStarted","Data":"86695b48281dae97ac82c8321d87f1943138e33eafbc7ae0837e6484c55f309c"} Mar 20 13:30:03 
crc kubenswrapper[4973]: I0320 13:30:03.543527 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94" event={"ID":"12c40f14-cc6c-48dd-b2c0-c026cf5cd708","Type":"ContainerDied","Data":"a0acf931d95c0163a1ae656023d8141bd95fca2f00c2c3c7999d0d35f98414ae"} Mar 20 13:30:03 crc kubenswrapper[4973]: I0320 13:30:03.543562 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0acf931d95c0163a1ae656023d8141bd95fca2f00c2c3c7999d0d35f98414ae" Mar 20 13:30:03 crc kubenswrapper[4973]: I0320 13:30:03.543610 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94" Mar 20 13:30:03 crc kubenswrapper[4973]: I0320 13:30:03.771795 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-777cb5bd5d-l5nmj" podStartSLOduration=1.834012325 podStartE2EDuration="3.77177469s" podCreationTimestamp="2026-03-20 13:30:00 +0000 UTC" firstStartedPulling="2026-03-20 13:30:00.786769428 +0000 UTC m=+521.530439172" lastFinishedPulling="2026-03-20 13:30:02.724531793 +0000 UTC m=+523.468201537" observedRunningTime="2026-03-20 13:30:03.592117354 +0000 UTC m=+524.335787098" watchObservedRunningTime="2026-03-20 13:30:03.77177469 +0000 UTC m=+524.515444434" Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.551396 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln" event={"ID":"6d546186-82a6-43a6-835d-98d221360f81","Type":"ContainerStarted","Data":"f2278e5ef6a6c45826924b914f13831026b397b0c120bc0e6541d1dc39ccef6c"} Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.556472 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tqpwr" 
event={"ID":"6b08581e-299b-43b3-aaa0-6e4263e8281e","Type":"ContainerStarted","Data":"d40146c88b4d2270ec2a0cbf43f32f44c6d8d9bcae55926c1872101ec44842c0"} Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.556515 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-tqpwr" event={"ID":"6b08581e-299b-43b3-aaa0-6e4263e8281e","Type":"ContainerStarted","Data":"2278b9fa8e995e3402bbf58aa3751f59dc86a61bc2e0c180c3361ff9a134cd2b"} Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.558672 4973 generic.go:334] "Generic (PLEG): container finished" podID="d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3" containerID="52b46bee2621d30685564bf3ce066d31c6bf7eee844b9ad71877f07bcb1d86b0" exitCode=0 Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.558730 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3","Type":"ContainerDied","Data":"52b46bee2621d30685564bf3ce066d31c6bf7eee844b9ad71877f07bcb1d86b0"} Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.575690 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-566fddb674-vdkln" podStartSLOduration=2.170523844 podStartE2EDuration="4.575669629s" podCreationTimestamp="2026-03-20 13:30:00 +0000 UTC" firstStartedPulling="2026-03-20 13:30:01.676589308 +0000 UTC m=+522.420259052" lastFinishedPulling="2026-03-20 13:30:04.081735093 +0000 UTC m=+524.825404837" observedRunningTime="2026-03-20 13:30:04.569036838 +0000 UTC m=+525.312706602" watchObservedRunningTime="2026-03-20 13:30:04.575669629 +0000 UTC m=+525.319339373" Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.594852 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-tqpwr" podStartSLOduration=2.515507873 podStartE2EDuration="4.594832223s" podCreationTimestamp="2026-03-20 13:30:00 +0000 UTC" 
firstStartedPulling="2026-03-20 13:30:00.590477636 +0000 UTC m=+521.334147380" lastFinishedPulling="2026-03-20 13:30:02.669801986 +0000 UTC m=+523.413471730" observedRunningTime="2026-03-20 13:30:04.593785784 +0000 UTC m=+525.337455548" watchObservedRunningTime="2026-03-20 13:30:04.594832223 +0000 UTC m=+525.338501977" Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.799399 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566890-dchfr" Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.867562 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wvn2q\" (UniqueName: \"kubernetes.io/projected/a33dac4d-9a04-4e13-8c3e-fac7025b14ca-kube-api-access-wvn2q\") pod \"a33dac4d-9a04-4e13-8c3e-fac7025b14ca\" (UID: \"a33dac4d-9a04-4e13-8c3e-fac7025b14ca\") " Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.876382 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a33dac4d-9a04-4e13-8c3e-fac7025b14ca-kube-api-access-wvn2q" (OuterVolumeSpecName: "kube-api-access-wvn2q") pod "a33dac4d-9a04-4e13-8c3e-fac7025b14ca" (UID: "a33dac4d-9a04-4e13-8c3e-fac7025b14ca"). InnerVolumeSpecName "kube-api-access-wvn2q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.937571 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7bcc6b4fc8-spfpg"] Mar 20 13:30:04 crc kubenswrapper[4973]: E0320 13:30:04.937846 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12c40f14-cc6c-48dd-b2c0-c026cf5cd708" containerName="collect-profiles" Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.937869 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c40f14-cc6c-48dd-b2c0-c026cf5cd708" containerName="collect-profiles" Mar 20 13:30:04 crc kubenswrapper[4973]: E0320 13:30:04.937896 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a33dac4d-9a04-4e13-8c3e-fac7025b14ca" containerName="oc" Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.937906 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a33dac4d-9a04-4e13-8c3e-fac7025b14ca" containerName="oc" Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.938045 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="12c40f14-cc6c-48dd-b2c0-c026cf5cd708" containerName="collect-profiles" Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.938064 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="a33dac4d-9a04-4e13-8c3e-fac7025b14ca" containerName="oc" Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.938545 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.952682 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bcc6b4fc8-spfpg"] Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.970080 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65cfdf70-b0b0-497e-9889-a04aaea42ac0-console-serving-cert\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.970822 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65cfdf70-b0b0-497e-9889-a04aaea42ac0-console-oauth-config\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.970882 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-trusted-ca-bundle\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.970908 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwgdh\" (UniqueName: \"kubernetes.io/projected/65cfdf70-b0b0-497e-9889-a04aaea42ac0-kube-api-access-jwgdh\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.970932 4973 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-console-config\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.970983 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-service-ca\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.971004 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-oauth-serving-cert\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:04 crc kubenswrapper[4973]: I0320 13:30:04.971071 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wvn2q\" (UniqueName: \"kubernetes.io/projected/a33dac4d-9a04-4e13-8c3e-fac7025b14ca-kube-api-access-wvn2q\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.071645 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-console-config\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.071715 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-service-ca\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.071733 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-oauth-serving-cert\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.071777 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65cfdf70-b0b0-497e-9889-a04aaea42ac0-console-serving-cert\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.071798 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65cfdf70-b0b0-497e-9889-a04aaea42ac0-console-oauth-config\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.071835 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-trusted-ca-bundle\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.071859 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jwgdh\" (UniqueName: 
\"kubernetes.io/projected/65cfdf70-b0b0-497e-9889-a04aaea42ac0-kube-api-access-jwgdh\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.073950 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-trusted-ca-bundle\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.073950 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-console-config\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.074586 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-service-ca\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.077096 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-oauth-serving-cert\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.083706 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/65cfdf70-b0b0-497e-9889-a04aaea42ac0-console-oauth-config\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.084165 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65cfdf70-b0b0-497e-9889-a04aaea42ac0-console-serving-cert\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.089067 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwgdh\" (UniqueName: \"kubernetes.io/projected/65cfdf70-b0b0-497e-9889-a04aaea42ac0-kube-api-access-jwgdh\") pod \"console-7bcc6b4fc8-spfpg\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") " pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.256499 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.518581 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-55f4d8dbbb-bmckj"] Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.519788 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.523117 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.523392 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.523427 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.524256 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.524565 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-7msfq" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.524715 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-c5le39di7mqp5" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.549311 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-55f4d8dbbb-bmckj"] Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.565848 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566890-dchfr" event={"ID":"a33dac4d-9a04-4e13-8c3e-fac7025b14ca","Type":"ContainerDied","Data":"fff14e9b78039e954dca0f2deaca9b63ccae215ee12549a806191cba99dee7dc"} Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.566830 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fff14e9b78039e954dca0f2deaca9b63ccae215ee12549a806191cba99dee7dc" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.565940 4973 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566890-dchfr" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.581079 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/57c75085-cf04-49d6-8b97-902e00c0efd5-metrics-server-audit-profiles\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.581111 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/57c75085-cf04-49d6-8b97-902e00c0efd5-audit-log\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.581141 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/57c75085-cf04-49d6-8b97-902e00c0efd5-secret-metrics-server-tls\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.581168 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57c75085-cf04-49d6-8b97-902e00c0efd5-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.581252 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/57c75085-cf04-49d6-8b97-902e00c0efd5-secret-metrics-client-certs\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.581271 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvg69\" (UniqueName: \"kubernetes.io/projected/57c75085-cf04-49d6-8b97-902e00c0efd5-kube-api-access-dvg69\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.581290 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57c75085-cf04-49d6-8b97-902e00c0efd5-client-ca-bundle\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.628069 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7bcc6b4fc8-spfpg"] Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.682881 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/57c75085-cf04-49d6-8b97-902e00c0efd5-secret-metrics-client-certs\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.683975 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvg69\" (UniqueName: 
\"kubernetes.io/projected/57c75085-cf04-49d6-8b97-902e00c0efd5-kube-api-access-dvg69\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.684081 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57c75085-cf04-49d6-8b97-902e00c0efd5-client-ca-bundle\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.684208 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/57c75085-cf04-49d6-8b97-902e00c0efd5-metrics-server-audit-profiles\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.684521 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/57c75085-cf04-49d6-8b97-902e00c0efd5-audit-log\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.684666 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/57c75085-cf04-49d6-8b97-902e00c0efd5-secret-metrics-server-tls\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.684776 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57c75085-cf04-49d6-8b97-902e00c0efd5-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.684918 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/57c75085-cf04-49d6-8b97-902e00c0efd5-audit-log\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.685457 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/57c75085-cf04-49d6-8b97-902e00c0efd5-metrics-server-audit-profiles\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.686791 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/57c75085-cf04-49d6-8b97-902e00c0efd5-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.700974 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/57c75085-cf04-49d6-8b97-902e00c0efd5-secret-metrics-client-certs\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " 
pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.701098 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/57c75085-cf04-49d6-8b97-902e00c0efd5-secret-metrics-server-tls\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.703301 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57c75085-cf04-49d6-8b97-902e00c0efd5-client-ca-bundle\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.704883 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvg69\" (UniqueName: \"kubernetes.io/projected/57c75085-cf04-49d6-8b97-902e00c0efd5-kube-api-access-dvg69\") pod \"metrics-server-55f4d8dbbb-bmckj\" (UID: \"57c75085-cf04-49d6-8b97-902e00c0efd5\") " pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.844321 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.850977 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566884-j8dqd"] Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.853707 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566884-j8dqd"] Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.920161 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5f8fb459cf-n7bzv"] Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.920943 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5f8fb459cf-n7bzv" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.923521 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-6tstp" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.924051 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.930449 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5f8fb459cf-n7bzv"] Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.961308 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66c28cfe-8a0b-459a-bbab-59053fe226b8" path="/var/lib/kubelet/pods/66c28cfe-8a0b-459a-bbab-59053fe226b8/volumes" Mar 20 13:30:05 crc kubenswrapper[4973]: I0320 13:30:05.988205 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a8d216d7-808c-4312-a314-729e369ee963-monitoring-plugin-cert\") pod \"monitoring-plugin-5f8fb459cf-n7bzv\" (UID: \"a8d216d7-808c-4312-a314-729e369ee963\") " 
pod="openshift-monitoring/monitoring-plugin-5f8fb459cf-n7bzv"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.089216 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a8d216d7-808c-4312-a314-729e369ee963-monitoring-plugin-cert\") pod \"monitoring-plugin-5f8fb459cf-n7bzv\" (UID: \"a8d216d7-808c-4312-a314-729e369ee963\") " pod="openshift-monitoring/monitoring-plugin-5f8fb459cf-n7bzv"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.093296 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/a8d216d7-808c-4312-a314-729e369ee963-monitoring-plugin-cert\") pod \"monitoring-plugin-5f8fb459cf-n7bzv\" (UID: \"a8d216d7-808c-4312-a314-729e369ee963\") " pod="openshift-monitoring/monitoring-plugin-5f8fb459cf-n7bzv"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.236208 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5f8fb459cf-n7bzv"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.413115 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.418627 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.421219 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.421259 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.421490 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.421522 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.421870 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.422932 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-td6q7"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.422958 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.423095 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-drc0lg3k9gk9k"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.423109 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.423284 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.423322 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.426076 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.433371 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.444861 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.502419 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.502460 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1623a587-c5f5-49eb-bc1d-960e4a0faf81-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.502481 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1623a587-c5f5-49eb-bc1d-960e4a0faf81-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.502499 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1623a587-c5f5-49eb-bc1d-960e4a0faf81-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.502514 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1623a587-c5f5-49eb-bc1d-960e4a0faf81-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.502542 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.502590 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.502606 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.502625 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s2vt\" (UniqueName: \"kubernetes.io/projected/1623a587-c5f5-49eb-bc1d-960e4a0faf81-kube-api-access-5s2vt\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.502643 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1623a587-c5f5-49eb-bc1d-960e4a0faf81-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.502665 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-web-config\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.502679 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-config\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.502696 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1623a587-c5f5-49eb-bc1d-960e4a0faf81-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.502716 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1623a587-c5f5-49eb-bc1d-960e4a0faf81-config-out\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.502736 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.502769 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.502808 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1623a587-c5f5-49eb-bc1d-960e4a0faf81-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.502855 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.603699 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.603765 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.603787 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1623a587-c5f5-49eb-bc1d-960e4a0faf81-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.603811 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.603840 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1623a587-c5f5-49eb-bc1d-960e4a0faf81-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.603859 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.603876 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1623a587-c5f5-49eb-bc1d-960e4a0faf81-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.603896 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1623a587-c5f5-49eb-bc1d-960e4a0faf81-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.603911 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1623a587-c5f5-49eb-bc1d-960e4a0faf81-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.603933 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.603951 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.603965 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.603981 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2vt\" (UniqueName: \"kubernetes.io/projected/1623a587-c5f5-49eb-bc1d-960e4a0faf81-kube-api-access-5s2vt\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.603997 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1623a587-c5f5-49eb-bc1d-960e4a0faf81-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.604018 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-web-config\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.604036 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-config\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.604051 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1623a587-c5f5-49eb-bc1d-960e4a0faf81-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.604068 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1623a587-c5f5-49eb-bc1d-960e4a0faf81-config-out\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.605669 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1623a587-c5f5-49eb-bc1d-960e4a0faf81-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.608245 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1623a587-c5f5-49eb-bc1d-960e4a0faf81-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.611611 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1623a587-c5f5-49eb-bc1d-960e4a0faf81-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.614152 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1623a587-c5f5-49eb-bc1d-960e4a0faf81-config-out\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.617164 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1623a587-c5f5-49eb-bc1d-960e4a0faf81-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.617585 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.617632 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1623a587-c5f5-49eb-bc1d-960e4a0faf81-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.618105 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1623a587-c5f5-49eb-bc1d-960e4a0faf81-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.618936 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-web-config\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.619978 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1623a587-c5f5-49eb-bc1d-960e4a0faf81-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.620043 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.620375 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.622221 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-config\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.622575 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.623227 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.623271 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.623759 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s2vt\" (UniqueName: \"kubernetes.io/projected/1623a587-c5f5-49eb-bc1d-960e4a0faf81-kube-api-access-5s2vt\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.625751 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1623a587-c5f5-49eb-bc1d-960e4a0faf81-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1623a587-c5f5-49eb-bc1d-960e4a0faf81\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:06 crc kubenswrapper[4973]: I0320 13:30:06.737590 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 20 13:30:07 crc kubenswrapper[4973]: W0320 13:30:07.032141 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65cfdf70_b0b0_497e_9889_a04aaea42ac0.slice/crio-fa35ca3598dbfad6b2a86cb3347cb55e43447399d7674bb39f055f3a53135bf0 WatchSource:0}: Error finding container fa35ca3598dbfad6b2a86cb3347cb55e43447399d7674bb39f055f3a53135bf0: Status 404 returned error can't find the container with id fa35ca3598dbfad6b2a86cb3347cb55e43447399d7674bb39f055f3a53135bf0
Mar 20 13:30:07 crc kubenswrapper[4973]: I0320 13:30:07.501519 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 20 13:30:07 crc kubenswrapper[4973]: W0320 13:30:07.513655 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1623a587_c5f5_49eb_bc1d_960e4a0faf81.slice/crio-2b5df2e08bd2fcf8c13decc9054d6d287c652cee4963e6c8121ec76b6fd61426 WatchSource:0}: Error finding container 2b5df2e08bd2fcf8c13decc9054d6d287c652cee4963e6c8121ec76b6fd61426: Status 404 returned error can't find the container with id 2b5df2e08bd2fcf8c13decc9054d6d287c652cee4963e6c8121ec76b6fd61426
Mar 20 13:30:07 crc kubenswrapper[4973]: I0320 13:30:07.554726 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5f8fb459cf-n7bzv"]
Mar 20 13:30:07 crc kubenswrapper[4973]: W0320 13:30:07.555255 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8d216d7_808c_4312_a314_729e369ee963.slice/crio-3a0655632fcc447e534bdd8d5c38a2934d2dd265d13f6c34f88b4f7ff5b76775 WatchSource:0}: Error finding container 3a0655632fcc447e534bdd8d5c38a2934d2dd265d13f6c34f88b4f7ff5b76775: Status 404 returned error can't find the container with id 3a0655632fcc447e534bdd8d5c38a2934d2dd265d13f6c34f88b4f7ff5b76775
Mar 20 13:30:07 crc kubenswrapper[4973]: I0320 13:30:07.560823 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-55f4d8dbbb-bmckj"]
Mar 20 13:30:07 crc kubenswrapper[4973]: I0320 13:30:07.582955 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" event={"ID":"57c75085-cf04-49d6-8b97-902e00c0efd5","Type":"ContainerStarted","Data":"afea8c0e170dd725222b7ae8fbfd68e13fcaecd0d25be2a6f51b696437ec998b"}
Mar 20 13:30:07 crc kubenswrapper[4973]: I0320 13:30:07.584660 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bcc6b4fc8-spfpg" event={"ID":"65cfdf70-b0b0-497e-9889-a04aaea42ac0","Type":"ContainerStarted","Data":"efc02883c7f60ad1f4cc8e67cc2c8ca69a7b02bff7eccc98773f8784deaf8957"}
Mar 20 13:30:07 crc kubenswrapper[4973]: I0320 13:30:07.584684 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bcc6b4fc8-spfpg" event={"ID":"65cfdf70-b0b0-497e-9889-a04aaea42ac0","Type":"ContainerStarted","Data":"fa35ca3598dbfad6b2a86cb3347cb55e43447399d7674bb39f055f3a53135bf0"}
Mar 20 13:30:07 crc kubenswrapper[4973]: I0320 13:30:07.586138 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1623a587-c5f5-49eb-bc1d-960e4a0faf81","Type":"ContainerStarted","Data":"2b5df2e08bd2fcf8c13decc9054d6d287c652cee4963e6c8121ec76b6fd61426"}
Mar 20 13:30:07 crc kubenswrapper[4973]: I0320 13:30:07.589505 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3","Type":"ContainerStarted","Data":"1d40ee86fcbab0ffedaec8dfc3d25afd540034cd3965720f1ca364ff085c3c32"}
Mar 20 13:30:07 crc kubenswrapper[4973]: I0320 13:30:07.589533 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3","Type":"ContainerStarted","Data":"f8d8b7ff75db117ed158f2389bf2cc6698cfe5874173bd4413aa4dfea7e574c8"}
Mar 20 13:30:07 crc kubenswrapper[4973]: I0320 13:30:07.589545 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3","Type":"ContainerStarted","Data":"2885c0428126f07fa51eb4cedf8274eb05451de38416fb508b08c49b2ca807ae"}
Mar 20 13:30:07 crc kubenswrapper[4973]: I0320 13:30:07.590611 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5f8fb459cf-n7bzv" event={"ID":"a8d216d7-808c-4312-a314-729e369ee963","Type":"ContainerStarted","Data":"3a0655632fcc447e534bdd8d5c38a2934d2dd265d13f6c34f88b4f7ff5b76775"}
Mar 20 13:30:07 crc kubenswrapper[4973]: I0320 13:30:07.592142 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" event={"ID":"db81edf4-da9e-422b-b1cc-7842cf2c1183","Type":"ContainerStarted","Data":"c8b441eb3265d204f3a16b424cc58c8af61aba6dcddfc98871e9003096efaed7"}
Mar 20 13:30:07 crc kubenswrapper[4973]: I0320 13:30:07.592165 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" event={"ID":"db81edf4-da9e-422b-b1cc-7842cf2c1183","Type":"ContainerStarted","Data":"7d4f9609eab6740477fc89a2fc9f59c87f52ed790f8136be3173b5e0830d5e76"}
Mar 20 13:30:07 crc kubenswrapper[4973]: I0320 13:30:07.592174 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" event={"ID":"db81edf4-da9e-422b-b1cc-7842cf2c1183","Type":"ContainerStarted","Data":"d2f080f313bb7d85b444fd3dfd4f734424da0c42cf2a325410c94198a20497bc"}
Mar 20 13:30:07 crc kubenswrapper[4973]: I0320 13:30:07.605667 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7bcc6b4fc8-spfpg" podStartSLOduration=3.605644992 podStartE2EDuration="3.605644992s" podCreationTimestamp="2026-03-20 13:30:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:30:07.599769701 +0000 UTC m=+528.343439445" watchObservedRunningTime="2026-03-20 13:30:07.605644992 +0000 UTC m=+528.349314736"
Mar 20 13:30:08 crc kubenswrapper[4973]: I0320 13:30:08.600299 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3","Type":"ContainerStarted","Data":"45817f90cea31a85ca9dfafe12e5aa09e30394a5e8b553b08eba4e66f9128d66"}
Mar 20 13:30:08 crc kubenswrapper[4973]: I0320 13:30:08.600646 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3","Type":"ContainerStarted","Data":"a654e7f0c9d736030011729abb4a62030a891d51d92f782060c952ce02be9914"}
Mar 20 13:30:08 crc kubenswrapper[4973]: I0320 13:30:08.602358 4973 generic.go:334] "Generic (PLEG): container finished" podID="1623a587-c5f5-49eb-bc1d-960e4a0faf81" containerID="43e8710dbd3608d99e8ea80713534a3577c2f98f4b7700f1b78c98616cc84be4" exitCode=0
Mar 20 13:30:08 crc kubenswrapper[4973]: I0320 13:30:08.603208 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1623a587-c5f5-49eb-bc1d-960e4a0faf81","Type":"ContainerDied","Data":"43e8710dbd3608d99e8ea80713534a3577c2f98f4b7700f1b78c98616cc84be4"}
Mar 20 13:30:09 crc kubenswrapper[4973]: I0320 13:30:09.618495 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"d4cf4cf2-30cd-4690-b0b1-b1bd9064bfb3","Type":"ContainerStarted","Data":"ecca122900eab1db14f76e4c41a61fb3ea3332257c43c286b2c5938c7612e1da"}
Mar 20 13:30:09 crc kubenswrapper[4973]: I0320 13:30:09.623492 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" event={"ID":"db81edf4-da9e-422b-b1cc-7842cf2c1183","Type":"ContainerStarted","Data":"1dc951c688dc24ade788fe1e4715215c4326db12331a83cabf4755e711999236"}
Mar 20 13:30:09 crc kubenswrapper[4973]: I0320 13:30:09.623541 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" event={"ID":"db81edf4-da9e-422b-b1cc-7842cf2c1183","Type":"ContainerStarted","Data":"a768d48e429930f7ae2d6e54bba7012b07774b5d74107fc29c9faac26f3b4454"}
Mar 20 13:30:09 crc kubenswrapper[4973]: I0320 13:30:09.623556 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" event={"ID":"db81edf4-da9e-422b-b1cc-7842cf2c1183","Type":"ContainerStarted","Data":"859fea19652c0acca45bfdf89c635cf2c47fd89dc55c026f99f624cc02d59fe9"}
Mar 20 13:30:09 crc kubenswrapper[4973]: I0320 13:30:09.623627 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8"
Mar 20 13:30:09 crc kubenswrapper[4973]: I0320 13:30:09.645175 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.774714237 podStartE2EDuration="8.645153821s" podCreationTimestamp="2026-03-20 13:30:01 +0000 UTC" firstStartedPulling="2026-03-20 13:30:01.719818271 +0000 UTC m=+522.463488025" lastFinishedPulling="2026-03-20 13:30:08.590257875 +0000 UTC m=+529.333927609" observedRunningTime="2026-03-20 13:30:09.64217694 +0000 UTC m=+530.385846684" watchObservedRunningTime="2026-03-20 13:30:09.645153821 +0000 UTC m=+530.388823575"
Mar 20 13:30:09 crc kubenswrapper[4973]: I0320 13:30:09.674732 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" podStartSLOduration=2.217495767 podStartE2EDuration="7.67470745s" podCreationTimestamp="2026-03-20 13:30:02 +0000 UTC" firstStartedPulling="2026-03-20 13:30:03.115289446 +0000 UTC m=+523.858959190" lastFinishedPulling="2026-03-20 13:30:08.572501129 +0000 UTC m=+529.316170873" observedRunningTime="2026-03-20 13:30:09.669676903 +0000 UTC m=+530.413346647" watchObservedRunningTime="2026-03-20 13:30:09.67470745 +0000 UTC m=+530.418377194"
Mar 20 13:30:10 crc kubenswrapper[4973]: I0320 13:30:10.630190 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5f8fb459cf-n7bzv" event={"ID":"a8d216d7-808c-4312-a314-729e369ee963","Type":"ContainerStarted","Data":"f42bb6b77784ebaeacaa84c45ea17cb403b956ca0af73947758de5a81a12666c"}
Mar 20 13:30:10 crc kubenswrapper[4973]: I0320 13:30:10.630991 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-5f8fb459cf-n7bzv"
Mar 20 13:30:10 crc kubenswrapper[4973]: I0320 13:30:10.633304 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" event={"ID":"57c75085-cf04-49d6-8b97-902e00c0efd5","Type":"ContainerStarted","Data":"da6003725dac38353c12d436c1cdfb7e598b608c8da0d7de51706cb392e36cf6"}
Mar 20 13:30:10 crc kubenswrapper[4973]: I0320 13:30:10.636011 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5f8fb459cf-n7bzv"
Mar 20 13:30:10 crc kubenswrapper[4973]: I0320 13:30:10.647957 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5f8fb459cf-n7bzv" podStartSLOduration=3.360364584 podStartE2EDuration="5.647940302s" podCreationTimestamp="2026-03-20 13:30:05 +0000 UTC" firstStartedPulling="2026-03-20 13:30:07.557599247 +0000 UTC m=+528.301268991" lastFinishedPulling="2026-03-20 13:30:09.845174965 +0000 UTC m=+530.588844709" observedRunningTime="2026-03-20 13:30:10.645550187 +0000 UTC m=+531.389219951" watchObservedRunningTime="2026-03-20 13:30:10.647940302 +0000 UTC m=+531.391610046"
Mar 20 13:30:10 crc kubenswrapper[4973]: I0320 13:30:10.686897 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" podStartSLOduration=3.404820531 podStartE2EDuration="5.686883178s" podCreationTimestamp="2026-03-20 13:30:05 +0000 UTC" firstStartedPulling="2026-03-20 13:30:07.558360478 +0000 UTC m=+528.302030212" lastFinishedPulling="2026-03-20 13:30:09.840423115 +0000 UTC m=+530.584092859" observedRunningTime="2026-03-20 13:30:10.682612841 +0000 UTC m=+531.426282605" watchObservedRunningTime="2026-03-20 13:30:10.686883178 +0000 UTC m=+531.430552922"
Mar 20 13:30:12 crc kubenswrapper[4973]: I0320 13:30:12.613285 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8"
Mar 20 13:30:12 crc kubenswrapper[4973]: I0320 13:30:12.674395 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1623a587-c5f5-49eb-bc1d-960e4a0faf81","Type":"ContainerStarted","Data":"786deca399054864c1d15f0af1aeef90509e1344ad113f4a4c08a5f264604adf"}
Mar 20 13:30:12 crc kubenswrapper[4973]: I0320 13:30:12.674453 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1623a587-c5f5-49eb-bc1d-960e4a0faf81","Type":"ContainerStarted","Data":"47bae9c2f05331daa1c9947374b909fce7bf327ffd26cb6e24110f8a70727f97"}
Mar 20 13:30:12 crc kubenswrapper[4973]: I0320 13:30:12.674463 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1623a587-c5f5-49eb-bc1d-960e4a0faf81","Type":"ContainerStarted","Data":"4a396b3f1ef15fa08de390a515dbeceba1b8c0b6380c12372ec24bd461712f2e"}
Mar 20 13:30:13 crc kubenswrapper[4973]: I0320 13:30:13.696453 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1623a587-c5f5-49eb-bc1d-960e4a0faf81","Type":"ContainerStarted","Data":"bb88aaa550058dfbd9fa71f9dbbf7ed6d39ccf94933e223202365f41c11322a7"}
Mar 20 13:30:13 crc kubenswrapper[4973]: I0320 13:30:13.696803 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1623a587-c5f5-49eb-bc1d-960e4a0faf81","Type":"ContainerStarted","Data":"044fb19194f1b73320d4f7a17899399fc57ff76e642f77d3998e98226c39c775"}
Mar 20 13:30:13 crc kubenswrapper[4973]: I0320 13:30:13.696819 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1623a587-c5f5-49eb-bc1d-960e4a0faf81","Type":"ContainerStarted","Data":"f9053b269024ae327f11c530693e767ec91cc360aa662b5490f7755b04d92745"}
Mar 20 13:30:15 crc kubenswrapper[4973]: I0320 13:30:15.257364 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7bcc6b4fc8-spfpg"
Mar 20 13:30:15 crc kubenswrapper[4973]: I0320 13:30:15.258564 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7bcc6b4fc8-spfpg"
Mar 20 13:30:15 crc kubenswrapper[4973]: I0320 13:30:15.265214 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started"
pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:15 crc kubenswrapper[4973]: I0320 13:30:15.287240 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=5.762860841 podStartE2EDuration="9.287220243s" podCreationTimestamp="2026-03-20 13:30:06 +0000 UTC" firstStartedPulling="2026-03-20 13:30:08.6036164 +0000 UTC m=+529.347286144" lastFinishedPulling="2026-03-20 13:30:12.127975802 +0000 UTC m=+532.871645546" observedRunningTime="2026-03-20 13:30:13.734030332 +0000 UTC m=+534.477700096" watchObservedRunningTime="2026-03-20 13:30:15.287220243 +0000 UTC m=+536.030889987" Mar 20 13:30:15 crc kubenswrapper[4973]: I0320 13:30:15.713484 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7bcc6b4fc8-spfpg" Mar 20 13:30:15 crc kubenswrapper[4973]: I0320 13:30:15.821493 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-k7krj"] Mar 20 13:30:16 crc kubenswrapper[4973]: I0320 13:30:16.738538 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 20 13:30:25 crc kubenswrapper[4973]: I0320 13:30:25.845070 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:25 crc kubenswrapper[4973]: I0320 13:30:25.845535 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:40 crc kubenswrapper[4973]: I0320 13:30:40.861547 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-k7krj" podUID="de8d912e-7616-42ee-a688-b43d5b85dc44" containerName="console" containerID="cri-o://9177ef8fd6fdd2756e298bcec7d63f4104099f4e0bb820eda29fc40835fc0d91" gracePeriod=15 Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 
13:30:41.238100 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-k7krj_de8d912e-7616-42ee-a688-b43d5b85dc44/console/0.log" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.238426 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.357280 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-trusted-ca-bundle\") pod \"de8d912e-7616-42ee-a688-b43d5b85dc44\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.357403 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wvh9\" (UniqueName: \"kubernetes.io/projected/de8d912e-7616-42ee-a688-b43d5b85dc44-kube-api-access-4wvh9\") pod \"de8d912e-7616-42ee-a688-b43d5b85dc44\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.357446 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-service-ca\") pod \"de8d912e-7616-42ee-a688-b43d5b85dc44\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.357478 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de8d912e-7616-42ee-a688-b43d5b85dc44-console-serving-cert\") pod \"de8d912e-7616-42ee-a688-b43d5b85dc44\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.357496 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/de8d912e-7616-42ee-a688-b43d5b85dc44-console-oauth-config\") pod \"de8d912e-7616-42ee-a688-b43d5b85dc44\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.357518 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-console-config\") pod \"de8d912e-7616-42ee-a688-b43d5b85dc44\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.357536 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-oauth-serving-cert\") pod \"de8d912e-7616-42ee-a688-b43d5b85dc44\" (UID: \"de8d912e-7616-42ee-a688-b43d5b85dc44\") " Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.358403 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "de8d912e-7616-42ee-a688-b43d5b85dc44" (UID: "de8d912e-7616-42ee-a688-b43d5b85dc44"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.358411 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-console-config" (OuterVolumeSpecName: "console-config") pod "de8d912e-7616-42ee-a688-b43d5b85dc44" (UID: "de8d912e-7616-42ee-a688-b43d5b85dc44"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.358474 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "de8d912e-7616-42ee-a688-b43d5b85dc44" (UID: "de8d912e-7616-42ee-a688-b43d5b85dc44"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.358495 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-service-ca" (OuterVolumeSpecName: "service-ca") pod "de8d912e-7616-42ee-a688-b43d5b85dc44" (UID: "de8d912e-7616-42ee-a688-b43d5b85dc44"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.363255 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8d912e-7616-42ee-a688-b43d5b85dc44-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "de8d912e-7616-42ee-a688-b43d5b85dc44" (UID: "de8d912e-7616-42ee-a688-b43d5b85dc44"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.363915 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de8d912e-7616-42ee-a688-b43d5b85dc44-kube-api-access-4wvh9" (OuterVolumeSpecName: "kube-api-access-4wvh9") pod "de8d912e-7616-42ee-a688-b43d5b85dc44" (UID: "de8d912e-7616-42ee-a688-b43d5b85dc44"). InnerVolumeSpecName "kube-api-access-4wvh9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.364595 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de8d912e-7616-42ee-a688-b43d5b85dc44-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "de8d912e-7616-42ee-a688-b43d5b85dc44" (UID: "de8d912e-7616-42ee-a688-b43d5b85dc44"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.459056 4973 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.459100 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wvh9\" (UniqueName: \"kubernetes.io/projected/de8d912e-7616-42ee-a688-b43d5b85dc44-kube-api-access-4wvh9\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.459123 4973 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.459132 4973 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/de8d912e-7616-42ee-a688-b43d5b85dc44-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.459155 4973 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/de8d912e-7616-42ee-a688-b43d5b85dc44-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.459163 4973 reconciler_common.go:293] "Volume detached for 
volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.459182 4973 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/de8d912e-7616-42ee-a688-b43d5b85dc44-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.912460 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-k7krj_de8d912e-7616-42ee-a688-b43d5b85dc44/console/0.log" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.913549 4973 generic.go:334] "Generic (PLEG): container finished" podID="de8d912e-7616-42ee-a688-b43d5b85dc44" containerID="9177ef8fd6fdd2756e298bcec7d63f4104099f4e0bb820eda29fc40835fc0d91" exitCode=2 Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.913607 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-k7krj" event={"ID":"de8d912e-7616-42ee-a688-b43d5b85dc44","Type":"ContainerDied","Data":"9177ef8fd6fdd2756e298bcec7d63f4104099f4e0bb820eda29fc40835fc0d91"} Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.913646 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-k7krj" event={"ID":"de8d912e-7616-42ee-a688-b43d5b85dc44","Type":"ContainerDied","Data":"6909a8e8efa8aea6b3c5239315309179c8a21ed6be0e5c3ed0c16fa23a7e9cd3"} Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.913666 4973 scope.go:117] "RemoveContainer" containerID="9177ef8fd6fdd2756e298bcec7d63f4104099f4e0bb820eda29fc40835fc0d91" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.913587 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-k7krj" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.930749 4973 scope.go:117] "RemoveContainer" containerID="9177ef8fd6fdd2756e298bcec7d63f4104099f4e0bb820eda29fc40835fc0d91" Mar 20 13:30:41 crc kubenswrapper[4973]: E0320 13:30:41.931118 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9177ef8fd6fdd2756e298bcec7d63f4104099f4e0bb820eda29fc40835fc0d91\": container with ID starting with 9177ef8fd6fdd2756e298bcec7d63f4104099f4e0bb820eda29fc40835fc0d91 not found: ID does not exist" containerID="9177ef8fd6fdd2756e298bcec7d63f4104099f4e0bb820eda29fc40835fc0d91" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.931148 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9177ef8fd6fdd2756e298bcec7d63f4104099f4e0bb820eda29fc40835fc0d91"} err="failed to get container status \"9177ef8fd6fdd2756e298bcec7d63f4104099f4e0bb820eda29fc40835fc0d91\": rpc error: code = NotFound desc = could not find container \"9177ef8fd6fdd2756e298bcec7d63f4104099f4e0bb820eda29fc40835fc0d91\": container with ID starting with 9177ef8fd6fdd2756e298bcec7d63f4104099f4e0bb820eda29fc40835fc0d91 not found: ID does not exist" Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.953201 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-k7krj"] Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.957756 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-k7krj"] Mar 20 13:30:41 crc kubenswrapper[4973]: I0320 13:30:41.965889 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de8d912e-7616-42ee-a688-b43d5b85dc44" path="/var/lib/kubelet/pods/de8d912e-7616-42ee-a688-b43d5b85dc44/volumes" Mar 20 13:30:45 crc kubenswrapper[4973]: I0320 13:30:45.853291 4973 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:30:45 crc kubenswrapper[4973]: I0320 13:30:45.860407 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 13:31:06 crc kubenswrapper[4973]: I0320 13:31:06.739496 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 20 13:31:06 crc kubenswrapper[4973]: I0320 13:31:06.782429 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 20 13:31:07 crc kubenswrapper[4973]: I0320 13:31:07.122858 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.479805 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-69866dbfb5-64wk8"] Mar 20 13:31:31 crc kubenswrapper[4973]: E0320 13:31:31.480625 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de8d912e-7616-42ee-a688-b43d5b85dc44" containerName="console" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.480640 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="de8d912e-7616-42ee-a688-b43d5b85dc44" containerName="console" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.480799 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="de8d912e-7616-42ee-a688-b43d5b85dc44" containerName="console" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.481286 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.489926 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69866dbfb5-64wk8"] Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.601614 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-oauth-serving-cert\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.601681 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-trusted-ca-bundle\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.601698 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a13fbd0b-b630-41bc-b997-4aebc4cac884-console-oauth-config\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.601719 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13fbd0b-b630-41bc-b997-4aebc4cac884-console-serving-cert\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.601740 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-console-config\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.601992 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-service-ca\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.602097 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxvz8\" (UniqueName: \"kubernetes.io/projected/a13fbd0b-b630-41bc-b997-4aebc4cac884-kube-api-access-zxvz8\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.703392 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-oauth-serving-cert\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.703453 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-trusted-ca-bundle\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.703471 4973 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a13fbd0b-b630-41bc-b997-4aebc4cac884-console-oauth-config\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.703493 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13fbd0b-b630-41bc-b997-4aebc4cac884-console-serving-cert\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.703518 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-console-config\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.703552 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-service-ca\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.703571 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxvz8\" (UniqueName: \"kubernetes.io/projected/a13fbd0b-b630-41bc-b997-4aebc4cac884-kube-api-access-zxvz8\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.704534 4973 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-console-config\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.704534 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-service-ca\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.704780 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-trusted-ca-bundle\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.705373 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-oauth-serving-cert\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.708860 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13fbd0b-b630-41bc-b997-4aebc4cac884-console-serving-cert\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.708928 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/a13fbd0b-b630-41bc-b997-4aebc4cac884-console-oauth-config\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.718514 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxvz8\" (UniqueName: \"kubernetes.io/projected/a13fbd0b-b630-41bc-b997-4aebc4cac884-kube-api-access-zxvz8\") pod \"console-69866dbfb5-64wk8\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:31 crc kubenswrapper[4973]: I0320 13:31:31.811172 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:31:32 crc kubenswrapper[4973]: I0320 13:31:32.231455 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69866dbfb5-64wk8"] Mar 20 13:31:32 crc kubenswrapper[4973]: I0320 13:31:32.272300 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69866dbfb5-64wk8" event={"ID":"a13fbd0b-b630-41bc-b997-4aebc4cac884","Type":"ContainerStarted","Data":"ffa9ce6e8bd053c3d7516143434ac206ede2b765935ea7d84fedb33b289c7220"} Mar 20 13:31:33 crc kubenswrapper[4973]: I0320 13:31:33.283083 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69866dbfb5-64wk8" event={"ID":"a13fbd0b-b630-41bc-b997-4aebc4cac884","Type":"ContainerStarted","Data":"3bb4d40594ddbc7ad4929b7b7fd18abcef7903fc40cb1aa527e342afd5e9e67a"} Mar 20 13:31:33 crc kubenswrapper[4973]: I0320 13:31:33.307624 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69866dbfb5-64wk8" podStartSLOduration=2.307606577 podStartE2EDuration="2.307606577s" podCreationTimestamp="2026-03-20 13:31:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:31:33.307288238 +0000 UTC m=+614.050958022" watchObservedRunningTime="2026-03-20 13:31:33.307606577 +0000 UTC m=+614.051276321"
Mar 20 13:31:35 crc kubenswrapper[4973]: I0320 13:31:35.472029 4973 scope.go:117] "RemoveContainer" containerID="f9fb4a8af7ca9f72f596503f7883fe894d2fee99d6a7d459cea017d98b13cf70"
Mar 20 13:31:41 crc kubenswrapper[4973]: I0320 13:31:41.811864 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-69866dbfb5-64wk8"
Mar 20 13:31:41 crc kubenswrapper[4973]: I0320 13:31:41.812297 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69866dbfb5-64wk8"
Mar 20 13:31:41 crc kubenswrapper[4973]: I0320 13:31:41.819485 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69866dbfb5-64wk8"
Mar 20 13:31:42 crc kubenswrapper[4973]: I0320 13:31:42.352970 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69866dbfb5-64wk8"
Mar 20 13:31:42 crc kubenswrapper[4973]: I0320 13:31:42.483191 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bcc6b4fc8-spfpg"]
Mar 20 13:31:43 crc kubenswrapper[4973]: I0320 13:31:43.321001 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:31:43 crc kubenswrapper[4973]: I0320 13:31:43.321408 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:32:00 crc kubenswrapper[4973]: I0320 13:32:00.160462 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566892-xxx6r"]
Mar 20 13:32:00 crc kubenswrapper[4973]: I0320 13:32:00.162041 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-xxx6r"
Mar 20 13:32:00 crc kubenswrapper[4973]: I0320 13:32:00.167551 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 13:32:00 crc kubenswrapper[4973]: I0320 13:32:00.168559 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc"
Mar 20 13:32:00 crc kubenswrapper[4973]: I0320 13:32:00.168841 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 13:32:00 crc kubenswrapper[4973]: I0320 13:32:00.179969 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-xxx6r"]
Mar 20 13:32:00 crc kubenswrapper[4973]: I0320 13:32:00.249774 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk9pk\" (UniqueName: \"kubernetes.io/projected/51178de5-3e29-40cf-b4c4-05dcbed6ce8c-kube-api-access-hk9pk\") pod \"auto-csr-approver-29566892-xxx6r\" (UID: \"51178de5-3e29-40cf-b4c4-05dcbed6ce8c\") " pod="openshift-infra/auto-csr-approver-29566892-xxx6r"
Mar 20 13:32:00 crc kubenswrapper[4973]: I0320 13:32:00.352608 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk9pk\" (UniqueName: \"kubernetes.io/projected/51178de5-3e29-40cf-b4c4-05dcbed6ce8c-kube-api-access-hk9pk\") pod \"auto-csr-approver-29566892-xxx6r\" (UID: \"51178de5-3e29-40cf-b4c4-05dcbed6ce8c\") " pod="openshift-infra/auto-csr-approver-29566892-xxx6r"
Mar 20 13:32:00 crc kubenswrapper[4973]: I0320 13:32:00.377039 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk9pk\" (UniqueName: \"kubernetes.io/projected/51178de5-3e29-40cf-b4c4-05dcbed6ce8c-kube-api-access-hk9pk\") pod \"auto-csr-approver-29566892-xxx6r\" (UID: \"51178de5-3e29-40cf-b4c4-05dcbed6ce8c\") " pod="openshift-infra/auto-csr-approver-29566892-xxx6r"
Mar 20 13:32:00 crc kubenswrapper[4973]: I0320 13:32:00.496206 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-xxx6r"
Mar 20 13:32:00 crc kubenswrapper[4973]: I0320 13:32:00.752200 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-xxx6r"]
Mar 20 13:32:01 crc kubenswrapper[4973]: I0320 13:32:01.492205 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566892-xxx6r" event={"ID":"51178de5-3e29-40cf-b4c4-05dcbed6ce8c","Type":"ContainerStarted","Data":"ae64fc506fdea0423f57d274f05a3e8c4e968a6e0b9852fe9b4cc68ab69db97e"}
Mar 20 13:32:02 crc kubenswrapper[4973]: I0320 13:32:02.503792 4973 generic.go:334] "Generic (PLEG): container finished" podID="51178de5-3e29-40cf-b4c4-05dcbed6ce8c" containerID="39fad7f5278031b5a1dfe03e2753d65d0f252c680e396f68ba1175f849a60bdd" exitCode=0
Mar 20 13:32:02 crc kubenswrapper[4973]: I0320 13:32:02.503888 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566892-xxx6r" event={"ID":"51178de5-3e29-40cf-b4c4-05dcbed6ce8c","Type":"ContainerDied","Data":"39fad7f5278031b5a1dfe03e2753d65d0f252c680e396f68ba1175f849a60bdd"}
Mar 20 13:32:03 crc kubenswrapper[4973]: I0320 13:32:03.736212 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-xxx6r"
Mar 20 13:32:03 crc kubenswrapper[4973]: I0320 13:32:03.841619 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk9pk\" (UniqueName: \"kubernetes.io/projected/51178de5-3e29-40cf-b4c4-05dcbed6ce8c-kube-api-access-hk9pk\") pod \"51178de5-3e29-40cf-b4c4-05dcbed6ce8c\" (UID: \"51178de5-3e29-40cf-b4c4-05dcbed6ce8c\") "
Mar 20 13:32:03 crc kubenswrapper[4973]: I0320 13:32:03.848696 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51178de5-3e29-40cf-b4c4-05dcbed6ce8c-kube-api-access-hk9pk" (OuterVolumeSpecName: "kube-api-access-hk9pk") pod "51178de5-3e29-40cf-b4c4-05dcbed6ce8c" (UID: "51178de5-3e29-40cf-b4c4-05dcbed6ce8c"). InnerVolumeSpecName "kube-api-access-hk9pk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:32:03 crc kubenswrapper[4973]: I0320 13:32:03.944708 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk9pk\" (UniqueName: \"kubernetes.io/projected/51178de5-3e29-40cf-b4c4-05dcbed6ce8c-kube-api-access-hk9pk\") on node \"crc\" DevicePath \"\""
Mar 20 13:32:04 crc kubenswrapper[4973]: I0320 13:32:04.516845 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566892-xxx6r" event={"ID":"51178de5-3e29-40cf-b4c4-05dcbed6ce8c","Type":"ContainerDied","Data":"ae64fc506fdea0423f57d274f05a3e8c4e968a6e0b9852fe9b4cc68ab69db97e"}
Mar 20 13:32:04 crc kubenswrapper[4973]: I0320 13:32:04.516889 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae64fc506fdea0423f57d274f05a3e8c4e968a6e0b9852fe9b4cc68ab69db97e"
Mar 20 13:32:04 crc kubenswrapper[4973]: I0320 13:32:04.516958 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566892-xxx6r"
Mar 20 13:32:04 crc kubenswrapper[4973]: I0320 13:32:04.796634 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566886-pm77h"]
Mar 20 13:32:04 crc kubenswrapper[4973]: I0320 13:32:04.837241 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566886-pm77h"]
Mar 20 13:32:05 crc kubenswrapper[4973]: I0320 13:32:05.966476 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f10e2a0a-df92-4a54-98e0-382851137211" path="/var/lib/kubelet/pods/f10e2a0a-df92-4a54-98e0-382851137211/volumes"
Mar 20 13:32:07 crc kubenswrapper[4973]: I0320 13:32:07.534567 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7bcc6b4fc8-spfpg" podUID="65cfdf70-b0b0-497e-9889-a04aaea42ac0" containerName="console" containerID="cri-o://efc02883c7f60ad1f4cc8e67cc2c8ca69a7b02bff7eccc98773f8784deaf8957" gracePeriod=15
Mar 20 13:32:07 crc kubenswrapper[4973]: I0320 13:32:07.981354 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bcc6b4fc8-spfpg_65cfdf70-b0b0-497e-9889-a04aaea42ac0/console/0.log"
Mar 20 13:32:07 crc kubenswrapper[4973]: I0320 13:32:07.981411 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bcc6b4fc8-spfpg"
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.107966 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-service-ca\") pod \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") "
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.108193 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65cfdf70-b0b0-497e-9889-a04aaea42ac0-console-serving-cert\") pod \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") "
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.108292 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-trusted-ca-bundle\") pod \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") "
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.108402 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwgdh\" (UniqueName: \"kubernetes.io/projected/65cfdf70-b0b0-497e-9889-a04aaea42ac0-kube-api-access-jwgdh\") pod \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") "
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.108540 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-oauth-serving-cert\") pod \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") "
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.108664 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-console-config\") pod \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") "
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.108839 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65cfdf70-b0b0-497e-9889-a04aaea42ac0-console-oauth-config\") pod \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\" (UID: \"65cfdf70-b0b0-497e-9889-a04aaea42ac0\") "
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.109079 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-service-ca" (OuterVolumeSpecName: "service-ca") pod "65cfdf70-b0b0-497e-9889-a04aaea42ac0" (UID: "65cfdf70-b0b0-497e-9889-a04aaea42ac0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.109105 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "65cfdf70-b0b0-497e-9889-a04aaea42ac0" (UID: "65cfdf70-b0b0-497e-9889-a04aaea42ac0"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.109096 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-console-config" (OuterVolumeSpecName: "console-config") pod "65cfdf70-b0b0-497e-9889-a04aaea42ac0" (UID: "65cfdf70-b0b0-497e-9889-a04aaea42ac0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.109166 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "65cfdf70-b0b0-497e-9889-a04aaea42ac0" (UID: "65cfdf70-b0b0-497e-9889-a04aaea42ac0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.109366 4973 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-service-ca\") on node \"crc\" DevicePath \"\""
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.109449 4973 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-trusted-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.109508 4973 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-oauth-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.109572 4973 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/65cfdf70-b0b0-497e-9889-a04aaea42ac0-console-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.114123 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65cfdf70-b0b0-497e-9889-a04aaea42ac0-kube-api-access-jwgdh" (OuterVolumeSpecName: "kube-api-access-jwgdh") pod "65cfdf70-b0b0-497e-9889-a04aaea42ac0" (UID: "65cfdf70-b0b0-497e-9889-a04aaea42ac0"). InnerVolumeSpecName "kube-api-access-jwgdh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.115023 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65cfdf70-b0b0-497e-9889-a04aaea42ac0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "65cfdf70-b0b0-497e-9889-a04aaea42ac0" (UID: "65cfdf70-b0b0-497e-9889-a04aaea42ac0"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.115481 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65cfdf70-b0b0-497e-9889-a04aaea42ac0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "65cfdf70-b0b0-497e-9889-a04aaea42ac0" (UID: "65cfdf70-b0b0-497e-9889-a04aaea42ac0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.211283 4973 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/65cfdf70-b0b0-497e-9889-a04aaea42ac0-console-oauth-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.211325 4973 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/65cfdf70-b0b0-497e-9889-a04aaea42ac0-console-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.211358 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jwgdh\" (UniqueName: \"kubernetes.io/projected/65cfdf70-b0b0-497e-9889-a04aaea42ac0-kube-api-access-jwgdh\") on node \"crc\" DevicePath \"\""
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.547660 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7bcc6b4fc8-spfpg_65cfdf70-b0b0-497e-9889-a04aaea42ac0/console/0.log"
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.547743 4973 generic.go:334] "Generic (PLEG): container finished" podID="65cfdf70-b0b0-497e-9889-a04aaea42ac0" containerID="efc02883c7f60ad1f4cc8e67cc2c8ca69a7b02bff7eccc98773f8784deaf8957" exitCode=2
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.547785 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bcc6b4fc8-spfpg" event={"ID":"65cfdf70-b0b0-497e-9889-a04aaea42ac0","Type":"ContainerDied","Data":"efc02883c7f60ad1f4cc8e67cc2c8ca69a7b02bff7eccc98773f8784deaf8957"}
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.547821 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7bcc6b4fc8-spfpg" event={"ID":"65cfdf70-b0b0-497e-9889-a04aaea42ac0","Type":"ContainerDied","Data":"fa35ca3598dbfad6b2a86cb3347cb55e43447399d7674bb39f055f3a53135bf0"}
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.547840 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7bcc6b4fc8-spfpg"
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.547851 4973 scope.go:117] "RemoveContainer" containerID="efc02883c7f60ad1f4cc8e67cc2c8ca69a7b02bff7eccc98773f8784deaf8957"
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.572900 4973 scope.go:117] "RemoveContainer" containerID="efc02883c7f60ad1f4cc8e67cc2c8ca69a7b02bff7eccc98773f8784deaf8957"
Mar 20 13:32:08 crc kubenswrapper[4973]: E0320 13:32:08.573645 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efc02883c7f60ad1f4cc8e67cc2c8ca69a7b02bff7eccc98773f8784deaf8957\": container with ID starting with efc02883c7f60ad1f4cc8e67cc2c8ca69a7b02bff7eccc98773f8784deaf8957 not found: ID does not exist" containerID="efc02883c7f60ad1f4cc8e67cc2c8ca69a7b02bff7eccc98773f8784deaf8957"
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.573714 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efc02883c7f60ad1f4cc8e67cc2c8ca69a7b02bff7eccc98773f8784deaf8957"} err="failed to get container status \"efc02883c7f60ad1f4cc8e67cc2c8ca69a7b02bff7eccc98773f8784deaf8957\": rpc error: code = NotFound desc = could not find container \"efc02883c7f60ad1f4cc8e67cc2c8ca69a7b02bff7eccc98773f8784deaf8957\": container with ID starting with efc02883c7f60ad1f4cc8e67cc2c8ca69a7b02bff7eccc98773f8784deaf8957 not found: ID does not exist"
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.599103 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7bcc6b4fc8-spfpg"]
Mar 20 13:32:08 crc kubenswrapper[4973]: I0320 13:32:08.604104 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7bcc6b4fc8-spfpg"]
Mar 20 13:32:09 crc kubenswrapper[4973]: I0320 13:32:09.974091 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65cfdf70-b0b0-497e-9889-a04aaea42ac0" path="/var/lib/kubelet/pods/65cfdf70-b0b0-497e-9889-a04aaea42ac0/volumes"
Mar 20 13:32:13 crc kubenswrapper[4973]: I0320 13:32:13.320549 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:32:13 crc kubenswrapper[4973]: I0320 13:32:13.320974 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:32:35 crc kubenswrapper[4973]: I0320 13:32:35.535148 4973 scope.go:117] "RemoveContainer" containerID="3606634484df059d227d92640e427b3c4266d58ddb2cbfdc8486686661fd98d5"
Mar 20 13:32:35 crc kubenswrapper[4973]: I0320 13:32:35.582985 4973 scope.go:117] "RemoveContainer" containerID="a23ae5a45cdbb57630353dfdb5b0383f7b30e8846fcdc33d5d52c131173a7423"
Mar 20 13:32:35 crc kubenswrapper[4973]: I0320 13:32:35.600673 4973 scope.go:117] "RemoveContainer" containerID="3be77d6c72034a6d01a1e20fc2da07144d4311518686935282dbae1c4fada22b"
Mar 20 13:32:35 crc kubenswrapper[4973]: I0320 13:32:35.617849 4973 scope.go:117] "RemoveContainer" containerID="18d92bd3a9ca7915dd4ae5d8f43927e1ae8256cc06c76bd242c200dfe3b90b44"
Mar 20 13:32:35 crc kubenswrapper[4973]: I0320 13:32:35.644571 4973 scope.go:117] "RemoveContainer" containerID="96bf1ef5249aa5946a08794ce5306ea3474c4aeeb813e65c813085230c6afea9"
Mar 20 13:32:43 crc kubenswrapper[4973]: I0320 13:32:43.321080 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:32:43 crc kubenswrapper[4973]: I0320 13:32:43.321886 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:32:43 crc kubenswrapper[4973]: I0320 13:32:43.321947 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx"
Mar 20 13:32:43 crc kubenswrapper[4973]: I0320 13:32:43.322695 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae703893e0e60a1cb59d74cc5e33372631d6542f4ae1b15e831e0084a728c10a"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 13:32:43 crc kubenswrapper[4973]: I0320 13:32:43.322797 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" containerID="cri-o://ae703893e0e60a1cb59d74cc5e33372631d6542f4ae1b15e831e0084a728c10a" gracePeriod=600
Mar 20 13:32:43 crc kubenswrapper[4973]: I0320 13:32:43.821209 4973 generic.go:334] "Generic (PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="ae703893e0e60a1cb59d74cc5e33372631d6542f4ae1b15e831e0084a728c10a" exitCode=0
Mar 20 13:32:43 crc kubenswrapper[4973]: I0320 13:32:43.821286 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"ae703893e0e60a1cb59d74cc5e33372631d6542f4ae1b15e831e0084a728c10a"}
Mar 20 13:32:43 crc kubenswrapper[4973]: I0320 13:32:43.821837 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"26d2f8d0ba44652122f03dbe7cb2777fe726b59947a9f579b0a01f84b56a0f0a"}
Mar 20 13:32:43 crc kubenswrapper[4973]: I0320 13:32:43.821922 4973 scope.go:117] "RemoveContainer" containerID="52d5fd2368a231dd70e0a2d1cd4e97da6e1bd0ca50431edd3ab2ddcc1bd88dec"
Mar 20 13:34:00 crc kubenswrapper[4973]: I0320 13:34:00.143109 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566894-d67xc"]
Mar 20 13:34:00 crc kubenswrapper[4973]: E0320 13:34:00.146243 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65cfdf70-b0b0-497e-9889-a04aaea42ac0" containerName="console"
Mar 20 13:34:00 crc kubenswrapper[4973]: I0320 13:34:00.146261 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="65cfdf70-b0b0-497e-9889-a04aaea42ac0" containerName="console"
Mar 20 13:34:00 crc kubenswrapper[4973]: E0320 13:34:00.146279 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51178de5-3e29-40cf-b4c4-05dcbed6ce8c" containerName="oc"
Mar 20 13:34:00 crc kubenswrapper[4973]: I0320 13:34:00.146286 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="51178de5-3e29-40cf-b4c4-05dcbed6ce8c" containerName="oc"
Mar 20 13:34:00 crc kubenswrapper[4973]: I0320 13:34:00.146428 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="51178de5-3e29-40cf-b4c4-05dcbed6ce8c" containerName="oc"
Mar 20 13:34:00 crc kubenswrapper[4973]: I0320 13:34:00.146443 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="65cfdf70-b0b0-497e-9889-a04aaea42ac0" containerName="console"
Mar 20 13:34:00 crc kubenswrapper[4973]: I0320 13:34:00.146963 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-d67xc"
Mar 20 13:34:00 crc kubenswrapper[4973]: I0320 13:34:00.150075 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 13:34:00 crc kubenswrapper[4973]: I0320 13:34:00.150660 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 13:34:00 crc kubenswrapper[4973]: I0320 13:34:00.150840 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc"
Mar 20 13:34:00 crc kubenswrapper[4973]: I0320 13:34:00.151871 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-d67xc"]
Mar 20 13:34:00 crc kubenswrapper[4973]: I0320 13:34:00.175254 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjlpz\" (UniqueName: \"kubernetes.io/projected/a7634907-5a5c-4483-be73-3b057ec837ad-kube-api-access-wjlpz\") pod \"auto-csr-approver-29566894-d67xc\" (UID: \"a7634907-5a5c-4483-be73-3b057ec837ad\") " pod="openshift-infra/auto-csr-approver-29566894-d67xc"
Mar 20 13:34:00 crc kubenswrapper[4973]: I0320 13:34:00.276255 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjlpz\" (UniqueName: \"kubernetes.io/projected/a7634907-5a5c-4483-be73-3b057ec837ad-kube-api-access-wjlpz\") pod \"auto-csr-approver-29566894-d67xc\" (UID: \"a7634907-5a5c-4483-be73-3b057ec837ad\") " pod="openshift-infra/auto-csr-approver-29566894-d67xc"
Mar 20 13:34:00 crc kubenswrapper[4973]: I0320 13:34:00.296110 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjlpz\" (UniqueName: \"kubernetes.io/projected/a7634907-5a5c-4483-be73-3b057ec837ad-kube-api-access-wjlpz\") pod \"auto-csr-approver-29566894-d67xc\" (UID: \"a7634907-5a5c-4483-be73-3b057ec837ad\") " pod="openshift-infra/auto-csr-approver-29566894-d67xc"
Mar 20 13:34:00 crc kubenswrapper[4973]: I0320 13:34:00.492519 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-d67xc"
Mar 20 13:34:00 crc kubenswrapper[4973]: I0320 13:34:00.912839 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-d67xc"]
Mar 20 13:34:01 crc kubenswrapper[4973]: I0320 13:34:01.364650 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-d67xc" event={"ID":"a7634907-5a5c-4483-be73-3b057ec837ad","Type":"ContainerStarted","Data":"6cf32b2adc94d63bd133208a1b493c123642fcb3370a8f08703fb509e19dfeac"}
Mar 20 13:34:03 crc kubenswrapper[4973]: I0320 13:34:03.383231 4973 generic.go:334] "Generic (PLEG): container finished" podID="a7634907-5a5c-4483-be73-3b057ec837ad" containerID="db323a7526b36b7015171f6383383884f9410ee0c819b4f4fde67e961295e664" exitCode=0
Mar 20 13:34:03 crc kubenswrapper[4973]: I0320 13:34:03.383310 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-d67xc" event={"ID":"a7634907-5a5c-4483-be73-3b057ec837ad","Type":"ContainerDied","Data":"db323a7526b36b7015171f6383383884f9410ee0c819b4f4fde67e961295e664"}
Mar 20 13:34:04 crc kubenswrapper[4973]: I0320 13:34:04.647558 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-d67xc"
Mar 20 13:34:04 crc kubenswrapper[4973]: I0320 13:34:04.842243 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wjlpz\" (UniqueName: \"kubernetes.io/projected/a7634907-5a5c-4483-be73-3b057ec837ad-kube-api-access-wjlpz\") pod \"a7634907-5a5c-4483-be73-3b057ec837ad\" (UID: \"a7634907-5a5c-4483-be73-3b057ec837ad\") "
Mar 20 13:34:04 crc kubenswrapper[4973]: I0320 13:34:04.848150 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7634907-5a5c-4483-be73-3b057ec837ad-kube-api-access-wjlpz" (OuterVolumeSpecName: "kube-api-access-wjlpz") pod "a7634907-5a5c-4483-be73-3b057ec837ad" (UID: "a7634907-5a5c-4483-be73-3b057ec837ad"). InnerVolumeSpecName "kube-api-access-wjlpz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:34:04 crc kubenswrapper[4973]: I0320 13:34:04.943852 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wjlpz\" (UniqueName: \"kubernetes.io/projected/a7634907-5a5c-4483-be73-3b057ec837ad-kube-api-access-wjlpz\") on node \"crc\" DevicePath \"\""
Mar 20 13:34:05 crc kubenswrapper[4973]: I0320 13:34:05.403974 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566894-d67xc" event={"ID":"a7634907-5a5c-4483-be73-3b057ec837ad","Type":"ContainerDied","Data":"6cf32b2adc94d63bd133208a1b493c123642fcb3370a8f08703fb509e19dfeac"}
Mar 20 13:34:05 crc kubenswrapper[4973]: I0320 13:34:05.404018 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6cf32b2adc94d63bd133208a1b493c123642fcb3370a8f08703fb509e19dfeac"
Mar 20 13:34:05 crc kubenswrapper[4973]: I0320 13:34:05.404059 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566894-d67xc"
Mar 20 13:34:05 crc kubenswrapper[4973]: I0320 13:34:05.705704 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566888-z9k9w"]
Mar 20 13:34:05 crc kubenswrapper[4973]: I0320 13:34:05.710639 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566888-z9k9w"]
Mar 20 13:34:05 crc kubenswrapper[4973]: I0320 13:34:05.964944 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a10661fb-7db3-4aa7-b7f7-ffaaacb3999d" path="/var/lib/kubelet/pods/a10661fb-7db3-4aa7-b7f7-ffaaacb3999d/volumes"
Mar 20 13:34:17 crc kubenswrapper[4973]: I0320 13:34:17.468194 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7"]
Mar 20 13:34:17 crc kubenswrapper[4973]: E0320 13:34:17.468757 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7634907-5a5c-4483-be73-3b057ec837ad" containerName="oc"
Mar 20 13:34:17 crc kubenswrapper[4973]: I0320 13:34:17.468771 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7634907-5a5c-4483-be73-3b057ec837ad" containerName="oc"
Mar 20 13:34:17 crc kubenswrapper[4973]: I0320 13:34:17.468911 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7634907-5a5c-4483-be73-3b057ec837ad" containerName="oc"
Mar 20 13:34:17 crc kubenswrapper[4973]: I0320 13:34:17.469941 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7"
Mar 20 13:34:17 crc kubenswrapper[4973]: I0320 13:34:17.472460 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 20 13:34:17 crc kubenswrapper[4973]: I0320 13:34:17.488297 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7"]
Mar 20 13:34:17 crc kubenswrapper[4973]: I0320 13:34:17.559171 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7\" (UID: \"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7"
Mar 20 13:34:17 crc kubenswrapper[4973]: I0320 13:34:17.559224 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbnnz\" (UniqueName: \"kubernetes.io/projected/0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7-kube-api-access-gbnnz\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7\" (UID: \"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7"
Mar 20 13:34:17 crc kubenswrapper[4973]: I0320 13:34:17.559296 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7\" (UID: \"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7"
Mar 20 13:34:17 crc kubenswrapper[4973]: I0320 13:34:17.659695 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7\" (UID: \"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7"
Mar 20 13:34:17 crc kubenswrapper[4973]: I0320 13:34:17.659786 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7\" (UID: \"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7"
Mar 20 13:34:17 crc kubenswrapper[4973]: I0320 13:34:17.659827 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbnnz\" (UniqueName: \"kubernetes.io/projected/0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7-kube-api-access-gbnnz\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7\" (UID: \"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7"
Mar 20 13:34:17 crc kubenswrapper[4973]: I0320 13:34:17.660539 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7-bundle\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7\" (UID: \"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7"
Mar 20 13:34:17 crc kubenswrapper[4973]: I0320 13:34:17.660612 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7-util\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7\" (UID: \"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7"
Mar 20 13:34:17 crc kubenswrapper[4973]: I0320 13:34:17.687787 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbnnz\" (UniqueName: \"kubernetes.io/projected/0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7-kube-api-access-gbnnz\") pod \"93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7\" (UID: \"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7\") " pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7"
Mar 20 13:34:17 crc kubenswrapper[4973]: I0320 13:34:17.799529 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7"
Mar 20 13:34:18 crc kubenswrapper[4973]: I0320 13:34:18.073372 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7"]
Mar 20 13:34:18 crc kubenswrapper[4973]: I0320 13:34:18.501566 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7" event={"ID":"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7","Type":"ContainerStarted","Data":"8af46be636a61cb29e36b356b4ceeac3b36bf75532d4a771fb2b765916ee9a9e"}
Mar 20 13:34:18 crc kubenswrapper[4973]: I0320 13:34:18.501603 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7" event={"ID":"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7","Type":"ContainerStarted","Data":"baf18b10607062dd5f983e7d79d06fd047dbc2c284f07b4c996da0b605c63739"}
Mar 20 13:34:19 crc kubenswrapper[4973]: I0320 13:34:19.510247 4973 generic.go:334] "Generic (PLEG): container finished" podID="0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7" containerID="8af46be636a61cb29e36b356b4ceeac3b36bf75532d4a771fb2b765916ee9a9e" exitCode=0
Mar 20 13:34:19 crc kubenswrapper[4973]: I0320 13:34:19.510385 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7" event={"ID":"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7","Type":"ContainerDied","Data":"8af46be636a61cb29e36b356b4ceeac3b36bf75532d4a771fb2b765916ee9a9e"}
Mar 20 13:34:23 crc kubenswrapper[4973]: I0320 13:34:23.551272 4973 generic.go:334] "Generic (PLEG): container finished" podID="0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7" containerID="61d9f311c735ce8b47c02f8b0e5128b13e2458307702203fc3602cb689a1e7dd" exitCode=0
Mar 20 13:34:23 crc kubenswrapper[4973]: I0320 13:34:23.551419 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7" event={"ID":"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7","Type":"ContainerDied","Data":"61d9f311c735ce8b47c02f8b0e5128b13e2458307702203fc3602cb689a1e7dd"}
Mar 20 13:34:24 crc kubenswrapper[4973]: I0320 13:34:24.561524 4973 generic.go:334] "Generic (PLEG): container finished" podID="0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7" containerID="a4745a27a7fb2eb5ea578aa78bb909a0e960faf50dee7c834bf1918313980332" exitCode=0
Mar 20 13:34:24 crc kubenswrapper[4973]: I0320 13:34:24.561642 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7" event={"ID":"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7","Type":"ContainerDied","Data":"a4745a27a7fb2eb5ea578aa78bb909a0e960faf50dee7c834bf1918313980332"}
Mar 20 13:34:25 crc kubenswrapper[4973]: I0320 13:34:25.847260 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7" Mar 20 13:34:25 crc kubenswrapper[4973]: I0320 13:34:25.989414 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbnnz\" (UniqueName: \"kubernetes.io/projected/0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7-kube-api-access-gbnnz\") pod \"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7\" (UID: \"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7\") " Mar 20 13:34:25 crc kubenswrapper[4973]: I0320 13:34:25.989567 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7-bundle\") pod \"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7\" (UID: \"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7\") " Mar 20 13:34:25 crc kubenswrapper[4973]: I0320 13:34:25.989624 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7-util\") pod \"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7\" (UID: \"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7\") " Mar 20 13:34:25 crc kubenswrapper[4973]: I0320 13:34:25.992278 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7-bundle" (OuterVolumeSpecName: "bundle") pod "0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7" (UID: "0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:34:25 crc kubenswrapper[4973]: I0320 13:34:25.995574 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7-kube-api-access-gbnnz" (OuterVolumeSpecName: "kube-api-access-gbnnz") pod "0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7" (UID: "0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7"). InnerVolumeSpecName "kube-api-access-gbnnz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:34:25 crc kubenswrapper[4973]: I0320 13:34:25.999251 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7-util" (OuterVolumeSpecName: "util") pod "0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7" (UID: "0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:34:26 crc kubenswrapper[4973]: I0320 13:34:26.091701 4973 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7-util\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:26 crc kubenswrapper[4973]: I0320 13:34:26.091757 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbnnz\" (UniqueName: \"kubernetes.io/projected/0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7-kube-api-access-gbnnz\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:26 crc kubenswrapper[4973]: I0320 13:34:26.091779 4973 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:26 crc kubenswrapper[4973]: I0320 13:34:26.579498 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7" event={"ID":"0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7","Type":"ContainerDied","Data":"baf18b10607062dd5f983e7d79d06fd047dbc2c284f07b4c996da0b605c63739"} Mar 20 13:34:26 crc kubenswrapper[4973]: I0320 13:34:26.579557 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="baf18b10607062dd5f983e7d79d06fd047dbc2c284f07b4c996da0b605c63739" Mar 20 13:34:26 crc kubenswrapper[4973]: I0320 13:34:26.579600 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7" Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.433320 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jllfx"] Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.434360 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovn-controller" containerID="cri-o://84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031" gracePeriod=30 Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.434388 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="nbdb" containerID="cri-o://5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1" gracePeriod=30 Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.434494 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovn-acl-logging" containerID="cri-o://769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71" gracePeriod=30 Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.434520 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812" gracePeriod=30 Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.434528 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" 
containerName="northd" containerID="cri-o://b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79" gracePeriod=30 Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.434499 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="sbdb" containerID="cri-o://049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062" gracePeriod=30 Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.434542 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="kube-rbac-proxy-node" containerID="cri-o://07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf" gracePeriod=30 Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.487489 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovnkube-controller" containerID="cri-o://30ea7d81ac52428fd43b39fb40295daa70c1f8c01fc9159a5997250355e48b36" gracePeriod=30 Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.599730 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-57hnn_35802646-2926-42b8-913a-986001818f97/kube-multus/2.log" Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.600370 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-57hnn_35802646-2926-42b8-913a-986001818f97/kube-multus/1.log" Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.600413 4973 generic.go:334] "Generic (PLEG): container finished" podID="35802646-2926-42b8-913a-986001818f97" containerID="f0c63468c8d0dbcc605d699d587c5443a1c5e7b884fa8bb415694f7c6679b7c6" exitCode=2 Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.600468 4973 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-57hnn" event={"ID":"35802646-2926-42b8-913a-986001818f97","Type":"ContainerDied","Data":"f0c63468c8d0dbcc605d699d587c5443a1c5e7b884fa8bb415694f7c6679b7c6"} Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.600510 4973 scope.go:117] "RemoveContainer" containerID="47617dca1598ac1606a2c6a9c1e00cd3d4acb516976b6b2d685ae48fa382baf1" Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.601079 4973 scope.go:117] "RemoveContainer" containerID="f0c63468c8d0dbcc605d699d587c5443a1c5e7b884fa8bb415694f7c6679b7c6" Mar 20 13:34:28 crc kubenswrapper[4973]: E0320 13:34:28.601285 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-57hnn_openshift-multus(35802646-2926-42b8-913a-986001818f97)\"" pod="openshift-multus/multus-57hnn" podUID="35802646-2926-42b8-913a-986001818f97" Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.606877 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovnkube-controller/3.log" Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.618784 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovn-acl-logging/0.log" Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.621284 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovn-controller/0.log" Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.622756 4973 generic.go:334] "Generic (PLEG): container finished" podID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerID="769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71" exitCode=143 Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.622782 
4973 generic.go:334] "Generic (PLEG): container finished" podID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerID="84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031" exitCode=143 Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.622806 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerDied","Data":"769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71"} Mar 20 13:34:28 crc kubenswrapper[4973]: I0320 13:34:28.622837 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerDied","Data":"84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031"} Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.631543 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovnkube-controller/3.log" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.632821 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovnkube-controller/3.log" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.634175 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovn-acl-logging/0.log" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.634689 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovn-controller/0.log" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.635081 4973 generic.go:334] "Generic (PLEG): container finished" podID="774edfed-7d45-4b69-b9d7-a3a914cbca04" 
containerID="30ea7d81ac52428fd43b39fb40295daa70c1f8c01fc9159a5997250355e48b36" exitCode=0 Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.635106 4973 generic.go:334] "Generic (PLEG): container finished" podID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerID="049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062" exitCode=0 Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.635113 4973 generic.go:334] "Generic (PLEG): container finished" podID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerID="5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1" exitCode=0 Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.635120 4973 generic.go:334] "Generic (PLEG): container finished" podID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerID="b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79" exitCode=0 Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.635126 4973 generic.go:334] "Generic (PLEG): container finished" podID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerID="871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812" exitCode=0 Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.635133 4973 generic.go:334] "Generic (PLEG): container finished" podID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerID="07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf" exitCode=0 Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.635209 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerDied","Data":"30ea7d81ac52428fd43b39fb40295daa70c1f8c01fc9159a5997250355e48b36"} Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.635245 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovn-acl-logging/0.log" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.635280 4973 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerDied","Data":"049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062"} Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.635304 4973 scope.go:117] "RemoveContainer" containerID="39e07ba1d363ef9be20296cc96d9e4c8166d43a08ded9d617e7f41dab15dbab3" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.635304 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerDied","Data":"5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1"} Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.635378 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerDied","Data":"b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79"} Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.635395 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerDied","Data":"871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812"} Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.635405 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerDied","Data":"07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf"} Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.635413 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" 
event={"ID":"774edfed-7d45-4b69-b9d7-a3a914cbca04","Type":"ContainerDied","Data":"0aa02c1453484ce87bcebc6e8ee798cfdbd3c90b867f6022264249094dc0a316"} Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.635422 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0aa02c1453484ce87bcebc6e8ee798cfdbd3c90b867f6022264249094dc0a316" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.635932 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovn-controller/0.log" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.636310 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.637220 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-57hnn_35802646-2926-42b8-913a-986001818f97/kube-multus/2.log" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700085 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-fjrls"] Mar 20 13:34:29 crc kubenswrapper[4973]: E0320 13:34:29.700312 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7" containerName="pull" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700323 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7" containerName="pull" Mar 20 13:34:29 crc kubenswrapper[4973]: E0320 13:34:29.700349 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovn-controller" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700355 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovn-controller" Mar 20 13:34:29 crc kubenswrapper[4973]: E0320 
13:34:29.700364 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovn-acl-logging" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700371 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovn-acl-logging" Mar 20 13:34:29 crc kubenswrapper[4973]: E0320 13:34:29.700377 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovnkube-controller" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700382 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovnkube-controller" Mar 20 13:34:29 crc kubenswrapper[4973]: E0320 13:34:29.700389 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="kube-rbac-proxy-node" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700394 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="kube-rbac-proxy-node" Mar 20 13:34:29 crc kubenswrapper[4973]: E0320 13:34:29.700403 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovnkube-controller" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700410 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovnkube-controller" Mar 20 13:34:29 crc kubenswrapper[4973]: E0320 13:34:29.700418 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700424 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 13:34:29 
crc kubenswrapper[4973]: E0320 13:34:29.700433 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="nbdb" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700438 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="nbdb" Mar 20 13:34:29 crc kubenswrapper[4973]: E0320 13:34:29.700445 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7" containerName="util" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700450 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7" containerName="util" Mar 20 13:34:29 crc kubenswrapper[4973]: E0320 13:34:29.700459 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="northd" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700465 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="northd" Mar 20 13:34:29 crc kubenswrapper[4973]: E0320 13:34:29.700474 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovnkube-controller" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700480 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovnkube-controller" Mar 20 13:34:29 crc kubenswrapper[4973]: E0320 13:34:29.700486 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovnkube-controller" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700492 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovnkube-controller" Mar 20 13:34:29 crc kubenswrapper[4973]: E0320 13:34:29.700499 4973 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="kubecfg-setup" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700504 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="kubecfg-setup" Mar 20 13:34:29 crc kubenswrapper[4973]: E0320 13:34:29.700512 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovnkube-controller" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700517 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovnkube-controller" Mar 20 13:34:29 crc kubenswrapper[4973]: E0320 13:34:29.700527 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7" containerName="extract" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700532 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7" containerName="extract" Mar 20 13:34:29 crc kubenswrapper[4973]: E0320 13:34:29.700539 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="sbdb" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700544 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="sbdb" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700635 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700643 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="nbdb" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700650 4973 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovnkube-controller" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700657 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7" containerName="extract" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700666 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovn-acl-logging" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700676 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="kube-rbac-proxy-node" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700687 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="sbdb" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700697 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovn-controller" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700710 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovnkube-controller" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700717 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="northd" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700728 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovnkube-controller" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700736 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovnkube-controller" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.700926 4973 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" containerName="ovnkube-controller" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.702481 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737021 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737064 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-slash\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737082 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-run-ovn-kubernetes\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737214 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-var-lib-openvswitch\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 
13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737255 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/caed8735-bc22-4834-b3ce-7af8596b8c48-ovnkube-script-lib\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737278 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkzc5\" (UniqueName: \"kubernetes.io/projected/caed8735-bc22-4834-b3ce-7af8596b8c48-kube-api-access-xkzc5\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737294 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-run-ovn\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737311 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-node-log\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737331 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/caed8735-bc22-4834-b3ce-7af8596b8c48-ovn-node-metrics-cert\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737360 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-run-systemd\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737458 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-systemd-units\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737490 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-run-openvswitch\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737513 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-kubelet\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737599 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-etc-openvswitch\") pod \"ovnkube-node-fjrls\" (UID: 
\"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737651 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-log-socket\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737687 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-run-netns\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737765 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-cni-bin\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737812 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/caed8735-bc22-4834-b3ce-7af8596b8c48-env-overrides\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737827 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/caed8735-bc22-4834-b3ce-7af8596b8c48-ovnkube-config\") pod \"ovnkube-node-fjrls\" 
(UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.737857 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-cni-netd\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.838693 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/774edfed-7d45-4b69-b9d7-a3a914cbca04-ovn-node-metrics-cert\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.838744 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-kubelet\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.838769 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-run-openvswitch\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.838789 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-cni-bin\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.838829 4973 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/774edfed-7d45-4b69-b9d7-a3a914cbca04-ovnkube-script-lib\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.838858 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-cni-netd\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.838878 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/774edfed-7d45-4b69-b9d7-a3a914cbca04-ovnkube-config\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.838878 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.838895 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.838878 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.838908 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-var-lib-cni-networks-ovn-kubernetes\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.838998 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-etc-openvswitch\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839031 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-slash\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.838917 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.838965 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839087 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-run-ovn\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839092 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839117 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-slash" (OuterVolumeSpecName: "host-slash") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839122 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-run-systemd\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839150 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/774edfed-7d45-4b69-b9d7-a3a914cbca04-env-overrides\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839139 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839200 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-var-lib-openvswitch\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839221 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-run-ovn-kubernetes\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839268 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-node-log\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839293 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lx76c\" (UniqueName: \"kubernetes.io/projected/774edfed-7d45-4b69-b9d7-a3a914cbca04-kube-api-access-lx76c\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839319 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-systemd-units\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839387 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-log-socket\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839411 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-run-netns\") pod \"774edfed-7d45-4b69-b9d7-a3a914cbca04\" (UID: \"774edfed-7d45-4b69-b9d7-a3a914cbca04\") " Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839291 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/774edfed-7d45-4b69-b9d7-a3a914cbca04-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839313 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839314 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/774edfed-7d45-4b69-b9d7-a3a914cbca04-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839352 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-node-log" (OuterVolumeSpecName: "node-log") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839367 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839511 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/774edfed-7d45-4b69-b9d7-a3a914cbca04-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839535 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839587 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-log-socket" (OuterVolumeSpecName: "log-socket") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839591 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839649 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-cni-bin\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839701 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/caed8735-bc22-4834-b3ce-7af8596b8c48-env-overrides\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839722 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-cni-netd\") pod \"ovnkube-node-fjrls\" (UID: 
\"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839739 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/caed8735-bc22-4834-b3ce-7af8596b8c48-ovnkube-config\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839758 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-cni-bin\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839797 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-cni-netd\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839803 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839826 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-run-ovn-kubernetes\") pod \"ovnkube-node-fjrls\" (UID: 
\"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839847 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-slash\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839879 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-var-lib-openvswitch\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839916 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/caed8735-bc22-4834-b3ce-7af8596b8c48-ovnkube-script-lib\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839941 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkzc5\" (UniqueName: \"kubernetes.io/projected/caed8735-bc22-4834-b3ce-7af8596b8c48-kube-api-access-xkzc5\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839961 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-run-ovn\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.839987 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-node-log\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840027 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/caed8735-bc22-4834-b3ce-7af8596b8c48-ovn-node-metrics-cert\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840051 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-run-systemd\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840104 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-run-openvswitch\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840126 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-systemd-units\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc 
kubenswrapper[4973]: I0320 13:34:29.840150 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-kubelet\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840179 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-etc-openvswitch\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840206 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-log-socket\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840215 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/caed8735-bc22-4834-b3ce-7af8596b8c48-env-overrides\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840228 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-run-netns\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840251 4973 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-run-ovn\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840380 4973 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/774edfed-7d45-4b69-b9d7-a3a914cbca04-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840395 4973 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840406 4973 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/774edfed-7d45-4b69-b9d7-a3a914cbca04-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840418 4973 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840430 4973 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840443 4973 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840454 4973 
reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840465 4973 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/774edfed-7d45-4b69-b9d7-a3a914cbca04-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840477 4973 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840491 4973 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840503 4973 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840513 4973 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840524 4973 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840534 4973 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840546 4973 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840556 4973 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840567 4973 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840609 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-node-log\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840658 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/caed8735-bc22-4834-b3ce-7af8596b8c48-ovnkube-config\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840695 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-fjrls\" (UID: 
\"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840721 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-run-ovn-kubernetes\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840745 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-slash\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.840764 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-var-lib-openvswitch\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.841177 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/caed8735-bc22-4834-b3ce-7af8596b8c48-ovnkube-script-lib\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.841219 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-kubelet\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" 
Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.841240 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-run-systemd\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.841262 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-run-openvswitch\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.841284 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-systemd-units\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.841305 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-log-socket\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.841326 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-host-run-netns\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.841332 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/caed8735-bc22-4834-b3ce-7af8596b8c48-etc-openvswitch\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.844403 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/774edfed-7d45-4b69-b9d7-a3a914cbca04-kube-api-access-lx76c" (OuterVolumeSpecName: "kube-api-access-lx76c") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "kube-api-access-lx76c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.853488 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/774edfed-7d45-4b69-b9d7-a3a914cbca04-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.858831 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkzc5\" (UniqueName: \"kubernetes.io/projected/caed8735-bc22-4834-b3ce-7af8596b8c48-kube-api-access-xkzc5\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.861741 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/caed8735-bc22-4834-b3ce-7af8596b8c48-ovn-node-metrics-cert\") pod \"ovnkube-node-fjrls\" (UID: \"caed8735-bc22-4834-b3ce-7af8596b8c48\") " pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.871280 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "774edfed-7d45-4b69-b9d7-a3a914cbca04" (UID: "774edfed-7d45-4b69-b9d7-a3a914cbca04"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.941616 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lx76c\" (UniqueName: \"kubernetes.io/projected/774edfed-7d45-4b69-b9d7-a3a914cbca04-kube-api-access-lx76c\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.941666 4973 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/774edfed-7d45-4b69-b9d7-a3a914cbca04-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:29 crc kubenswrapper[4973]: I0320 13:34:29.941676 4973 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/774edfed-7d45-4b69-b9d7-a3a914cbca04-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 13:34:30 crc kubenswrapper[4973]: I0320 13:34:30.014863 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:30 crc kubenswrapper[4973]: I0320 13:34:30.644380 4973 generic.go:334] "Generic (PLEG): container finished" podID="caed8735-bc22-4834-b3ce-7af8596b8c48" containerID="3fd26ea4a7abf3898fe327c8c8ac354a31bca89ca9991609950fa6a8e750f97e" exitCode=0 Mar 20 13:34:30 crc kubenswrapper[4973]: I0320 13:34:30.644468 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" event={"ID":"caed8735-bc22-4834-b3ce-7af8596b8c48","Type":"ContainerDied","Data":"3fd26ea4a7abf3898fe327c8c8ac354a31bca89ca9991609950fa6a8e750f97e"} Mar 20 13:34:30 crc kubenswrapper[4973]: I0320 13:34:30.644783 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" event={"ID":"caed8735-bc22-4834-b3ce-7af8596b8c48","Type":"ContainerStarted","Data":"a3dea180c350680f0fbb11ccd89f8ebbb632f23cb15a5b3993363850e387b2c3"} Mar 20 13:34:30 crc 
kubenswrapper[4973]: I0320 13:34:30.648579 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovn-acl-logging/0.log" Mar 20 13:34:30 crc kubenswrapper[4973]: I0320 13:34:30.668083 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-jllfx_774edfed-7d45-4b69-b9d7-a3a914cbca04/ovn-controller/0.log" Mar 20 13:34:30 crc kubenswrapper[4973]: I0320 13:34:30.668661 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jllfx" Mar 20 13:34:30 crc kubenswrapper[4973]: I0320 13:34:30.840411 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jllfx"] Mar 20 13:34:30 crc kubenswrapper[4973]: I0320 13:34:30.845687 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jllfx"] Mar 20 13:34:31 crc kubenswrapper[4973]: I0320 13:34:31.688053 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" event={"ID":"caed8735-bc22-4834-b3ce-7af8596b8c48","Type":"ContainerStarted","Data":"faab0612bc08d5a16eae889e5af2fcfc77444fdd5df8fbca5f0249f182d50bab"} Mar 20 13:34:31 crc kubenswrapper[4973]: I0320 13:34:31.688089 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" event={"ID":"caed8735-bc22-4834-b3ce-7af8596b8c48","Type":"ContainerStarted","Data":"80e32aeff7f498d5a1ca63d7b4cd56ab2d884c433922c23db4b472c157cc644a"} Mar 20 13:34:31 crc kubenswrapper[4973]: I0320 13:34:31.688100 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" event={"ID":"caed8735-bc22-4834-b3ce-7af8596b8c48","Type":"ContainerStarted","Data":"7deb1a9f072cd0f90f8d3edfa8e0ad58935eaf98ca831a986444079294bcb8bb"} Mar 20 13:34:31 crc kubenswrapper[4973]: I0320 13:34:31.688108 
4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" event={"ID":"caed8735-bc22-4834-b3ce-7af8596b8c48","Type":"ContainerStarted","Data":"bb3b3e2169e37b75bb2d9540505d7473c738f94c302ce253960ae9c9cb180179"} Mar 20 13:34:31 crc kubenswrapper[4973]: I0320 13:34:31.688118 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" event={"ID":"caed8735-bc22-4834-b3ce-7af8596b8c48","Type":"ContainerStarted","Data":"fa35e8ac9ad4906e0bafe2bba85183f9073a9a1442b09f40cd958891d366723f"} Mar 20 13:34:31 crc kubenswrapper[4973]: I0320 13:34:31.688131 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" event={"ID":"caed8735-bc22-4834-b3ce-7af8596b8c48","Type":"ContainerStarted","Data":"d29a4569af2a25137e6c062bd9d5dfe571a6f23c78d3af4b957e61bb578f3f56"} Mar 20 13:34:31 crc kubenswrapper[4973]: I0320 13:34:31.956987 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="774edfed-7d45-4b69-b9d7-a3a914cbca04" path="/var/lib/kubelet/pods/774edfed-7d45-4b69-b9d7-a3a914cbca04/volumes" Mar 20 13:34:34 crc kubenswrapper[4973]: I0320 13:34:34.707787 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" event={"ID":"caed8735-bc22-4834-b3ce-7af8596b8c48","Type":"ContainerStarted","Data":"dda73b309b2b5fd84ec7848d5f74012eb768295664e0afaf5123d16160a889e2"} Mar 20 13:34:35 crc kubenswrapper[4973]: I0320 13:34:35.800533 4973 scope.go:117] "RemoveContainer" containerID="b0ecf535291dac1ed0e5ea70a4bf7b4dd5d8db1a7d39b63ace83f7b7376a1a79" Mar 20 13:34:35 crc kubenswrapper[4973]: I0320 13:34:35.844962 4973 scope.go:117] "RemoveContainer" containerID="049540dfa35ad15c3611412fbe908d24faf4f90a09da41b7a16ce2c5b4bd5062" Mar 20 13:34:35 crc kubenswrapper[4973]: I0320 13:34:35.867577 4973 scope.go:117] "RemoveContainer" 
containerID="30ea7d81ac52428fd43b39fb40295daa70c1f8c01fc9159a5997250355e48b36" Mar 20 13:34:35 crc kubenswrapper[4973]: I0320 13:34:35.893568 4973 scope.go:117] "RemoveContainer" containerID="5f66b6b2fb9b2f8d32d126fc2a6ce3795a88f698d6da5fe5e2dd8f67633eaed1" Mar 20 13:34:35 crc kubenswrapper[4973]: I0320 13:34:35.909462 4973 scope.go:117] "RemoveContainer" containerID="84a85b4f11b49688632f7d86a55bf387f3469cf372fffceeb507ca48fb35d031" Mar 20 13:34:35 crc kubenswrapper[4973]: I0320 13:34:35.930090 4973 scope.go:117] "RemoveContainer" containerID="871fb9fe2847621c60969a007f73da50cc62ca926e0251ee5c0fb27aefbda812" Mar 20 13:34:35 crc kubenswrapper[4973]: I0320 13:34:35.959761 4973 scope.go:117] "RemoveContainer" containerID="0c6475e0b0e8b115d461fbcaa22f96cac44bb08baf3a011339b3ff6758eb9f8b" Mar 20 13:34:35 crc kubenswrapper[4973]: I0320 13:34:35.985549 4973 scope.go:117] "RemoveContainer" containerID="769012078c45497a3a9634abb008e5887154b0b6af53007e47fa95b7f8093e71" Mar 20 13:34:35 crc kubenswrapper[4973]: I0320 13:34:35.999150 4973 scope.go:117] "RemoveContainer" containerID="e5fe72df8627143ff5fb920f84e8b5cc60504f7b2e24fdabf880f96a640fee13" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.026122 4973 scope.go:117] "RemoveContainer" containerID="07ee7da60e3690096c10650c3b6c42fec3b15bd2067e4ce5013b7b450626f4cf" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.072662 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6"] Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.073374 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.075184 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.083070 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.090581 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-9qqj8" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.250387 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9v8g\" (UniqueName: \"kubernetes.io/projected/12942d85-7c63-4b80-8df3-81e0941c91eb-kube-api-access-w9v8g\") pod \"obo-prometheus-operator-8ff7d675-5mfl6\" (UID: \"12942d85-7c63-4b80-8df3-81e0941c91eb\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.352146 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9v8g\" (UniqueName: \"kubernetes.io/projected/12942d85-7c63-4b80-8df3-81e0941c91eb-kube-api-access-w9v8g\") pod \"obo-prometheus-operator-8ff7d675-5mfl6\" (UID: \"12942d85-7c63-4b80-8df3-81e0941c91eb\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.377764 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9v8g\" (UniqueName: \"kubernetes.io/projected/12942d85-7c63-4b80-8df3-81e0941c91eb-kube-api-access-w9v8g\") pod \"obo-prometheus-operator-8ff7d675-5mfl6\" (UID: \"12942d85-7c63-4b80-8df3-81e0941c91eb\") " pod="openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 
13:34:36.379491 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc"] Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.383182 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.384687 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-4ppmv" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.384812 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.390809 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6"] Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.391696 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.392379 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6" Mar 20 13:34:36 crc kubenswrapper[4973]: E0320 13:34:36.421050 4973 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-5mfl6_openshift-operators_12942d85-7c63-4b80-8df3-81e0941c91eb_0(370eb437392e5a12679e19b2572667053c5f8860897172c0abbbd8f08634b48a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:34:36 crc kubenswrapper[4973]: E0320 13:34:36.421114 4973 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-5mfl6_openshift-operators_12942d85-7c63-4b80-8df3-81e0941c91eb_0(370eb437392e5a12679e19b2572667053c5f8860897172c0abbbd8f08634b48a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6" Mar 20 13:34:36 crc kubenswrapper[4973]: E0320 13:34:36.421139 4973 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-5mfl6_openshift-operators_12942d85-7c63-4b80-8df3-81e0941c91eb_0(370eb437392e5a12679e19b2572667053c5f8860897172c0abbbd8f08634b48a): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6" Mar 20 13:34:36 crc kubenswrapper[4973]: E0320 13:34:36.421191 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-8ff7d675-5mfl6_openshift-operators(12942d85-7c63-4b80-8df3-81e0941c91eb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-8ff7d675-5mfl6_openshift-operators(12942d85-7c63-4b80-8df3-81e0941c91eb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-5mfl6_openshift-operators_12942d85-7c63-4b80-8df3-81e0941c91eb_0(370eb437392e5a12679e19b2572667053c5f8860897172c0abbbd8f08634b48a): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6" podUID="12942d85-7c63-4b80-8df3-81e0941c91eb" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.453910 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8b74a291-42fd-4819-aab4-957acbce8ec7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-655746896-lsnk6\" (UID: \"8b74a291-42fd-4819-aab4-957acbce8ec7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.453967 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/645dfcb4-2ecb-4f12-96a6-dc97944672fb-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-655746896-qjslc\" (UID: \"645dfcb4-2ecb-4f12-96a6-dc97944672fb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.454096 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/645dfcb4-2ecb-4f12-96a6-dc97944672fb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-655746896-qjslc\" (UID: \"645dfcb4-2ecb-4f12-96a6-dc97944672fb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.454243 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8b74a291-42fd-4819-aab4-957acbce8ec7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-655746896-lsnk6\" (UID: \"8b74a291-42fd-4819-aab4-957acbce8ec7\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.555022 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8b74a291-42fd-4819-aab4-957acbce8ec7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-655746896-lsnk6\" (UID: \"8b74a291-42fd-4819-aab4-957acbce8ec7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.555094 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8b74a291-42fd-4819-aab4-957acbce8ec7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-655746896-lsnk6\" (UID: \"8b74a291-42fd-4819-aab4-957acbce8ec7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.555122 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/645dfcb4-2ecb-4f12-96a6-dc97944672fb-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-655746896-qjslc\" (UID: \"645dfcb4-2ecb-4f12-96a6-dc97944672fb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.555171 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/645dfcb4-2ecb-4f12-96a6-dc97944672fb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-655746896-qjslc\" (UID: \"645dfcb4-2ecb-4f12-96a6-dc97944672fb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.558533 4973 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/645dfcb4-2ecb-4f12-96a6-dc97944672fb-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-655746896-qjslc\" (UID: \"645dfcb4-2ecb-4f12-96a6-dc97944672fb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.559764 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8b74a291-42fd-4819-aab4-957acbce8ec7-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-655746896-lsnk6\" (UID: \"8b74a291-42fd-4819-aab4-957acbce8ec7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.560107 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/645dfcb4-2ecb-4f12-96a6-dc97944672fb-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-655746896-qjslc\" (UID: \"645dfcb4-2ecb-4f12-96a6-dc97944672fb\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.562990 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8b74a291-42fd-4819-aab4-957acbce8ec7-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-655746896-lsnk6\" (UID: \"8b74a291-42fd-4819-aab4-957acbce8ec7\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.710707 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.718441 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.729005 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" event={"ID":"caed8735-bc22-4834-b3ce-7af8596b8c48","Type":"ContainerStarted","Data":"e61484b69c53996b4b20c1a2a2c623f875bcff73452a3a678c61429c417f8ff5"} Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.729408 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.729499 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.729559 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.764715 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:36 crc kubenswrapper[4973]: E0320 13:34:36.771829 4973 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-655746896-qjslc_openshift-operators_645dfcb4-2ecb-4f12-96a6-dc97944672fb_0(a03a7cca78a86b6ec3ef0378bc4d729e98af684d909d65c7d1c1171e12310384): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:34:36 crc kubenswrapper[4973]: E0320 13:34:36.771880 4973 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-655746896-qjslc_openshift-operators_645dfcb4-2ecb-4f12-96a6-dc97944672fb_0(a03a7cca78a86b6ec3ef0378bc4d729e98af684d909d65c7d1c1171e12310384): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc" Mar 20 13:34:36 crc kubenswrapper[4973]: E0320 13:34:36.771916 4973 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-655746896-qjslc_openshift-operators_645dfcb4-2ecb-4f12-96a6-dc97944672fb_0(a03a7cca78a86b6ec3ef0378bc4d729e98af684d909d65c7d1c1171e12310384): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc" Mar 20 13:34:36 crc kubenswrapper[4973]: E0320 13:34:36.771990 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-655746896-qjslc_openshift-operators(645dfcb4-2ecb-4f12-96a6-dc97944672fb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-655746896-qjslc_openshift-operators(645dfcb4-2ecb-4f12-96a6-dc97944672fb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-655746896-qjslc_openshift-operators_645dfcb4-2ecb-4f12-96a6-dc97944672fb_0(a03a7cca78a86b6ec3ef0378bc4d729e98af684d909d65c7d1c1171e12310384): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc" podUID="645dfcb4-2ecb-4f12-96a6-dc97944672fb" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.775355 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" podStartSLOduration=7.775328292 podStartE2EDuration="7.775328292s" podCreationTimestamp="2026-03-20 13:34:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:34:36.771364886 +0000 UTC m=+797.515034650" watchObservedRunningTime="2026-03-20 13:34:36.775328292 +0000 UTC m=+797.518998036" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.802495 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fjrls" Mar 20 13:34:36 crc kubenswrapper[4973]: E0320 13:34:36.816076 4973 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-655746896-lsnk6_openshift-operators_8b74a291-42fd-4819-aab4-957acbce8ec7_0(70ad9266917fcfff1f0e937faf2cf4924cb99f5b958cf798bfff294931e9dcf9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:34:36 crc kubenswrapper[4973]: E0320 13:34:36.816152 4973 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-655746896-lsnk6_openshift-operators_8b74a291-42fd-4819-aab4-957acbce8ec7_0(70ad9266917fcfff1f0e937faf2cf4924cb99f5b958cf798bfff294931e9dcf9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6" Mar 20 13:34:36 crc kubenswrapper[4973]: E0320 13:34:36.816176 4973 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-655746896-lsnk6_openshift-operators_8b74a291-42fd-4819-aab4-957acbce8ec7_0(70ad9266917fcfff1f0e937faf2cf4924cb99f5b958cf798bfff294931e9dcf9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6" Mar 20 13:34:36 crc kubenswrapper[4973]: E0320 13:34:36.816233 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-655746896-lsnk6_openshift-operators(8b74a291-42fd-4819-aab4-957acbce8ec7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-655746896-lsnk6_openshift-operators(8b74a291-42fd-4819-aab4-957acbce8ec7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-655746896-lsnk6_openshift-operators_8b74a291-42fd-4819-aab4-957acbce8ec7_0(70ad9266917fcfff1f0e937faf2cf4924cb99f5b958cf798bfff294931e9dcf9): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6" podUID="8b74a291-42fd-4819-aab4-957acbce8ec7" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.826817 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-nhgtl"] Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.827557 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.831744 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-jr65v" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.832268 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.959966 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/0e53f263-96c0-4390-b28e-ca37e867101b-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-nhgtl\" (UID: \"0e53f263-96c0-4390-b28e-ca37e867101b\") " pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.960093 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzr77\" (UniqueName: \"kubernetes.io/projected/0e53f263-96c0-4390-b28e-ca37e867101b-kube-api-access-xzr77\") pod \"observability-operator-6dd7dd855f-nhgtl\" (UID: \"0e53f263-96c0-4390-b28e-ca37e867101b\") " pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.995381 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6"] Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.998918 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6"] Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.999000 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6" Mar 20 13:34:36 crc kubenswrapper[4973]: I0320 13:34:36.999380 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.006110 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc"] Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.031421 4973 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-5mfl6_openshift-operators_12942d85-7c63-4b80-8df3-81e0941c91eb_0(7c1a35e08b76069e4c45da5b632a18be665e94693f04d794e8001763087b13ce): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.031483 4973 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-5mfl6_openshift-operators_12942d85-7c63-4b80-8df3-81e0941c91eb_0(7c1a35e08b76069e4c45da5b632a18be665e94693f04d794e8001763087b13ce): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.031501 4973 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-5mfl6_openshift-operators_12942d85-7c63-4b80-8df3-81e0941c91eb_0(7c1a35e08b76069e4c45da5b632a18be665e94693f04d794e8001763087b13ce): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.031546 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-8ff7d675-5mfl6_openshift-operators(12942d85-7c63-4b80-8df3-81e0941c91eb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-8ff7d675-5mfl6_openshift-operators(12942d85-7c63-4b80-8df3-81e0941c91eb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-8ff7d675-5mfl6_openshift-operators_12942d85-7c63-4b80-8df3-81e0941c91eb_0(7c1a35e08b76069e4c45da5b632a18be665e94693f04d794e8001763087b13ce): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6" podUID="12942d85-7c63-4b80-8df3-81e0941c91eb" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.049319 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-nhgtl"] Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.060988 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/0e53f263-96c0-4390-b28e-ca37e867101b-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-nhgtl\" (UID: \"0e53f263-96c0-4390-b28e-ca37e867101b\") " pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.061142 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzr77\" (UniqueName: \"kubernetes.io/projected/0e53f263-96c0-4390-b28e-ca37e867101b-kube-api-access-xzr77\") pod \"observability-operator-6dd7dd855f-nhgtl\" (UID: \"0e53f263-96c0-4390-b28e-ca37e867101b\") " 
pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.073830 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/0e53f263-96c0-4390-b28e-ca37e867101b-observability-operator-tls\") pod \"observability-operator-6dd7dd855f-nhgtl\" (UID: \"0e53f263-96c0-4390-b28e-ca37e867101b\") " pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.099041 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzr77\" (UniqueName: \"kubernetes.io/projected/0e53f263-96c0-4390-b28e-ca37e867101b-kube-api-access-xzr77\") pod \"observability-operator-6dd7dd855f-nhgtl\" (UID: \"0e53f263-96c0-4390-b28e-ca37e867101b\") " pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.143024 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.168813 4973 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-nhgtl_openshift-operators_0e53f263-96c0-4390-b28e-ca37e867101b_0(ac8c4723a06bd6dafffa6b9822d82905ad44d4a91efaa688eb29ca2262a5a324): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.168884 4973 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-nhgtl_openshift-operators_0e53f263-96c0-4390-b28e-ca37e867101b_0(ac8c4723a06bd6dafffa6b9822d82905ad44d4a91efaa688eb29ca2262a5a324): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.168912 4973 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-nhgtl_openshift-operators_0e53f263-96c0-4390-b28e-ca37e867101b_0(ac8c4723a06bd6dafffa6b9822d82905ad44d4a91efaa688eb29ca2262a5a324): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.168967 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-6dd7dd855f-nhgtl_openshift-operators(0e53f263-96c0-4390-b28e-ca37e867101b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-6dd7dd855f-nhgtl_openshift-operators(0e53f263-96c0-4390-b28e-ca37e867101b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-nhgtl_openshift-operators_0e53f263-96c0-4390-b28e-ca37e867101b_0(ac8c4723a06bd6dafffa6b9822d82905ad44d4a91efaa688eb29ca2262a5a324): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" podUID="0e53f263-96c0-4390-b28e-ca37e867101b" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.183776 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-9b89954cc-wfdgp"] Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.184514 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.187111 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-vr6mg" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.187990 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-service-cert" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.198080 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-9b89954cc-wfdgp"] Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.364907 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a19fcda0-339c-4f0e-9f54-5a2f76c934c5-openshift-service-ca\") pod \"perses-operator-9b89954cc-wfdgp\" (UID: \"a19fcda0-339c-4f0e-9f54-5a2f76c934c5\") " pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.364950 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a19fcda0-339c-4f0e-9f54-5a2f76c934c5-apiservice-cert\") pod \"perses-operator-9b89954cc-wfdgp\" (UID: \"a19fcda0-339c-4f0e-9f54-5a2f76c934c5\") " pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.365031 4973 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7j8f\" (UniqueName: \"kubernetes.io/projected/a19fcda0-339c-4f0e-9f54-5a2f76c934c5-kube-api-access-r7j8f\") pod \"perses-operator-9b89954cc-wfdgp\" (UID: \"a19fcda0-339c-4f0e-9f54-5a2f76c934c5\") " pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.365079 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a19fcda0-339c-4f0e-9f54-5a2f76c934c5-webhook-cert\") pod \"perses-operator-9b89954cc-wfdgp\" (UID: \"a19fcda0-339c-4f0e-9f54-5a2f76c934c5\") " pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.465937 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r7j8f\" (UniqueName: \"kubernetes.io/projected/a19fcda0-339c-4f0e-9f54-5a2f76c934c5-kube-api-access-r7j8f\") pod \"perses-operator-9b89954cc-wfdgp\" (UID: \"a19fcda0-339c-4f0e-9f54-5a2f76c934c5\") " pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.466017 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a19fcda0-339c-4f0e-9f54-5a2f76c934c5-webhook-cert\") pod \"perses-operator-9b89954cc-wfdgp\" (UID: \"a19fcda0-339c-4f0e-9f54-5a2f76c934c5\") " pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.466067 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a19fcda0-339c-4f0e-9f54-5a2f76c934c5-openshift-service-ca\") pod \"perses-operator-9b89954cc-wfdgp\" (UID: \"a19fcda0-339c-4f0e-9f54-5a2f76c934c5\") " 
pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.466091 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a19fcda0-339c-4f0e-9f54-5a2f76c934c5-apiservice-cert\") pod \"perses-operator-9b89954cc-wfdgp\" (UID: \"a19fcda0-339c-4f0e-9f54-5a2f76c934c5\") " pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.466983 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/a19fcda0-339c-4f0e-9f54-5a2f76c934c5-openshift-service-ca\") pod \"perses-operator-9b89954cc-wfdgp\" (UID: \"a19fcda0-339c-4f0e-9f54-5a2f76c934c5\") " pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.471886 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a19fcda0-339c-4f0e-9f54-5a2f76c934c5-webhook-cert\") pod \"perses-operator-9b89954cc-wfdgp\" (UID: \"a19fcda0-339c-4f0e-9f54-5a2f76c934c5\") " pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.475971 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a19fcda0-339c-4f0e-9f54-5a2f76c934c5-apiservice-cert\") pod \"perses-operator-9b89954cc-wfdgp\" (UID: \"a19fcda0-339c-4f0e-9f54-5a2f76c934c5\") " pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.489065 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7j8f\" (UniqueName: \"kubernetes.io/projected/a19fcda0-339c-4f0e-9f54-5a2f76c934c5-kube-api-access-r7j8f\") pod \"perses-operator-9b89954cc-wfdgp\" (UID: 
\"a19fcda0-339c-4f0e-9f54-5a2f76c934c5\") " pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.500251 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.532060 4973 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-9b89954cc-wfdgp_openshift-operators_a19fcda0-339c-4f0e-9f54-5a2f76c934c5_0(1f639812cb03b972930b26e940270f144593b75110b262cbeb5bba2eea144e4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.532125 4973 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-9b89954cc-wfdgp_openshift-operators_a19fcda0-339c-4f0e-9f54-5a2f76c934c5_0(1f639812cb03b972930b26e940270f144593b75110b262cbeb5bba2eea144e4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.532147 4973 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-9b89954cc-wfdgp_openshift-operators_a19fcda0-339c-4f0e-9f54-5a2f76c934c5_0(1f639812cb03b972930b26e940270f144593b75110b262cbeb5bba2eea144e4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.532184 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-9b89954cc-wfdgp_openshift-operators(a19fcda0-339c-4f0e-9f54-5a2f76c934c5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-9b89954cc-wfdgp_openshift-operators(a19fcda0-339c-4f0e-9f54-5a2f76c934c5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-9b89954cc-wfdgp_openshift-operators_a19fcda0-339c-4f0e-9f54-5a2f76c934c5_0(1f639812cb03b972930b26e940270f144593b75110b262cbeb5bba2eea144e4c): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-9b89954cc-wfdgp" podUID="a19fcda0-339c-4f0e-9f54-5a2f76c934c5" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.734262 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.734301 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.734507 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.734574 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.734834 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.735460 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.735889 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" Mar 20 13:34:37 crc kubenswrapper[4973]: I0320 13:34:37.736241 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.800784 4973 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-9b89954cc-wfdgp_openshift-operators_a19fcda0-339c-4f0e-9f54-5a2f76c934c5_0(f791ca22a3968e66bc11e858dbe4fd33e048deecf01019e8368d57899b2d6e66): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.800863 4973 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-9b89954cc-wfdgp_openshift-operators_a19fcda0-339c-4f0e-9f54-5a2f76c934c5_0(f791ca22a3968e66bc11e858dbe4fd33e048deecf01019e8368d57899b2d6e66): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.800891 4973 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-9b89954cc-wfdgp_openshift-operators_a19fcda0-339c-4f0e-9f54-5a2f76c934c5_0(f791ca22a3968e66bc11e858dbe4fd33e048deecf01019e8368d57899b2d6e66): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/perses-operator-9b89954cc-wfdgp" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.800935 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"perses-operator-9b89954cc-wfdgp_openshift-operators(a19fcda0-339c-4f0e-9f54-5a2f76c934c5)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"perses-operator-9b89954cc-wfdgp_openshift-operators(a19fcda0-339c-4f0e-9f54-5a2f76c934c5)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_perses-operator-9b89954cc-wfdgp_openshift-operators_a19fcda0-339c-4f0e-9f54-5a2f76c934c5_0(f791ca22a3968e66bc11e858dbe4fd33e048deecf01019e8368d57899b2d6e66): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/perses-operator-9b89954cc-wfdgp" podUID="a19fcda0-339c-4f0e-9f54-5a2f76c934c5" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.821557 4973 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-655746896-qjslc_openshift-operators_645dfcb4-2ecb-4f12-96a6-dc97944672fb_0(eee6818e5ea587500ed190908d46d1302da77958ea26d763cc2ff5df6026c180): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.821622 4973 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-655746896-qjslc_openshift-operators_645dfcb4-2ecb-4f12-96a6-dc97944672fb_0(eee6818e5ea587500ed190908d46d1302da77958ea26d763cc2ff5df6026c180): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.821644 4973 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-655746896-qjslc_openshift-operators_645dfcb4-2ecb-4f12-96a6-dc97944672fb_0(eee6818e5ea587500ed190908d46d1302da77958ea26d763cc2ff5df6026c180): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.821693 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-655746896-qjslc_openshift-operators(645dfcb4-2ecb-4f12-96a6-dc97944672fb)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-655746896-qjslc_openshift-operators(645dfcb4-2ecb-4f12-96a6-dc97944672fb)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-655746896-qjslc_openshift-operators_645dfcb4-2ecb-4f12-96a6-dc97944672fb_0(eee6818e5ea587500ed190908d46d1302da77958ea26d763cc2ff5df6026c180): no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc" podUID="645dfcb4-2ecb-4f12-96a6-dc97944672fb" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.832264 4973 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-nhgtl_openshift-operators_0e53f263-96c0-4390-b28e-ca37e867101b_0(cfde72bdad9759065acb58b7dd1315649f48d6ce62b09da43283e1266611ccaf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.832356 4973 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-nhgtl_openshift-operators_0e53f263-96c0-4390-b28e-ca37e867101b_0(cfde72bdad9759065acb58b7dd1315649f48d6ce62b09da43283e1266611ccaf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.832390 4973 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-nhgtl_openshift-operators_0e53f263-96c0-4390-b28e-ca37e867101b_0(cfde72bdad9759065acb58b7dd1315649f48d6ce62b09da43283e1266611ccaf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.832451 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"observability-operator-6dd7dd855f-nhgtl_openshift-operators(0e53f263-96c0-4390-b28e-ca37e867101b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"observability-operator-6dd7dd855f-nhgtl_openshift-operators(0e53f263-96c0-4390-b28e-ca37e867101b)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_observability-operator-6dd7dd855f-nhgtl_openshift-operators_0e53f263-96c0-4390-b28e-ca37e867101b_0(cfde72bdad9759065acb58b7dd1315649f48d6ce62b09da43283e1266611ccaf): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" podUID="0e53f263-96c0-4390-b28e-ca37e867101b" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.846952 4973 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-655746896-lsnk6_openshift-operators_8b74a291-42fd-4819-aab4-957acbce8ec7_0(78516a40d7e843c9324b3b7ca08d70c0708b632f8edbbbf5aca1b1b0282ab0fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.847028 4973 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-655746896-lsnk6_openshift-operators_8b74a291-42fd-4819-aab4-957acbce8ec7_0(78516a40d7e843c9324b3b7ca08d70c0708b632f8edbbbf5aca1b1b0282ab0fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6"
Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.847055 4973 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-655746896-lsnk6_openshift-operators_8b74a291-42fd-4819-aab4-957acbce8ec7_0(78516a40d7e843c9324b3b7ca08d70c0708b632f8edbbbf5aca1b1b0282ab0fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6"
Mar 20 13:34:37 crc kubenswrapper[4973]: E0320 13:34:37.847110 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"obo-prometheus-operator-admission-webhook-655746896-lsnk6_openshift-operators(8b74a291-42fd-4819-aab4-957acbce8ec7)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"obo-prometheus-operator-admission-webhook-655746896-lsnk6_openshift-operators(8b74a291-42fd-4819-aab4-957acbce8ec7)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_obo-prometheus-operator-admission-webhook-655746896-lsnk6_openshift-operators_8b74a291-42fd-4819-aab4-957acbce8ec7_0(78516a40d7e843c9324b3b7ca08d70c0708b632f8edbbbf5aca1b1b0282ab0fe): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6" podUID="8b74a291-42fd-4819-aab4-957acbce8ec7"
Mar 20 13:34:43 crc kubenswrapper[4973]: I0320 13:34:43.320583 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:34:43 crc kubenswrapper[4973]: I0320 13:34:43.321225 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:34:43 crc kubenswrapper[4973]: I0320 13:34:43.950802 4973 scope.go:117] "RemoveContainer" containerID="f0c63468c8d0dbcc605d699d587c5443a1c5e7b884fa8bb415694f7c6679b7c6"
Mar 20 13:34:44 crc kubenswrapper[4973]: I0320 13:34:44.777522 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-57hnn_35802646-2926-42b8-913a-986001818f97/kube-multus/2.log"
Mar 20 13:34:44 crc kubenswrapper[4973]: I0320 13:34:44.778158 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-57hnn" event={"ID":"35802646-2926-42b8-913a-986001818f97","Type":"ContainerStarted","Data":"f6e15f63786473ec13610013abb5564ae9695685a640fb213000a251e9b6ad15"}
Mar 20 13:34:48 crc kubenswrapper[4973]: I0320 13:34:48.949798 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc"
Mar 20 13:34:48 crc kubenswrapper[4973]: I0320 13:34:48.950711 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc"
Mar 20 13:34:49 crc kubenswrapper[4973]: I0320 13:34:49.407125 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc"]
Mar 20 13:34:49 crc kubenswrapper[4973]: I0320 13:34:49.810241 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc" event={"ID":"645dfcb4-2ecb-4f12-96a6-dc97944672fb","Type":"ContainerStarted","Data":"2419c21a0ef92a2812e4bba4a98b979cd0b6d74d39b6f6b7a454b0ba75267dd8"}
Mar 20 13:34:49 crc kubenswrapper[4973]: I0320 13:34:49.950132 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl"
Mar 20 13:34:49 crc kubenswrapper[4973]: I0320 13:34:49.954672 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl"
Mar 20 13:34:50 crc kubenswrapper[4973]: I0320 13:34:50.380791 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-6dd7dd855f-nhgtl"]
Mar 20 13:34:50 crc kubenswrapper[4973]: I0320 13:34:50.389939 4973 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 13:34:50 crc kubenswrapper[4973]: I0320 13:34:50.815733 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" event={"ID":"0e53f263-96c0-4390-b28e-ca37e867101b","Type":"ContainerStarted","Data":"beb5f8f1ed024f547ba28ac1e1435350ffc3fa5c190f7bd1741a9f4a7f30b88f"}
Mar 20 13:34:50 crc kubenswrapper[4973]: I0320 13:34:50.950220 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6"
Mar 20 13:34:50 crc kubenswrapper[4973]: I0320 13:34:50.950685 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6"
Mar 20 13:34:51 crc kubenswrapper[4973]: I0320 13:34:51.502390 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6"]
Mar 20 13:34:51 crc kubenswrapper[4973]: I0320 13:34:51.823446 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6" event={"ID":"12942d85-7c63-4b80-8df3-81e0941c91eb","Type":"ContainerStarted","Data":"d16797d7086726848073c8847802127d0638442f4990fcaff327c6383695e8d2"}
Mar 20 13:34:51 crc kubenswrapper[4973]: I0320 13:34:51.950052 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-9b89954cc-wfdgp"
Mar 20 13:34:51 crc kubenswrapper[4973]: I0320 13:34:51.950150 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6"
Mar 20 13:34:51 crc kubenswrapper[4973]: I0320 13:34:51.950640 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6"
Mar 20 13:34:51 crc kubenswrapper[4973]: I0320 13:34:51.950999 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-9b89954cc-wfdgp"
Mar 20 13:34:52 crc kubenswrapper[4973]: I0320 13:34:52.342922 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6"]
Mar 20 13:34:52 crc kubenswrapper[4973]: W0320 13:34:52.361582 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8b74a291_42fd_4819_aab4_957acbce8ec7.slice/crio-b8c93c197bdfc0baf17fb8364925333651221fa6c8dec3f5932d35ebcd7d60c3 WatchSource:0}: Error finding container b8c93c197bdfc0baf17fb8364925333651221fa6c8dec3f5932d35ebcd7d60c3: Status 404 returned error can't find the container with id b8c93c197bdfc0baf17fb8364925333651221fa6c8dec3f5932d35ebcd7d60c3
Mar 20 13:34:52 crc kubenswrapper[4973]: I0320 13:34:52.420385 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-9b89954cc-wfdgp"]
Mar 20 13:34:52 crc kubenswrapper[4973]: W0320 13:34:52.430453 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda19fcda0_339c_4f0e_9f54_5a2f76c934c5.slice/crio-1747de571651eb63dc1f8c21fb4d0053698f70cfb5f50bf4190ae666ed1f2c3b WatchSource:0}: Error finding container 1747de571651eb63dc1f8c21fb4d0053698f70cfb5f50bf4190ae666ed1f2c3b: Status 404 returned error can't find the container with id 1747de571651eb63dc1f8c21fb4d0053698f70cfb5f50bf4190ae666ed1f2c3b
Mar 20 13:34:52 crc kubenswrapper[4973]: I0320 13:34:52.841257 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-9b89954cc-wfdgp" event={"ID":"a19fcda0-339c-4f0e-9f54-5a2f76c934c5","Type":"ContainerStarted","Data":"1747de571651eb63dc1f8c21fb4d0053698f70cfb5f50bf4190ae666ed1f2c3b"}
Mar 20 13:34:52 crc kubenswrapper[4973]: I0320 13:34:52.842327 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6" event={"ID":"8b74a291-42fd-4819-aab4-957acbce8ec7","Type":"ContainerStarted","Data":"b8c93c197bdfc0baf17fb8364925333651221fa6c8dec3f5932d35ebcd7d60c3"}
Mar 20 13:34:57 crc kubenswrapper[4973]: I0320 13:34:57.707310 4973 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 13:35:00 crc kubenswrapper[4973]: I0320 13:35:00.062547 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-fjrls"
Mar 20 13:35:02 crc kubenswrapper[4973]: I0320 13:35:02.925161 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-9b89954cc-wfdgp" event={"ID":"a19fcda0-339c-4f0e-9f54-5a2f76c934c5","Type":"ContainerStarted","Data":"4d41ea82968689eefbff58ec5b0c4116219cd7c07a4d19f95560a771285daaff"}
Mar 20 13:35:02 crc kubenswrapper[4973]: I0320 13:35:02.926861 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc" event={"ID":"645dfcb4-2ecb-4f12-96a6-dc97944672fb","Type":"ContainerStarted","Data":"746fbbdf0eda9227f26481d3a6a4eacbafec3823afb9656694a86d6cb9bd357d"}
Mar 20 13:35:02 crc kubenswrapper[4973]: I0320 13:35:02.928493 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6" event={"ID":"8b74a291-42fd-4819-aab4-957acbce8ec7","Type":"ContainerStarted","Data":"05f6917898217bec8b6785a88a6ae8d4e1b9ee9a55d04bc9558baac06cd950a5"}
Mar 20 13:35:02 crc kubenswrapper[4973]: I0320 13:35:02.932677 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" event={"ID":"0e53f263-96c0-4390-b28e-ca37e867101b","Type":"ContainerStarted","Data":"073fc93f59e158bcf33296609c458b7b2a7fb20974cfba34d2e4759cde80f7d2"}
Mar 20 13:35:02 crc kubenswrapper[4973]: I0320 13:35:02.932719 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl"
Mar 20 13:35:02 crc kubenswrapper[4973]: I0320 13:35:02.935021 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6" event={"ID":"12942d85-7c63-4b80-8df3-81e0941c91eb","Type":"ContainerStarted","Data":"0feecbffee7969eb6a51c11b02a1bf05f1ebcd99a7fdd53bee6f5249db95de2b"}
Mar 20 13:35:02 crc kubenswrapper[4973]: I0320 13:35:02.935114 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl"
Mar 20 13:35:02 crc kubenswrapper[4973]: I0320 13:35:02.951111 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-9b89954cc-wfdgp" podStartSLOduration=16.706863206 podStartE2EDuration="25.951085521s" podCreationTimestamp="2026-03-20 13:34:37 +0000 UTC" firstStartedPulling="2026-03-20 13:34:52.436559442 +0000 UTC m=+813.180229186" lastFinishedPulling="2026-03-20 13:35:01.680781757 +0000 UTC m=+822.424451501" observedRunningTime="2026-03-20 13:35:02.948278178 +0000 UTC m=+823.691947922" watchObservedRunningTime="2026-03-20 13:35:02.951085521 +0000 UTC m=+823.694755275"
Mar 20 13:35:02 crc kubenswrapper[4973]: I0320 13:35:02.999036 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-8ff7d675-5mfl6" podStartSLOduration=16.858346591 podStartE2EDuration="26.999009807s" podCreationTimestamp="2026-03-20 13:34:36 +0000 UTC" firstStartedPulling="2026-03-20 13:34:51.529482128 +0000 UTC m=+812.273151872" lastFinishedPulling="2026-03-20 13:35:01.670145334 +0000 UTC m=+822.413815088" observedRunningTime="2026-03-20 13:35:02.969717942 +0000 UTC m=+823.713387696" watchObservedRunningTime="2026-03-20 13:35:02.999009807 +0000 UTC m=+823.742679561"
Mar 20 13:35:03 crc kubenswrapper[4973]: I0320 13:35:03.002988 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" podStartSLOduration=15.819464775 podStartE2EDuration="27.002969784s" podCreationTimestamp="2026-03-20 13:34:36 +0000 UTC" firstStartedPulling="2026-03-20 13:34:50.38968814 +0000 UTC m=+811.133357884" lastFinishedPulling="2026-03-20 13:35:01.573193149 +0000 UTC m=+822.316862893" observedRunningTime="2026-03-20 13:35:02.995886224 +0000 UTC m=+823.739555968" watchObservedRunningTime="2026-03-20 13:35:03.002969784 +0000 UTC m=+823.746639528"
Mar 20 13:35:03 crc kubenswrapper[4973]: I0320 13:35:03.029953 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-qjslc" podStartSLOduration=14.879377355 podStartE2EDuration="27.02992989s" podCreationTimestamp="2026-03-20 13:34:36 +0000 UTC" firstStartedPulling="2026-03-20 13:34:49.422614393 +0000 UTC m=+810.166284137" lastFinishedPulling="2026-03-20 13:35:01.573166928 +0000 UTC m=+822.316836672" observedRunningTime="2026-03-20 13:35:03.027243611 +0000 UTC m=+823.770913355" watchObservedRunningTime="2026-03-20 13:35:03.02992989 +0000 UTC m=+823.773599634"
Mar 20 13:35:03 crc kubenswrapper[4973]: I0320 13:35:03.067126 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-655746896-lsnk6" podStartSLOduration=17.870630464 podStartE2EDuration="27.067102028s" podCreationTimestamp="2026-03-20 13:34:36 +0000 UTC" firstStartedPulling="2026-03-20 13:34:52.381517277 +0000 UTC m=+813.125187021" lastFinishedPulling="2026-03-20 13:35:01.577988841 +0000 UTC m=+822.321658585" observedRunningTime="2026-03-20 13:35:03.049142628 +0000 UTC m=+823.792812382" watchObservedRunningTime="2026-03-20 13:35:03.067102028 +0000 UTC m=+823.810771772"
Mar 20 13:35:03 crc kubenswrapper[4973]: I0320 13:35:03.940806 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-9b89954cc-wfdgp"
Mar 20 13:35:07 crc kubenswrapper[4973]: I0320 13:35:07.503465 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-9b89954cc-wfdgp"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.061588 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-5n7n5"]
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.062748 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-5n7n5"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.070232 4973 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-lkbs7"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.071891 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.073192 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.080958 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-94mm7"]
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.082045 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-94mm7"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.084502 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-5n7n5"]
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.089257 4973 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-zgzpd"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.121443 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-94mm7"]
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.123234 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb5qq\" (UniqueName: \"kubernetes.io/projected/606e1a56-8a9a-4c0f-a7e0-646755f64185-kube-api-access-lb5qq\") pod \"cert-manager-cainjector-cf98fcc89-5n7n5\" (UID: \"606e1a56-8a9a-4c0f-a7e0-646755f64185\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-5n7n5"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.124924 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-4qwzv"]
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.125693 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-4qwzv"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.129459 4973 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-jhknc"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.167953 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-4qwzv"]
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.224080 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44dfr\" (UniqueName: \"kubernetes.io/projected/6f5a8b02-59f4-427d-b91d-e7cacaa1ba23-kube-api-access-44dfr\") pod \"cert-manager-webhook-687f57d79b-4qwzv\" (UID: \"6f5a8b02-59f4-427d-b91d-e7cacaa1ba23\") " pod="cert-manager/cert-manager-webhook-687f57d79b-4qwzv"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.224162 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7l2f\" (UniqueName: \"kubernetes.io/projected/bfdfa88e-4049-47d3-87f8-07c52f1f51df-kube-api-access-m7l2f\") pod \"cert-manager-858654f9db-94mm7\" (UID: \"bfdfa88e-4049-47d3-87f8-07c52f1f51df\") " pod="cert-manager/cert-manager-858654f9db-94mm7"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.224217 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb5qq\" (UniqueName: \"kubernetes.io/projected/606e1a56-8a9a-4c0f-a7e0-646755f64185-kube-api-access-lb5qq\") pod \"cert-manager-cainjector-cf98fcc89-5n7n5\" (UID: \"606e1a56-8a9a-4c0f-a7e0-646755f64185\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-5n7n5"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.275417 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb5qq\" (UniqueName: \"kubernetes.io/projected/606e1a56-8a9a-4c0f-a7e0-646755f64185-kube-api-access-lb5qq\") pod \"cert-manager-cainjector-cf98fcc89-5n7n5\" (UID: \"606e1a56-8a9a-4c0f-a7e0-646755f64185\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-5n7n5"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.325300 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44dfr\" (UniqueName: \"kubernetes.io/projected/6f5a8b02-59f4-427d-b91d-e7cacaa1ba23-kube-api-access-44dfr\") pod \"cert-manager-webhook-687f57d79b-4qwzv\" (UID: \"6f5a8b02-59f4-427d-b91d-e7cacaa1ba23\") " pod="cert-manager/cert-manager-webhook-687f57d79b-4qwzv"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.325436 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7l2f\" (UniqueName: \"kubernetes.io/projected/bfdfa88e-4049-47d3-87f8-07c52f1f51df-kube-api-access-m7l2f\") pod \"cert-manager-858654f9db-94mm7\" (UID: \"bfdfa88e-4049-47d3-87f8-07c52f1f51df\") " pod="cert-manager/cert-manager-858654f9db-94mm7"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.352246 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7l2f\" (UniqueName: \"kubernetes.io/projected/bfdfa88e-4049-47d3-87f8-07c52f1f51df-kube-api-access-m7l2f\") pod \"cert-manager-858654f9db-94mm7\" (UID: \"bfdfa88e-4049-47d3-87f8-07c52f1f51df\") " pod="cert-manager/cert-manager-858654f9db-94mm7"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.354187 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44dfr\" (UniqueName: \"kubernetes.io/projected/6f5a8b02-59f4-427d-b91d-e7cacaa1ba23-kube-api-access-44dfr\") pod \"cert-manager-webhook-687f57d79b-4qwzv\" (UID: \"6f5a8b02-59f4-427d-b91d-e7cacaa1ba23\") " pod="cert-manager/cert-manager-webhook-687f57d79b-4qwzv"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.379638 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-5n7n5"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.394125 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-94mm7"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.450585 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-4qwzv"
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.650112 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-5n7n5"]
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.857273 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-94mm7"]
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.988498 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-94mm7" event={"ID":"bfdfa88e-4049-47d3-87f8-07c52f1f51df","Type":"ContainerStarted","Data":"3cd775b40f35e1f463a4121f99030dafb0c6def5a9f7e07e0cdfd44b43eb23a1"}
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.989251 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-5n7n5" event={"ID":"606e1a56-8a9a-4c0f-a7e0-646755f64185","Type":"ContainerStarted","Data":"670b536c99cee3e4f1b72f89b7101ac73b8ac7c6590b180e6d77bf5214112965"}
Mar 20 13:35:11 crc kubenswrapper[4973]: I0320 13:35:11.989948 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-4qwzv"]
Mar 20 13:35:11 crc kubenswrapper[4973]: W0320 13:35:11.994208 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f5a8b02_59f4_427d_b91d_e7cacaa1ba23.slice/crio-0cf25bf17ed6b21e4ce66f9cd83f5b61ffa2a2b5eeb6322dfe78c6f4a7fd94dc WatchSource:0}: Error finding container 0cf25bf17ed6b21e4ce66f9cd83f5b61ffa2a2b5eeb6322dfe78c6f4a7fd94dc: Status 404 returned error can't find the container with id 0cf25bf17ed6b21e4ce66f9cd83f5b61ffa2a2b5eeb6322dfe78c6f4a7fd94dc
Mar 20 13:35:13 crc kubenswrapper[4973]: I0320 13:35:13.005099 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-4qwzv" event={"ID":"6f5a8b02-59f4-427d-b91d-e7cacaa1ba23","Type":"ContainerStarted","Data":"0cf25bf17ed6b21e4ce66f9cd83f5b61ffa2a2b5eeb6322dfe78c6f4a7fd94dc"}
Mar 20 13:35:13 crc kubenswrapper[4973]: I0320 13:35:13.321181 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:35:13 crc kubenswrapper[4973]: I0320 13:35:13.321587 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:35:17 crc kubenswrapper[4973]: I0320 13:35:17.077735 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-94mm7" event={"ID":"bfdfa88e-4049-47d3-87f8-07c52f1f51df","Type":"ContainerStarted","Data":"f0054474689385045fcf56d20fc256ff0d204fdc716c27ed0570d67fd2453605"}
Mar 20 13:35:17 crc kubenswrapper[4973]: I0320 13:35:17.079970 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-5n7n5" event={"ID":"606e1a56-8a9a-4c0f-a7e0-646755f64185","Type":"ContainerStarted","Data":"24b60e777ecaffa2f8ac8afa5343e178e9ac643ca892ed08b9f17553dd150261"}
Mar 20 13:35:17 crc kubenswrapper[4973]: I0320 13:35:17.082694 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-4qwzv" event={"ID":"6f5a8b02-59f4-427d-b91d-e7cacaa1ba23","Type":"ContainerStarted","Data":"a6d7cc079277cb99a9609891beec8eda5221ae0ec5a7c5b385d43b14985596d7"}
Mar 20 13:35:17 crc kubenswrapper[4973]: I0320 13:35:17.082896 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-4qwzv"
Mar 20 13:35:17 crc kubenswrapper[4973]: I0320 13:35:17.108298 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-94mm7" podStartSLOduration=1.910359765 podStartE2EDuration="6.108272985s" podCreationTimestamp="2026-03-20 13:35:11 +0000 UTC" firstStartedPulling="2026-03-20 13:35:11.86388405 +0000 UTC m=+832.607553794" lastFinishedPulling="2026-03-20 13:35:16.06179727 +0000 UTC m=+836.805467014" observedRunningTime="2026-03-20 13:35:17.100603431 +0000 UTC m=+837.844273205" watchObservedRunningTime="2026-03-20 13:35:17.108272985 +0000 UTC m=+837.851942769"
Mar 20 13:35:17 crc kubenswrapper[4973]: I0320 13:35:17.133732 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-4qwzv" podStartSLOduration=1.995978187 podStartE2EDuration="6.133705373s" podCreationTimestamp="2026-03-20 13:35:11 +0000 UTC" firstStartedPulling="2026-03-20 13:35:11.996804048 +0000 UTC m=+832.740473792" lastFinishedPulling="2026-03-20 13:35:16.134531234 +0000 UTC m=+836.878200978" observedRunningTime="2026-03-20 13:35:17.131748327 +0000 UTC m=+837.875418091" watchObservedRunningTime="2026-03-20 13:35:17.133705373 +0000 UTC m=+837.877375157"
Mar 20 13:35:17 crc kubenswrapper[4973]: I0320 13:35:17.152604 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-5n7n5" podStartSLOduration=1.770012079 podStartE2EDuration="6.152573757s" podCreationTimestamp="2026-03-20 13:35:11 +0000 UTC" firstStartedPulling="2026-03-20 13:35:11.679007446 +0000 UTC m=+832.422677190" lastFinishedPulling="2026-03-20 13:35:16.061569124 +0000 UTC m=+836.805238868" observedRunningTime="2026-03-20 13:35:17.14870045 +0000 UTC m=+837.892370234" watchObservedRunningTime="2026-03-20 13:35:17.152573757 +0000 UTC m=+837.896243501"
Mar 20 13:35:21 crc kubenswrapper[4973]: I0320 13:35:21.457031 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-4qwzv"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.184244 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc"]
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.186046 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.188760 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.197960 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc"]
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.320903 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.320970 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.321019 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.321655 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"26d2f8d0ba44652122f03dbe7cb2777fe726b59947a9f579b0a01f84b56a0f0a"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.321711 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" containerID="cri-o://26d2f8d0ba44652122f03dbe7cb2777fe726b59947a9f579b0a01f84b56a0f0a" gracePeriod=600
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.331488 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88wzc\" (UniqueName: \"kubernetes.io/projected/4e0294f1-0f7c-47f2-b81d-9e71231f19aa-kube-api-access-88wzc\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc\" (UID: \"4e0294f1-0f7c-47f2-b81d-9e71231f19aa\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.331552 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e0294f1-0f7c-47f2-b81d-9e71231f19aa-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc\" (UID: \"4e0294f1-0f7c-47f2-b81d-9e71231f19aa\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.331626 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e0294f1-0f7c-47f2-b81d-9e71231f19aa-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc\" (UID: \"4e0294f1-0f7c-47f2-b81d-9e71231f19aa\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.433158 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e0294f1-0f7c-47f2-b81d-9e71231f19aa-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc\" (UID: \"4e0294f1-0f7c-47f2-b81d-9e71231f19aa\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.432662 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e0294f1-0f7c-47f2-b81d-9e71231f19aa-bundle\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc\" (UID: \"4e0294f1-0f7c-47f2-b81d-9e71231f19aa\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.433637 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-88wzc\" (UniqueName: \"kubernetes.io/projected/4e0294f1-0f7c-47f2-b81d-9e71231f19aa-kube-api-access-88wzc\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc\" (UID: \"4e0294f1-0f7c-47f2-b81d-9e71231f19aa\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.433933 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e0294f1-0f7c-47f2-b81d-9e71231f19aa-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc\" (UID: \"4e0294f1-0f7c-47f2-b81d-9e71231f19aa\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.434192 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e0294f1-0f7c-47f2-b81d-9e71231f19aa-util\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc\" (UID: \"4e0294f1-0f7c-47f2-b81d-9e71231f19aa\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.440501 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f"]
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.441840 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.456121 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-88wzc\" (UniqueName: \"kubernetes.io/projected/4e0294f1-0f7c-47f2-b81d-9e71231f19aa-kube-api-access-88wzc\") pod \"3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc\" (UID: \"4e0294f1-0f7c-47f2-b81d-9e71231f19aa\") " pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.464489 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f"]
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.531282 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.535430 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08fa9eb6-01e9-4352-88bf-0302af5811b7-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f\" (UID: \"08fa9eb6-01e9-4352-88bf-0302af5811b7\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.535479 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hs5hr\" (UniqueName: \"kubernetes.io/projected/08fa9eb6-01e9-4352-88bf-0302af5811b7-kube-api-access-hs5hr\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f\" (UID: \"08fa9eb6-01e9-4352-88bf-0302af5811b7\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.535513 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08fa9eb6-01e9-4352-88bf-0302af5811b7-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f\" (UID: \"08fa9eb6-01e9-4352-88bf-0302af5811b7\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.639998 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08fa9eb6-01e9-4352-88bf-0302af5811b7-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f\" (UID: \"08fa9eb6-01e9-4352-88bf-0302af5811b7\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.640378 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hs5hr\" (UniqueName: \"kubernetes.io/projected/08fa9eb6-01e9-4352-88bf-0302af5811b7-kube-api-access-hs5hr\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f\" (UID: \"08fa9eb6-01e9-4352-88bf-0302af5811b7\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.640417 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08fa9eb6-01e9-4352-88bf-0302af5811b7-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f\" (UID: \"08fa9eb6-01e9-4352-88bf-0302af5811b7\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f"
Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 
13:35:43.641060 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08fa9eb6-01e9-4352-88bf-0302af5811b7-util\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f\" (UID: \"08fa9eb6-01e9-4352-88bf-0302af5811b7\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f" Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.641289 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08fa9eb6-01e9-4352-88bf-0302af5811b7-bundle\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f\" (UID: \"08fa9eb6-01e9-4352-88bf-0302af5811b7\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f" Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.678187 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hs5hr\" (UniqueName: \"kubernetes.io/projected/08fa9eb6-01e9-4352-88bf-0302af5811b7-kube-api-access-hs5hr\") pod \"4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f\" (UID: \"08fa9eb6-01e9-4352-88bf-0302af5811b7\") " pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f" Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.800513 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f" Mar 20 13:35:43 crc kubenswrapper[4973]: I0320 13:35:43.985373 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc"] Mar 20 13:35:44 crc kubenswrapper[4973]: I0320 13:35:44.042266 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f"] Mar 20 13:35:44 crc kubenswrapper[4973]: I0320 13:35:44.299307 4973 generic.go:334] "Generic (PLEG): container finished" podID="08fa9eb6-01e9-4352-88bf-0302af5811b7" containerID="797592d8d91f14cd1f754bcbf7f72edeb7acc1030a935dba774c58ed0319081b" exitCode=0 Mar 20 13:35:44 crc kubenswrapper[4973]: I0320 13:35:44.299406 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f" event={"ID":"08fa9eb6-01e9-4352-88bf-0302af5811b7","Type":"ContainerDied","Data":"797592d8d91f14cd1f754bcbf7f72edeb7acc1030a935dba774c58ed0319081b"} Mar 20 13:35:44 crc kubenswrapper[4973]: I0320 13:35:44.299720 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f" event={"ID":"08fa9eb6-01e9-4352-88bf-0302af5811b7","Type":"ContainerStarted","Data":"84f027b31c33d2d063fad8c9417812d9a43235bdee2e515ee1c84f744167cb56"} Mar 20 13:35:44 crc kubenswrapper[4973]: I0320 13:35:44.306509 4973 generic.go:334] "Generic (PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="26d2f8d0ba44652122f03dbe7cb2777fe726b59947a9f579b0a01f84b56a0f0a" exitCode=0 Mar 20 13:35:44 crc kubenswrapper[4973]: I0320 13:35:44.306591 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" 
event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"26d2f8d0ba44652122f03dbe7cb2777fe726b59947a9f579b0a01f84b56a0f0a"} Mar 20 13:35:44 crc kubenswrapper[4973]: I0320 13:35:44.306656 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"f898eb1e5a0799379b7d7bcd473943134d5addff2e02fbb8f8a3d4d7eb5c66a6"} Mar 20 13:35:44 crc kubenswrapper[4973]: I0320 13:35:44.306680 4973 scope.go:117] "RemoveContainer" containerID="ae703893e0e60a1cb59d74cc5e33372631d6542f4ae1b15e831e0084a728c10a" Mar 20 13:35:44 crc kubenswrapper[4973]: I0320 13:35:44.310058 4973 generic.go:334] "Generic (PLEG): container finished" podID="4e0294f1-0f7c-47f2-b81d-9e71231f19aa" containerID="7f1b1f17cc16bbba0cfef999350c87c1a94442c960200acbba6e96f6466c8d2d" exitCode=0 Mar 20 13:35:44 crc kubenswrapper[4973]: I0320 13:35:44.310117 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc" event={"ID":"4e0294f1-0f7c-47f2-b81d-9e71231f19aa","Type":"ContainerDied","Data":"7f1b1f17cc16bbba0cfef999350c87c1a94442c960200acbba6e96f6466c8d2d"} Mar 20 13:35:44 crc kubenswrapper[4973]: I0320 13:35:44.310143 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc" event={"ID":"4e0294f1-0f7c-47f2-b81d-9e71231f19aa","Type":"ContainerStarted","Data":"6c421655295bf85a13b4948e4c5128a5b82b13b5c9cb5867378712a9e52e9627"} Mar 20 13:35:46 crc kubenswrapper[4973]: I0320 13:35:46.326782 4973 generic.go:334] "Generic (PLEG): container finished" podID="08fa9eb6-01e9-4352-88bf-0302af5811b7" containerID="054a4b4935a2b5f1b32d33fb46055076adedb93ccb3a8681c8885994066aca0c" exitCode=0 Mar 20 13:35:46 crc kubenswrapper[4973]: I0320 13:35:46.326880 4973 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f" event={"ID":"08fa9eb6-01e9-4352-88bf-0302af5811b7","Type":"ContainerDied","Data":"054a4b4935a2b5f1b32d33fb46055076adedb93ccb3a8681c8885994066aca0c"} Mar 20 13:35:46 crc kubenswrapper[4973]: I0320 13:35:46.331467 4973 generic.go:334] "Generic (PLEG): container finished" podID="4e0294f1-0f7c-47f2-b81d-9e71231f19aa" containerID="12846a0eb2a6d11c5573740cdf9f02915d222fe09b6bc4cfe9f0c4cc59682d40" exitCode=0 Mar 20 13:35:46 crc kubenswrapper[4973]: I0320 13:35:46.331505 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc" event={"ID":"4e0294f1-0f7c-47f2-b81d-9e71231f19aa","Type":"ContainerDied","Data":"12846a0eb2a6d11c5573740cdf9f02915d222fe09b6bc4cfe9f0c4cc59682d40"} Mar 20 13:35:46 crc kubenswrapper[4973]: I0320 13:35:46.949040 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-59gsb"] Mar 20 13:35:46 crc kubenswrapper[4973]: I0320 13:35:46.951967 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-59gsb" Mar 20 13:35:46 crc kubenswrapper[4973]: I0320 13:35:46.984300 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-59gsb"] Mar 20 13:35:46 crc kubenswrapper[4973]: I0320 13:35:46.999448 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cwxvs\" (UniqueName: \"kubernetes.io/projected/fff3d776-80b1-4222-885c-d4a0c4a67ba0-kube-api-access-cwxvs\") pod \"redhat-operators-59gsb\" (UID: \"fff3d776-80b1-4222-885c-d4a0c4a67ba0\") " pod="openshift-marketplace/redhat-operators-59gsb" Mar 20 13:35:46 crc kubenswrapper[4973]: I0320 13:35:46.999637 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff3d776-80b1-4222-885c-d4a0c4a67ba0-catalog-content\") pod \"redhat-operators-59gsb\" (UID: \"fff3d776-80b1-4222-885c-d4a0c4a67ba0\") " pod="openshift-marketplace/redhat-operators-59gsb" Mar 20 13:35:46 crc kubenswrapper[4973]: I0320 13:35:46.999724 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff3d776-80b1-4222-885c-d4a0c4a67ba0-utilities\") pod \"redhat-operators-59gsb\" (UID: \"fff3d776-80b1-4222-885c-d4a0c4a67ba0\") " pod="openshift-marketplace/redhat-operators-59gsb" Mar 20 13:35:47 crc kubenswrapper[4973]: I0320 13:35:47.101142 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff3d776-80b1-4222-885c-d4a0c4a67ba0-catalog-content\") pod \"redhat-operators-59gsb\" (UID: \"fff3d776-80b1-4222-885c-d4a0c4a67ba0\") " pod="openshift-marketplace/redhat-operators-59gsb" Mar 20 13:35:47 crc kubenswrapper[4973]: I0320 13:35:47.101686 4973 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff3d776-80b1-4222-885c-d4a0c4a67ba0-utilities\") pod \"redhat-operators-59gsb\" (UID: \"fff3d776-80b1-4222-885c-d4a0c4a67ba0\") " pod="openshift-marketplace/redhat-operators-59gsb" Mar 20 13:35:47 crc kubenswrapper[4973]: I0320 13:35:47.101722 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cwxvs\" (UniqueName: \"kubernetes.io/projected/fff3d776-80b1-4222-885c-d4a0c4a67ba0-kube-api-access-cwxvs\") pod \"redhat-operators-59gsb\" (UID: \"fff3d776-80b1-4222-885c-d4a0c4a67ba0\") " pod="openshift-marketplace/redhat-operators-59gsb" Mar 20 13:35:47 crc kubenswrapper[4973]: I0320 13:35:47.102799 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff3d776-80b1-4222-885c-d4a0c4a67ba0-catalog-content\") pod \"redhat-operators-59gsb\" (UID: \"fff3d776-80b1-4222-885c-d4a0c4a67ba0\") " pod="openshift-marketplace/redhat-operators-59gsb" Mar 20 13:35:47 crc kubenswrapper[4973]: I0320 13:35:47.103086 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff3d776-80b1-4222-885c-d4a0c4a67ba0-utilities\") pod \"redhat-operators-59gsb\" (UID: \"fff3d776-80b1-4222-885c-d4a0c4a67ba0\") " pod="openshift-marketplace/redhat-operators-59gsb" Mar 20 13:35:47 crc kubenswrapper[4973]: I0320 13:35:47.127229 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cwxvs\" (UniqueName: \"kubernetes.io/projected/fff3d776-80b1-4222-885c-d4a0c4a67ba0-kube-api-access-cwxvs\") pod \"redhat-operators-59gsb\" (UID: \"fff3d776-80b1-4222-885c-d4a0c4a67ba0\") " pod="openshift-marketplace/redhat-operators-59gsb" Mar 20 13:35:47 crc kubenswrapper[4973]: I0320 13:35:47.271971 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-59gsb" Mar 20 13:35:47 crc kubenswrapper[4973]: I0320 13:35:47.659821 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-59gsb"] Mar 20 13:35:47 crc kubenswrapper[4973]: W0320 13:35:47.703008 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfff3d776_80b1_4222_885c_d4a0c4a67ba0.slice/crio-2d72c0719eabeb31d206124f4a543e1e4ade13166cc5fe0c79729a50b16d909b WatchSource:0}: Error finding container 2d72c0719eabeb31d206124f4a543e1e4ade13166cc5fe0c79729a50b16d909b: Status 404 returned error can't find the container with id 2d72c0719eabeb31d206124f4a543e1e4ade13166cc5fe0c79729a50b16d909b Mar 20 13:35:48 crc kubenswrapper[4973]: I0320 13:35:48.350508 4973 generic.go:334] "Generic (PLEG): container finished" podID="08fa9eb6-01e9-4352-88bf-0302af5811b7" containerID="f86f16b6a49834c55fd81cc6da4ade23ff9a7ada056d1e31bda15c72c7ae05d1" exitCode=0 Mar 20 13:35:48 crc kubenswrapper[4973]: I0320 13:35:48.350591 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f" event={"ID":"08fa9eb6-01e9-4352-88bf-0302af5811b7","Type":"ContainerDied","Data":"f86f16b6a49834c55fd81cc6da4ade23ff9a7ada056d1e31bda15c72c7ae05d1"} Mar 20 13:35:48 crc kubenswrapper[4973]: I0320 13:35:48.353013 4973 generic.go:334] "Generic (PLEG): container finished" podID="4e0294f1-0f7c-47f2-b81d-9e71231f19aa" containerID="6f140d17857ecffbbea7daef48aa08ba48aa9c584a9943ce71ae647c0ab8441a" exitCode=0 Mar 20 13:35:48 crc kubenswrapper[4973]: I0320 13:35:48.353043 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc" 
event={"ID":"4e0294f1-0f7c-47f2-b81d-9e71231f19aa","Type":"ContainerDied","Data":"6f140d17857ecffbbea7daef48aa08ba48aa9c584a9943ce71ae647c0ab8441a"} Mar 20 13:35:48 crc kubenswrapper[4973]: I0320 13:35:48.354941 4973 generic.go:334] "Generic (PLEG): container finished" podID="fff3d776-80b1-4222-885c-d4a0c4a67ba0" containerID="2e7b03ec1b608cd2945fa5f59b7583b4af5360a781f8b407f8e6a66e89f75947" exitCode=0 Mar 20 13:35:48 crc kubenswrapper[4973]: I0320 13:35:48.355003 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59gsb" event={"ID":"fff3d776-80b1-4222-885c-d4a0c4a67ba0","Type":"ContainerDied","Data":"2e7b03ec1b608cd2945fa5f59b7583b4af5360a781f8b407f8e6a66e89f75947"} Mar 20 13:35:48 crc kubenswrapper[4973]: I0320 13:35:48.355031 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59gsb" event={"ID":"fff3d776-80b1-4222-885c-d4a0c4a67ba0","Type":"ContainerStarted","Data":"2d72c0719eabeb31d206124f4a543e1e4ade13166cc5fe0c79729a50b16d909b"} Mar 20 13:35:49 crc kubenswrapper[4973]: I0320 13:35:49.363184 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59gsb" event={"ID":"fff3d776-80b1-4222-885c-d4a0c4a67ba0","Type":"ContainerStarted","Data":"9dc4cda201daf66b8fbb1c0eb2b0db29cc04e751e77bcc943c568343e2fabf6c"} Mar 20 13:35:49 crc kubenswrapper[4973]: I0320 13:35:49.751447 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc" Mar 20 13:35:49 crc kubenswrapper[4973]: I0320 13:35:49.755170 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f" Mar 20 13:35:49 crc kubenswrapper[4973]: I0320 13:35:49.941862 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08fa9eb6-01e9-4352-88bf-0302af5811b7-util\") pod \"08fa9eb6-01e9-4352-88bf-0302af5811b7\" (UID: \"08fa9eb6-01e9-4352-88bf-0302af5811b7\") " Mar 20 13:35:49 crc kubenswrapper[4973]: I0320 13:35:49.941933 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08fa9eb6-01e9-4352-88bf-0302af5811b7-bundle\") pod \"08fa9eb6-01e9-4352-88bf-0302af5811b7\" (UID: \"08fa9eb6-01e9-4352-88bf-0302af5811b7\") " Mar 20 13:35:49 crc kubenswrapper[4973]: I0320 13:35:49.942016 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hs5hr\" (UniqueName: \"kubernetes.io/projected/08fa9eb6-01e9-4352-88bf-0302af5811b7-kube-api-access-hs5hr\") pod \"08fa9eb6-01e9-4352-88bf-0302af5811b7\" (UID: \"08fa9eb6-01e9-4352-88bf-0302af5811b7\") " Mar 20 13:35:49 crc kubenswrapper[4973]: I0320 13:35:49.942037 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e0294f1-0f7c-47f2-b81d-9e71231f19aa-util\") pod \"4e0294f1-0f7c-47f2-b81d-9e71231f19aa\" (UID: \"4e0294f1-0f7c-47f2-b81d-9e71231f19aa\") " Mar 20 13:35:49 crc kubenswrapper[4973]: I0320 13:35:49.942093 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e0294f1-0f7c-47f2-b81d-9e71231f19aa-bundle\") pod \"4e0294f1-0f7c-47f2-b81d-9e71231f19aa\" (UID: \"4e0294f1-0f7c-47f2-b81d-9e71231f19aa\") " Mar 20 13:35:49 crc kubenswrapper[4973]: I0320 13:35:49.942109 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-88wzc\" (UniqueName: \"kubernetes.io/projected/4e0294f1-0f7c-47f2-b81d-9e71231f19aa-kube-api-access-88wzc\") pod \"4e0294f1-0f7c-47f2-b81d-9e71231f19aa\" (UID: \"4e0294f1-0f7c-47f2-b81d-9e71231f19aa\") " Mar 20 13:35:49 crc kubenswrapper[4973]: I0320 13:35:49.942893 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08fa9eb6-01e9-4352-88bf-0302af5811b7-bundle" (OuterVolumeSpecName: "bundle") pod "08fa9eb6-01e9-4352-88bf-0302af5811b7" (UID: "08fa9eb6-01e9-4352-88bf-0302af5811b7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:35:49 crc kubenswrapper[4973]: I0320 13:35:49.944089 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e0294f1-0f7c-47f2-b81d-9e71231f19aa-bundle" (OuterVolumeSpecName: "bundle") pod "4e0294f1-0f7c-47f2-b81d-9e71231f19aa" (UID: "4e0294f1-0f7c-47f2-b81d-9e71231f19aa"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:35:49 crc kubenswrapper[4973]: I0320 13:35:49.947444 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4e0294f1-0f7c-47f2-b81d-9e71231f19aa-kube-api-access-88wzc" (OuterVolumeSpecName: "kube-api-access-88wzc") pod "4e0294f1-0f7c-47f2-b81d-9e71231f19aa" (UID: "4e0294f1-0f7c-47f2-b81d-9e71231f19aa"). InnerVolumeSpecName "kube-api-access-88wzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:35:49 crc kubenswrapper[4973]: I0320 13:35:49.948437 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08fa9eb6-01e9-4352-88bf-0302af5811b7-kube-api-access-hs5hr" (OuterVolumeSpecName: "kube-api-access-hs5hr") pod "08fa9eb6-01e9-4352-88bf-0302af5811b7" (UID: "08fa9eb6-01e9-4352-88bf-0302af5811b7"). InnerVolumeSpecName "kube-api-access-hs5hr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:35:50 crc kubenswrapper[4973]: I0320 13:35:50.038652 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08fa9eb6-01e9-4352-88bf-0302af5811b7-util" (OuterVolumeSpecName: "util") pod "08fa9eb6-01e9-4352-88bf-0302af5811b7" (UID: "08fa9eb6-01e9-4352-88bf-0302af5811b7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:35:50 crc kubenswrapper[4973]: I0320 13:35:50.042756 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4e0294f1-0f7c-47f2-b81d-9e71231f19aa-util" (OuterVolumeSpecName: "util") pod "4e0294f1-0f7c-47f2-b81d-9e71231f19aa" (UID: "4e0294f1-0f7c-47f2-b81d-9e71231f19aa"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:35:50 crc kubenswrapper[4973]: I0320 13:35:50.043836 4973 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/08fa9eb6-01e9-4352-88bf-0302af5811b7-util\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:50 crc kubenswrapper[4973]: I0320 13:35:50.043866 4973 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/08fa9eb6-01e9-4352-88bf-0302af5811b7-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:50 crc kubenswrapper[4973]: I0320 13:35:50.043880 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hs5hr\" (UniqueName: \"kubernetes.io/projected/08fa9eb6-01e9-4352-88bf-0302af5811b7-kube-api-access-hs5hr\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:50 crc kubenswrapper[4973]: I0320 13:35:50.043894 4973 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/4e0294f1-0f7c-47f2-b81d-9e71231f19aa-util\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:50 crc kubenswrapper[4973]: I0320 13:35:50.043909 4973 
reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/4e0294f1-0f7c-47f2-b81d-9e71231f19aa-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:50 crc kubenswrapper[4973]: I0320 13:35:50.043926 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-88wzc\" (UniqueName: \"kubernetes.io/projected/4e0294f1-0f7c-47f2-b81d-9e71231f19aa-kube-api-access-88wzc\") on node \"crc\" DevicePath \"\"" Mar 20 13:35:50 crc kubenswrapper[4973]: I0320 13:35:50.371802 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f" event={"ID":"08fa9eb6-01e9-4352-88bf-0302af5811b7","Type":"ContainerDied","Data":"84f027b31c33d2d063fad8c9417812d9a43235bdee2e515ee1c84f744167cb56"} Mar 20 13:35:50 crc kubenswrapper[4973]: I0320 13:35:50.371871 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84f027b31c33d2d063fad8c9417812d9a43235bdee2e515ee1c84f744167cb56" Mar 20 13:35:50 crc kubenswrapper[4973]: I0320 13:35:50.371819 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f" Mar 20 13:35:50 crc kubenswrapper[4973]: I0320 13:35:50.378474 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc" event={"ID":"4e0294f1-0f7c-47f2-b81d-9e71231f19aa","Type":"ContainerDied","Data":"6c421655295bf85a13b4948e4c5128a5b82b13b5c9cb5867378712a9e52e9627"} Mar 20 13:35:50 crc kubenswrapper[4973]: I0320 13:35:50.378509 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c421655295bf85a13b4948e4c5128a5b82b13b5c9cb5867378712a9e52e9627" Mar 20 13:35:50 crc kubenswrapper[4973]: I0320 13:35:50.378485 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc" Mar 20 13:35:50 crc kubenswrapper[4973]: I0320 13:35:50.381047 4973 generic.go:334] "Generic (PLEG): container finished" podID="fff3d776-80b1-4222-885c-d4a0c4a67ba0" containerID="9dc4cda201daf66b8fbb1c0eb2b0db29cc04e751e77bcc943c568343e2fabf6c" exitCode=0 Mar 20 13:35:50 crc kubenswrapper[4973]: I0320 13:35:50.381082 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59gsb" event={"ID":"fff3d776-80b1-4222-885c-d4a0c4a67ba0","Type":"ContainerDied","Data":"9dc4cda201daf66b8fbb1c0eb2b0db29cc04e751e77bcc943c568343e2fabf6c"} Mar 20 13:35:51 crc kubenswrapper[4973]: I0320 13:35:51.389107 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59gsb" event={"ID":"fff3d776-80b1-4222-885c-d4a0c4a67ba0","Type":"ContainerStarted","Data":"d7c3dc36287e711f95f353e0e03e5c3b2c7f2095be4673f5c06c0f3870150392"} Mar 20 13:35:51 crc kubenswrapper[4973]: I0320 13:35:51.410439 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-59gsb" podStartSLOduration=2.984319689 podStartE2EDuration="5.410423082s" podCreationTimestamp="2026-03-20 13:35:46 +0000 UTC" firstStartedPulling="2026-03-20 13:35:48.356285305 +0000 UTC m=+869.099955049" lastFinishedPulling="2026-03-20 13:35:50.782388708 +0000 UTC m=+871.526058442" observedRunningTime="2026-03-20 13:35:51.406818202 +0000 UTC m=+872.150487946" watchObservedRunningTime="2026-03-20 13:35:51.410423082 +0000 UTC m=+872.154092836" Mar 20 13:35:57 crc kubenswrapper[4973]: I0320 13:35:57.272431 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-59gsb" Mar 20 13:35:57 crc kubenswrapper[4973]: I0320 13:35:57.274217 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/redhat-operators-59gsb" Mar 20 13:35:57 crc kubenswrapper[4973]: I0320 13:35:57.319652 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-59gsb" Mar 20 13:35:57 crc kubenswrapper[4973]: I0320 13:35:57.478808 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-59gsb" Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.160840 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566896-zmpzm"] Mar 20 13:36:00 crc kubenswrapper[4973]: E0320 13:36:00.161430 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e0294f1-0f7c-47f2-b81d-9e71231f19aa" containerName="pull" Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.161444 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e0294f1-0f7c-47f2-b81d-9e71231f19aa" containerName="pull" Mar 20 13:36:00 crc kubenswrapper[4973]: E0320 13:36:00.161456 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08fa9eb6-01e9-4352-88bf-0302af5811b7" containerName="extract" Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.161464 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="08fa9eb6-01e9-4352-88bf-0302af5811b7" containerName="extract" Mar 20 13:36:00 crc kubenswrapper[4973]: E0320 13:36:00.161476 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e0294f1-0f7c-47f2-b81d-9e71231f19aa" containerName="extract" Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.161484 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e0294f1-0f7c-47f2-b81d-9e71231f19aa" containerName="extract" Mar 20 13:36:00 crc kubenswrapper[4973]: E0320 13:36:00.161498 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08fa9eb6-01e9-4352-88bf-0302af5811b7" containerName="pull" Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.161505 4973 
state_mem.go:107] "Deleted CPUSet assignment" podUID="08fa9eb6-01e9-4352-88bf-0302af5811b7" containerName="pull" Mar 20 13:36:00 crc kubenswrapper[4973]: E0320 13:36:00.161516 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4e0294f1-0f7c-47f2-b81d-9e71231f19aa" containerName="util" Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.161523 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4e0294f1-0f7c-47f2-b81d-9e71231f19aa" containerName="util" Mar 20 13:36:00 crc kubenswrapper[4973]: E0320 13:36:00.161537 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08fa9eb6-01e9-4352-88bf-0302af5811b7" containerName="util" Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.161545 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="08fa9eb6-01e9-4352-88bf-0302af5811b7" containerName="util" Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.161689 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="08fa9eb6-01e9-4352-88bf-0302af5811b7" containerName="extract" Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.161709 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="4e0294f1-0f7c-47f2-b81d-9e71231f19aa" containerName="extract" Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.162200 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-zmpzm" Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.168225 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.168504 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.168993 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.194125 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-zmpzm"] Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.287812 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wxmg\" (UniqueName: \"kubernetes.io/projected/578dfffc-6f28-4ee3-9635-5dd5efe7be69-kube-api-access-8wxmg\") pod \"auto-csr-approver-29566896-zmpzm\" (UID: \"578dfffc-6f28-4ee3-9635-5dd5efe7be69\") " pod="openshift-infra/auto-csr-approver-29566896-zmpzm" Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.389332 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wxmg\" (UniqueName: \"kubernetes.io/projected/578dfffc-6f28-4ee3-9635-5dd5efe7be69-kube-api-access-8wxmg\") pod \"auto-csr-approver-29566896-zmpzm\" (UID: \"578dfffc-6f28-4ee3-9635-5dd5efe7be69\") " pod="openshift-infra/auto-csr-approver-29566896-zmpzm" Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.419318 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wxmg\" (UniqueName: \"kubernetes.io/projected/578dfffc-6f28-4ee3-9635-5dd5efe7be69-kube-api-access-8wxmg\") pod \"auto-csr-approver-29566896-zmpzm\" (UID: \"578dfffc-6f28-4ee3-9635-5dd5efe7be69\") " 
pod="openshift-infra/auto-csr-approver-29566896-zmpzm" Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.494194 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-zmpzm" Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.531524 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-59gsb"] Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.531796 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-59gsb" podUID="fff3d776-80b1-4222-885c-d4a0c4a67ba0" containerName="registry-server" containerID="cri-o://d7c3dc36287e711f95f353e0e03e5c3b2c7f2095be4673f5c06c0f3870150392" gracePeriod=2 Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.914879 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-59gsb" Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.988925 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-zmpzm"] Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.999090 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff3d776-80b1-4222-885c-d4a0c4a67ba0-catalog-content\") pod \"fff3d776-80b1-4222-885c-d4a0c4a67ba0\" (UID: \"fff3d776-80b1-4222-885c-d4a0c4a67ba0\") " Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.999142 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff3d776-80b1-4222-885c-d4a0c4a67ba0-utilities\") pod \"fff3d776-80b1-4222-885c-d4a0c4a67ba0\" (UID: \"fff3d776-80b1-4222-885c-d4a0c4a67ba0\") " Mar 20 13:36:00 crc kubenswrapper[4973]: I0320 13:36:00.999259 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-cwxvs\" (UniqueName: \"kubernetes.io/projected/fff3d776-80b1-4222-885c-d4a0c4a67ba0-kube-api-access-cwxvs\") pod \"fff3d776-80b1-4222-885c-d4a0c4a67ba0\" (UID: \"fff3d776-80b1-4222-885c-d4a0c4a67ba0\") " Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.000241 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fff3d776-80b1-4222-885c-d4a0c4a67ba0-utilities" (OuterVolumeSpecName: "utilities") pod "fff3d776-80b1-4222-885c-d4a0c4a67ba0" (UID: "fff3d776-80b1-4222-885c-d4a0c4a67ba0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.003901 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fff3d776-80b1-4222-885c-d4a0c4a67ba0-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.004745 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fff3d776-80b1-4222-885c-d4a0c4a67ba0-kube-api-access-cwxvs" (OuterVolumeSpecName: "kube-api-access-cwxvs") pod "fff3d776-80b1-4222-885c-d4a0c4a67ba0" (UID: "fff3d776-80b1-4222-885c-d4a0c4a67ba0"). InnerVolumeSpecName "kube-api-access-cwxvs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.058982 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw"] Mar 20 13:36:01 crc kubenswrapper[4973]: E0320 13:36:01.059222 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff3d776-80b1-4222-885c-d4a0c4a67ba0" containerName="registry-server" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.059233 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff3d776-80b1-4222-885c-d4a0c4a67ba0" containerName="registry-server" Mar 20 13:36:01 crc kubenswrapper[4973]: E0320 13:36:01.059244 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff3d776-80b1-4222-885c-d4a0c4a67ba0" containerName="extract-content" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.059250 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff3d776-80b1-4222-885c-d4a0c4a67ba0" containerName="extract-content" Mar 20 13:36:01 crc kubenswrapper[4973]: E0320 13:36:01.059275 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fff3d776-80b1-4222-885c-d4a0c4a67ba0" containerName="extract-utilities" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.059282 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="fff3d776-80b1-4222-885c-d4a0c4a67ba0" containerName="extract-utilities" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.059472 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="fff3d776-80b1-4222-885c-d4a0c4a67ba0" containerName="registry-server" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.060187 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.062547 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-metrics" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.063934 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"kube-root-ca.crt" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.064024 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"openshift-service-ca.crt" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.064124 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-service-cert" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.064196 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators-redhat"/"loki-operator-manager-config" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.070488 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators-redhat"/"loki-operator-controller-manager-dockercfg-b6vsm" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.079739 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw"] Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.105179 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cwxvs\" (UniqueName: \"kubernetes.io/projected/fff3d776-80b1-4222-885c-d4a0c4a67ba0-kube-api-access-cwxvs\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.149574 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fff3d776-80b1-4222-885c-d4a0c4a67ba0-catalog-content" (OuterVolumeSpecName: 
"catalog-content") pod "fff3d776-80b1-4222-885c-d4a0c4a67ba0" (UID: "fff3d776-80b1-4222-885c-d4a0c4a67ba0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.206448 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3bf2a551-4944-4096-99f4-03effa26dde8-apiservice-cert\") pod \"loki-operator-controller-manager-6996757d8d-46qmw\" (UID: \"3bf2a551-4944-4096-99f4-03effa26dde8\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.207026 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqdk4\" (UniqueName: \"kubernetes.io/projected/3bf2a551-4944-4096-99f4-03effa26dde8-kube-api-access-kqdk4\") pod \"loki-operator-controller-manager-6996757d8d-46qmw\" (UID: \"3bf2a551-4944-4096-99f4-03effa26dde8\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.207084 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/3bf2a551-4944-4096-99f4-03effa26dde8-manager-config\") pod \"loki-operator-controller-manager-6996757d8d-46qmw\" (UID: \"3bf2a551-4944-4096-99f4-03effa26dde8\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.207120 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3bf2a551-4944-4096-99f4-03effa26dde8-webhook-cert\") pod \"loki-operator-controller-manager-6996757d8d-46qmw\" (UID: \"3bf2a551-4944-4096-99f4-03effa26dde8\") " 
pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.207196 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3bf2a551-4944-4096-99f4-03effa26dde8-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6996757d8d-46qmw\" (UID: \"3bf2a551-4944-4096-99f4-03effa26dde8\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.207268 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fff3d776-80b1-4222-885c-d4a0c4a67ba0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.308871 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3bf2a551-4944-4096-99f4-03effa26dde8-apiservice-cert\") pod \"loki-operator-controller-manager-6996757d8d-46qmw\" (UID: \"3bf2a551-4944-4096-99f4-03effa26dde8\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.308946 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqdk4\" (UniqueName: \"kubernetes.io/projected/3bf2a551-4944-4096-99f4-03effa26dde8-kube-api-access-kqdk4\") pod \"loki-operator-controller-manager-6996757d8d-46qmw\" (UID: \"3bf2a551-4944-4096-99f4-03effa26dde8\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.308986 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: 
\"kubernetes.io/configmap/3bf2a551-4944-4096-99f4-03effa26dde8-manager-config\") pod \"loki-operator-controller-manager-6996757d8d-46qmw\" (UID: \"3bf2a551-4944-4096-99f4-03effa26dde8\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.309016 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3bf2a551-4944-4096-99f4-03effa26dde8-webhook-cert\") pod \"loki-operator-controller-manager-6996757d8d-46qmw\" (UID: \"3bf2a551-4944-4096-99f4-03effa26dde8\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.309092 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3bf2a551-4944-4096-99f4-03effa26dde8-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6996757d8d-46qmw\" (UID: \"3bf2a551-4944-4096-99f4-03effa26dde8\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.310824 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/3bf2a551-4944-4096-99f4-03effa26dde8-manager-config\") pod \"loki-operator-controller-manager-6996757d8d-46qmw\" (UID: \"3bf2a551-4944-4096-99f4-03effa26dde8\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.314112 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3bf2a551-4944-4096-99f4-03effa26dde8-apiservice-cert\") pod \"loki-operator-controller-manager-6996757d8d-46qmw\" (UID: \"3bf2a551-4944-4096-99f4-03effa26dde8\") " 
pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.314841 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"loki-operator-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3bf2a551-4944-4096-99f4-03effa26dde8-loki-operator-metrics-cert\") pod \"loki-operator-controller-manager-6996757d8d-46qmw\" (UID: \"3bf2a551-4944-4096-99f4-03effa26dde8\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.330108 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3bf2a551-4944-4096-99f4-03effa26dde8-webhook-cert\") pod \"loki-operator-controller-manager-6996757d8d-46qmw\" (UID: \"3bf2a551-4944-4096-99f4-03effa26dde8\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.336436 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqdk4\" (UniqueName: \"kubernetes.io/projected/3bf2a551-4944-4096-99f4-03effa26dde8-kube-api-access-kqdk4\") pod \"loki-operator-controller-manager-6996757d8d-46qmw\" (UID: \"3bf2a551-4944-4096-99f4-03effa26dde8\") " pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.376012 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.454944 4973 generic.go:334] "Generic (PLEG): container finished" podID="fff3d776-80b1-4222-885c-d4a0c4a67ba0" containerID="d7c3dc36287e711f95f353e0e03e5c3b2c7f2095be4673f5c06c0f3870150392" exitCode=0 Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.455006 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59gsb" event={"ID":"fff3d776-80b1-4222-885c-d4a0c4a67ba0","Type":"ContainerDied","Data":"d7c3dc36287e711f95f353e0e03e5c3b2c7f2095be4673f5c06c0f3870150392"} Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.455036 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-59gsb" event={"ID":"fff3d776-80b1-4222-885c-d4a0c4a67ba0","Type":"ContainerDied","Data":"2d72c0719eabeb31d206124f4a543e1e4ade13166cc5fe0c79729a50b16d909b"} Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.455052 4973 scope.go:117] "RemoveContainer" containerID="d7c3dc36287e711f95f353e0e03e5c3b2c7f2095be4673f5c06c0f3870150392" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.455158 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-59gsb" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.458280 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566896-zmpzm" event={"ID":"578dfffc-6f28-4ee3-9635-5dd5efe7be69","Type":"ContainerStarted","Data":"4de4051485c8600d5c2a81667b1510ab076cf0842f880fd03d92bf10637ccc28"} Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.505051 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-59gsb"] Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.506305 4973 scope.go:117] "RemoveContainer" containerID="9dc4cda201daf66b8fbb1c0eb2b0db29cc04e751e77bcc943c568343e2fabf6c" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.513462 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-59gsb"] Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.527621 4973 scope.go:117] "RemoveContainer" containerID="2e7b03ec1b608cd2945fa5f59b7583b4af5360a781f8b407f8e6a66e89f75947" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.552585 4973 scope.go:117] "RemoveContainer" containerID="d7c3dc36287e711f95f353e0e03e5c3b2c7f2095be4673f5c06c0f3870150392" Mar 20 13:36:01 crc kubenswrapper[4973]: E0320 13:36:01.556492 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d7c3dc36287e711f95f353e0e03e5c3b2c7f2095be4673f5c06c0f3870150392\": container with ID starting with d7c3dc36287e711f95f353e0e03e5c3b2c7f2095be4673f5c06c0f3870150392 not found: ID does not exist" containerID="d7c3dc36287e711f95f353e0e03e5c3b2c7f2095be4673f5c06c0f3870150392" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.556517 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d7c3dc36287e711f95f353e0e03e5c3b2c7f2095be4673f5c06c0f3870150392"} err="failed to get 
container status \"d7c3dc36287e711f95f353e0e03e5c3b2c7f2095be4673f5c06c0f3870150392\": rpc error: code = NotFound desc = could not find container \"d7c3dc36287e711f95f353e0e03e5c3b2c7f2095be4673f5c06c0f3870150392\": container with ID starting with d7c3dc36287e711f95f353e0e03e5c3b2c7f2095be4673f5c06c0f3870150392 not found: ID does not exist" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.556538 4973 scope.go:117] "RemoveContainer" containerID="9dc4cda201daf66b8fbb1c0eb2b0db29cc04e751e77bcc943c568343e2fabf6c" Mar 20 13:36:01 crc kubenswrapper[4973]: E0320 13:36:01.558476 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dc4cda201daf66b8fbb1c0eb2b0db29cc04e751e77bcc943c568343e2fabf6c\": container with ID starting with 9dc4cda201daf66b8fbb1c0eb2b0db29cc04e751e77bcc943c568343e2fabf6c not found: ID does not exist" containerID="9dc4cda201daf66b8fbb1c0eb2b0db29cc04e751e77bcc943c568343e2fabf6c" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.558505 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dc4cda201daf66b8fbb1c0eb2b0db29cc04e751e77bcc943c568343e2fabf6c"} err="failed to get container status \"9dc4cda201daf66b8fbb1c0eb2b0db29cc04e751e77bcc943c568343e2fabf6c\": rpc error: code = NotFound desc = could not find container \"9dc4cda201daf66b8fbb1c0eb2b0db29cc04e751e77bcc943c568343e2fabf6c\": container with ID starting with 9dc4cda201daf66b8fbb1c0eb2b0db29cc04e751e77bcc943c568343e2fabf6c not found: ID does not exist" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.558521 4973 scope.go:117] "RemoveContainer" containerID="2e7b03ec1b608cd2945fa5f59b7583b4af5360a781f8b407f8e6a66e89f75947" Mar 20 13:36:01 crc kubenswrapper[4973]: E0320 13:36:01.558797 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2e7b03ec1b608cd2945fa5f59b7583b4af5360a781f8b407f8e6a66e89f75947\": container with ID starting with 2e7b03ec1b608cd2945fa5f59b7583b4af5360a781f8b407f8e6a66e89f75947 not found: ID does not exist" containerID="2e7b03ec1b608cd2945fa5f59b7583b4af5360a781f8b407f8e6a66e89f75947" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.558841 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e7b03ec1b608cd2945fa5f59b7583b4af5360a781f8b407f8e6a66e89f75947"} err="failed to get container status \"2e7b03ec1b608cd2945fa5f59b7583b4af5360a781f8b407f8e6a66e89f75947\": rpc error: code = NotFound desc = could not find container \"2e7b03ec1b608cd2945fa5f59b7583b4af5360a781f8b407f8e6a66e89f75947\": container with ID starting with 2e7b03ec1b608cd2945fa5f59b7583b4af5360a781f8b407f8e6a66e89f75947 not found: ID does not exist" Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.871913 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw"] Mar 20 13:36:01 crc kubenswrapper[4973]: I0320 13:36:01.956863 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fff3d776-80b1-4222-885c-d4a0c4a67ba0" path="/var/lib/kubelet/pods/fff3d776-80b1-4222-885c-d4a0c4a67ba0/volumes" Mar 20 13:36:02 crc kubenswrapper[4973]: I0320 13:36:02.485120 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" event={"ID":"3bf2a551-4944-4096-99f4-03effa26dde8","Type":"ContainerStarted","Data":"ffc08b746e451081ef26f655b7bb0d40dba4ec111ccee50469c7df5672fc662c"} Mar 20 13:36:02 crc kubenswrapper[4973]: I0320 13:36:02.779050 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-9zjgt"] Mar 20 13:36:02 crc kubenswrapper[4973]: I0320 13:36:02.779883 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/cluster-logging-operator-66689c4bbf-9zjgt" Mar 20 13:36:02 crc kubenswrapper[4973]: I0320 13:36:02.783137 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-9zjgt"] Mar 20 13:36:02 crc kubenswrapper[4973]: I0320 13:36:02.783266 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"cluster-logging-operator-dockercfg-4qt4d" Mar 20 13:36:02 crc kubenswrapper[4973]: I0320 13:36:02.783578 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"openshift-service-ca.crt" Mar 20 13:36:02 crc kubenswrapper[4973]: I0320 13:36:02.783770 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"kube-root-ca.crt" Mar 20 13:36:02 crc kubenswrapper[4973]: I0320 13:36:02.966193 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hz4g\" (UniqueName: \"kubernetes.io/projected/f1930ce4-8a2b-4ef3-bf3b-eb748e20f255-kube-api-access-5hz4g\") pod \"cluster-logging-operator-66689c4bbf-9zjgt\" (UID: \"f1930ce4-8a2b-4ef3-bf3b-eb748e20f255\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-9zjgt" Mar 20 13:36:03 crc kubenswrapper[4973]: I0320 13:36:03.067480 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5hz4g\" (UniqueName: \"kubernetes.io/projected/f1930ce4-8a2b-4ef3-bf3b-eb748e20f255-kube-api-access-5hz4g\") pod \"cluster-logging-operator-66689c4bbf-9zjgt\" (UID: \"f1930ce4-8a2b-4ef3-bf3b-eb748e20f255\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-9zjgt" Mar 20 13:36:03 crc kubenswrapper[4973]: I0320 13:36:03.097402 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hz4g\" (UniqueName: \"kubernetes.io/projected/f1930ce4-8a2b-4ef3-bf3b-eb748e20f255-kube-api-access-5hz4g\") pod 
\"cluster-logging-operator-66689c4bbf-9zjgt\" (UID: \"f1930ce4-8a2b-4ef3-bf3b-eb748e20f255\") " pod="openshift-logging/cluster-logging-operator-66689c4bbf-9zjgt" Mar 20 13:36:03 crc kubenswrapper[4973]: I0320 13:36:03.105858 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/cluster-logging-operator-66689c4bbf-9zjgt" Mar 20 13:36:03 crc kubenswrapper[4973]: I0320 13:36:03.497172 4973 generic.go:334] "Generic (PLEG): container finished" podID="578dfffc-6f28-4ee3-9635-5dd5efe7be69" containerID="ac4f59f9001d9e345861f200ad8dfcc9222083eb317def46c65718537696db7d" exitCode=0 Mar 20 13:36:03 crc kubenswrapper[4973]: I0320 13:36:03.497265 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566896-zmpzm" event={"ID":"578dfffc-6f28-4ee3-9635-5dd5efe7be69","Type":"ContainerDied","Data":"ac4f59f9001d9e345861f200ad8dfcc9222083eb317def46c65718537696db7d"} Mar 20 13:36:03 crc kubenswrapper[4973]: I0320 13:36:03.528805 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/cluster-logging-operator-66689c4bbf-9zjgt"] Mar 20 13:36:03 crc kubenswrapper[4973]: W0320 13:36:03.539220 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1930ce4_8a2b_4ef3_bf3b_eb748e20f255.slice/crio-e371af386761ca2f077666f54e639663f52c2d0d6994f9ade660f1919f968cb1 WatchSource:0}: Error finding container e371af386761ca2f077666f54e639663f52c2d0d6994f9ade660f1919f968cb1: Status 404 returned error can't find the container with id e371af386761ca2f077666f54e639663f52c2d0d6994f9ade660f1919f968cb1 Mar 20 13:36:04 crc kubenswrapper[4973]: I0320 13:36:04.503647 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-66689c4bbf-9zjgt" 
event={"ID":"f1930ce4-8a2b-4ef3-bf3b-eb748e20f255","Type":"ContainerStarted","Data":"e371af386761ca2f077666f54e639663f52c2d0d6994f9ade660f1919f968cb1"} Mar 20 13:36:05 crc kubenswrapper[4973]: I0320 13:36:05.810453 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-zmpzm" Mar 20 13:36:05 crc kubenswrapper[4973]: I0320 13:36:05.909639 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wxmg\" (UniqueName: \"kubernetes.io/projected/578dfffc-6f28-4ee3-9635-5dd5efe7be69-kube-api-access-8wxmg\") pod \"578dfffc-6f28-4ee3-9635-5dd5efe7be69\" (UID: \"578dfffc-6f28-4ee3-9635-5dd5efe7be69\") " Mar 20 13:36:05 crc kubenswrapper[4973]: I0320 13:36:05.927867 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/578dfffc-6f28-4ee3-9635-5dd5efe7be69-kube-api-access-8wxmg" (OuterVolumeSpecName: "kube-api-access-8wxmg") pod "578dfffc-6f28-4ee3-9635-5dd5efe7be69" (UID: "578dfffc-6f28-4ee3-9635-5dd5efe7be69"). InnerVolumeSpecName "kube-api-access-8wxmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:36:06 crc kubenswrapper[4973]: I0320 13:36:06.012095 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wxmg\" (UniqueName: \"kubernetes.io/projected/578dfffc-6f28-4ee3-9635-5dd5efe7be69-kube-api-access-8wxmg\") on node \"crc\" DevicePath \"\"" Mar 20 13:36:06 crc kubenswrapper[4973]: I0320 13:36:06.517913 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566896-zmpzm" event={"ID":"578dfffc-6f28-4ee3-9635-5dd5efe7be69","Type":"ContainerDied","Data":"4de4051485c8600d5c2a81667b1510ab076cf0842f880fd03d92bf10637ccc28"} Mar 20 13:36:06 crc kubenswrapper[4973]: I0320 13:36:06.518280 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4de4051485c8600d5c2a81667b1510ab076cf0842f880fd03d92bf10637ccc28" Mar 20 13:36:06 crc kubenswrapper[4973]: I0320 13:36:06.517957 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566896-zmpzm" Mar 20 13:36:06 crc kubenswrapper[4973]: I0320 13:36:06.874239 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566890-dchfr"] Mar 20 13:36:07 crc kubenswrapper[4973]: I0320 13:36:06.884176 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566890-dchfr"] Mar 20 13:36:07 crc kubenswrapper[4973]: I0320 13:36:07.959477 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a33dac4d-9a04-4e13-8c3e-fac7025b14ca" path="/var/lib/kubelet/pods/a33dac4d-9a04-4e13-8c3e-fac7025b14ca/volumes" Mar 20 13:36:08 crc kubenswrapper[4973]: I0320 13:36:08.541130 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" 
event={"ID":"3bf2a551-4944-4096-99f4-03effa26dde8","Type":"ContainerStarted","Data":"14239ec51114909ea5b623371fb07954295b02dd26735f0d51336e5950bd9e77"} Mar 20 13:36:13 crc kubenswrapper[4973]: I0320 13:36:13.574299 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/cluster-logging-operator-66689c4bbf-9zjgt" event={"ID":"f1930ce4-8a2b-4ef3-bf3b-eb748e20f255","Type":"ContainerStarted","Data":"00667b0d5637ea6baf7ba3f8112ad3934fceb5ab9e4dbbbebb1e7c264ce18f41"} Mar 20 13:36:13 crc kubenswrapper[4973]: I0320 13:36:13.599865 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/cluster-logging-operator-66689c4bbf-9zjgt" podStartSLOduration=2.366855706 podStartE2EDuration="11.599839569s" podCreationTimestamp="2026-03-20 13:36:02 +0000 UTC" firstStartedPulling="2026-03-20 13:36:03.542185147 +0000 UTC m=+884.285854891" lastFinishedPulling="2026-03-20 13:36:12.77516901 +0000 UTC m=+893.518838754" observedRunningTime="2026-03-20 13:36:13.592444972 +0000 UTC m=+894.336114716" watchObservedRunningTime="2026-03-20 13:36:13.599839569 +0000 UTC m=+894.343509313" Mar 20 13:36:18 crc kubenswrapper[4973]: I0320 13:36:18.634143 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" event={"ID":"3bf2a551-4944-4096-99f4-03effa26dde8","Type":"ContainerStarted","Data":"a2d26b095a2c9b3628e989f3f01187663c3ead4e9913ac4400032823fbb6d6ea"} Mar 20 13:36:18 crc kubenswrapper[4973]: I0320 13:36:18.635676 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" Mar 20 13:36:18 crc kubenswrapper[4973]: I0320 13:36:18.636970 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" Mar 20 13:36:18 crc kubenswrapper[4973]: I0320 13:36:18.662562 4973 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" podStartSLOduration=1.896896202 podStartE2EDuration="17.662543752s" podCreationTimestamp="2026-03-20 13:36:01 +0000 UTC" firstStartedPulling="2026-03-20 13:36:01.880371052 +0000 UTC m=+882.624040796" lastFinishedPulling="2026-03-20 13:36:17.646018602 +0000 UTC m=+898.389688346" observedRunningTime="2026-03-20 13:36:18.657357669 +0000 UTC m=+899.401027413" watchObservedRunningTime="2026-03-20 13:36:18.662543752 +0000 UTC m=+899.406213496" Mar 20 13:36:23 crc kubenswrapper[4973]: I0320 13:36:23.692049 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["minio-dev/minio"] Mar 20 13:36:23 crc kubenswrapper[4973]: E0320 13:36:23.693091 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="578dfffc-6f28-4ee3-9635-5dd5efe7be69" containerName="oc" Mar 20 13:36:23 crc kubenswrapper[4973]: I0320 13:36:23.693119 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="578dfffc-6f28-4ee3-9635-5dd5efe7be69" containerName="oc" Mar 20 13:36:23 crc kubenswrapper[4973]: I0320 13:36:23.693329 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="578dfffc-6f28-4ee3-9635-5dd5efe7be69" containerName="oc" Mar 20 13:36:23 crc kubenswrapper[4973]: I0320 13:36:23.694158 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Mar 20 13:36:23 crc kubenswrapper[4973]: I0320 13:36:23.697989 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"openshift-service-ca.crt" Mar 20 13:36:23 crc kubenswrapper[4973]: I0320 13:36:23.699490 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"minio-dev"/"kube-root-ca.crt" Mar 20 13:36:23 crc kubenswrapper[4973]: I0320 13:36:23.701412 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 20 13:36:23 crc kubenswrapper[4973]: I0320 13:36:23.836074 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npdbx\" (UniqueName: \"kubernetes.io/projected/45de16ba-2836-4915-964e-b6475afe9b00-kube-api-access-npdbx\") pod \"minio\" (UID: \"45de16ba-2836-4915-964e-b6475afe9b00\") " pod="minio-dev/minio" Mar 20 13:36:23 crc kubenswrapper[4973]: I0320 13:36:23.836576 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dbbb079e-6d7c-4ab2-9c97-0f288dc1adcf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dbbb079e-6d7c-4ab2-9c97-0f288dc1adcf\") pod \"minio\" (UID: \"45de16ba-2836-4915-964e-b6475afe9b00\") " pod="minio-dev/minio" Mar 20 13:36:23 crc kubenswrapper[4973]: I0320 13:36:23.939006 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dbbb079e-6d7c-4ab2-9c97-0f288dc1adcf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dbbb079e-6d7c-4ab2-9c97-0f288dc1adcf\") pod \"minio\" (UID: \"45de16ba-2836-4915-964e-b6475afe9b00\") " pod="minio-dev/minio" Mar 20 13:36:23 crc kubenswrapper[4973]: I0320 13:36:23.939102 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npdbx\" (UniqueName: \"kubernetes.io/projected/45de16ba-2836-4915-964e-b6475afe9b00-kube-api-access-npdbx\") pod \"minio\" (UID: 
\"45de16ba-2836-4915-964e-b6475afe9b00\") " pod="minio-dev/minio" Mar 20 13:36:23 crc kubenswrapper[4973]: I0320 13:36:23.942157 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:36:23 crc kubenswrapper[4973]: I0320 13:36:23.942196 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dbbb079e-6d7c-4ab2-9c97-0f288dc1adcf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dbbb079e-6d7c-4ab2-9c97-0f288dc1adcf\") pod \"minio\" (UID: \"45de16ba-2836-4915-964e-b6475afe9b00\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/a4a9885210d2527b8fd5dbe9d8113742c694ab2ad5b72890c779210f5b75c6f6/globalmount\"" pod="minio-dev/minio" Mar 20 13:36:23 crc kubenswrapper[4973]: I0320 13:36:23.967876 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npdbx\" (UniqueName: \"kubernetes.io/projected/45de16ba-2836-4915-964e-b6475afe9b00-kube-api-access-npdbx\") pod \"minio\" (UID: \"45de16ba-2836-4915-964e-b6475afe9b00\") " pod="minio-dev/minio" Mar 20 13:36:23 crc kubenswrapper[4973]: I0320 13:36:23.975655 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dbbb079e-6d7c-4ab2-9c97-0f288dc1adcf\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dbbb079e-6d7c-4ab2-9c97-0f288dc1adcf\") pod \"minio\" (UID: \"45de16ba-2836-4915-964e-b6475afe9b00\") " pod="minio-dev/minio" Mar 20 13:36:24 crc kubenswrapper[4973]: I0320 13:36:24.015178 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="minio-dev/minio" Mar 20 13:36:24 crc kubenswrapper[4973]: I0320 13:36:24.351673 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["minio-dev/minio"] Mar 20 13:36:24 crc kubenswrapper[4973]: I0320 13:36:24.667948 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"45de16ba-2836-4915-964e-b6475afe9b00","Type":"ContainerStarted","Data":"274821f1bbdf84ae35debace61011997a29fcd24a924b2bc25931f090af96e55"} Mar 20 13:36:28 crc kubenswrapper[4973]: I0320 13:36:28.694799 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="minio-dev/minio" event={"ID":"45de16ba-2836-4915-964e-b6475afe9b00","Type":"ContainerStarted","Data":"06903aac6c3287fbd32729130b6aa9d2215fe3df4d5903bb1274b6ec037ef6d2"} Mar 20 13:36:28 crc kubenswrapper[4973]: I0320 13:36:28.715274 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="minio-dev/minio" podStartSLOduration=4.236560642 podStartE2EDuration="7.715254817s" podCreationTimestamp="2026-03-20 13:36:21 +0000 UTC" firstStartedPulling="2026-03-20 13:36:24.375250584 +0000 UTC m=+905.118920338" lastFinishedPulling="2026-03-20 13:36:27.853944749 +0000 UTC m=+908.597614513" observedRunningTime="2026-03-20 13:36:28.71174808 +0000 UTC m=+909.455417844" watchObservedRunningTime="2026-03-20 13:36:28.715254817 +0000 UTC m=+909.458924561" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.319351 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp"] Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.320936 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.323258 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-http" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.323322 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-config" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.327835 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-dockercfg-drhl7" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.327846 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-ca-bundle" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.328079 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-distributor-grpc" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.334366 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp"] Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.414501 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/870a7fc7-0aac-45df-857e-dba72c60f80a-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-b5mdp\" (UID: \"870a7fc7-0aac-45df-857e-dba72c60f80a\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.414855 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870a7fc7-0aac-45df-857e-dba72c60f80a-config\") pod \"logging-loki-distributor-9c6b6d984-b5mdp\" (UID: \"870a7fc7-0aac-45df-857e-dba72c60f80a\") " 
pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.414912 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/870a7fc7-0aac-45df-857e-dba72c60f80a-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-b5mdp\" (UID: \"870a7fc7-0aac-45df-857e-dba72c60f80a\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.414970 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khgpc\" (UniqueName: \"kubernetes.io/projected/870a7fc7-0aac-45df-857e-dba72c60f80a-kube-api-access-khgpc\") pod \"logging-loki-distributor-9c6b6d984-b5mdp\" (UID: \"870a7fc7-0aac-45df-857e-dba72c60f80a\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.415016 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/870a7fc7-0aac-45df-857e-dba72c60f80a-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-b5mdp\" (UID: \"870a7fc7-0aac-45df-857e-dba72c60f80a\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.494924 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g"] Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.495855 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.497864 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-s3" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.498218 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-http" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.500264 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-querier-grpc" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.518658 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khgpc\" (UniqueName: \"kubernetes.io/projected/870a7fc7-0aac-45df-857e-dba72c60f80a-kube-api-access-khgpc\") pod \"logging-loki-distributor-9c6b6d984-b5mdp\" (UID: \"870a7fc7-0aac-45df-857e-dba72c60f80a\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.518730 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/870a7fc7-0aac-45df-857e-dba72c60f80a-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-b5mdp\" (UID: \"870a7fc7-0aac-45df-857e-dba72c60f80a\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.518793 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/870a7fc7-0aac-45df-857e-dba72c60f80a-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-b5mdp\" (UID: \"870a7fc7-0aac-45df-857e-dba72c60f80a\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.518815 4973 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870a7fc7-0aac-45df-857e-dba72c60f80a-config\") pod \"logging-loki-distributor-9c6b6d984-b5mdp\" (UID: \"870a7fc7-0aac-45df-857e-dba72c60f80a\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.518849 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/870a7fc7-0aac-45df-857e-dba72c60f80a-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-b5mdp\" (UID: \"870a7fc7-0aac-45df-857e-dba72c60f80a\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.519881 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/870a7fc7-0aac-45df-857e-dba72c60f80a-logging-loki-ca-bundle\") pod \"logging-loki-distributor-9c6b6d984-b5mdp\" (UID: \"870a7fc7-0aac-45df-857e-dba72c60f80a\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.520112 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/870a7fc7-0aac-45df-857e-dba72c60f80a-config\") pod \"logging-loki-distributor-9c6b6d984-b5mdp\" (UID: \"870a7fc7-0aac-45df-857e-dba72c60f80a\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.530126 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-grpc\" (UniqueName: \"kubernetes.io/secret/870a7fc7-0aac-45df-857e-dba72c60f80a-logging-loki-distributor-grpc\") pod \"logging-loki-distributor-9c6b6d984-b5mdp\" (UID: \"870a7fc7-0aac-45df-857e-dba72c60f80a\") " 
pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.530276 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-distributor-http\" (UniqueName: \"kubernetes.io/secret/870a7fc7-0aac-45df-857e-dba72c60f80a-logging-loki-distributor-http\") pod \"logging-loki-distributor-9c6b6d984-b5mdp\" (UID: \"870a7fc7-0aac-45df-857e-dba72c60f80a\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.558281 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khgpc\" (UniqueName: \"kubernetes.io/projected/870a7fc7-0aac-45df-857e-dba72c60f80a-kube-api-access-khgpc\") pod \"logging-loki-distributor-9c6b6d984-b5mdp\" (UID: \"870a7fc7-0aac-45df-857e-dba72c60f80a\") " pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.569423 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t"] Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.570524 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.573630 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-http" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.573904 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-query-frontend-grpc" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.574408 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g"] Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.595490 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t"] Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.620254 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4d7w\" (UniqueName: \"kubernetes.io/projected/727aa0fc-f2ea-4183-a168-24918669937b-kube-api-access-b4d7w\") pod \"logging-loki-querier-6dcbdf8bb8-zjf9g\" (UID: \"727aa0fc-f2ea-4183-a168-24918669937b\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.620302 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/727aa0fc-f2ea-4183-a168-24918669937b-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-zjf9g\" (UID: \"727aa0fc-f2ea-4183-a168-24918669937b\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.620349 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: 
\"kubernetes.io/secret/727aa0fc-f2ea-4183-a168-24918669937b-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-zjf9g\" (UID: \"727aa0fc-f2ea-4183-a168-24918669937b\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.620374 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/727aa0fc-f2ea-4183-a168-24918669937b-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-zjf9g\" (UID: \"727aa0fc-f2ea-4183-a168-24918669937b\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.620420 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/727aa0fc-f2ea-4183-a168-24918669937b-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-zjf9g\" (UID: \"727aa0fc-f2ea-4183-a168-24918669937b\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.620439 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/727aa0fc-f2ea-4183-a168-24918669937b-config\") pod \"logging-loki-querier-6dcbdf8bb8-zjf9g\" (UID: \"727aa0fc-f2ea-4183-a168-24918669937b\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.650986 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.674062 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9"] Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.675744 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.683790 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-http" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.683882 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.684311 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway-ca-bundle" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.684596 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-client-http" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.684601 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"logging-loki-gateway" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.708655 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9"] Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.722803 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4d7w\" (UniqueName: \"kubernetes.io/projected/727aa0fc-f2ea-4183-a168-24918669937b-kube-api-access-b4d7w\") pod \"logging-loki-querier-6dcbdf8bb8-zjf9g\" (UID: \"727aa0fc-f2ea-4183-a168-24918669937b\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc 
kubenswrapper[4973]: I0320 13:36:34.722855 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/727aa0fc-f2ea-4183-a168-24918669937b-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-zjf9g\" (UID: \"727aa0fc-f2ea-4183-a168-24918669937b\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.722893 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/727aa0fc-f2ea-4183-a168-24918669937b-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-zjf9g\" (UID: \"727aa0fc-f2ea-4183-a168-24918669937b\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.722917 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/727aa0fc-f2ea-4183-a168-24918669937b-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-zjf9g\" (UID: \"727aa0fc-f2ea-4183-a168-24918669937b\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.722955 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d08579c3-9bb3-4b06-a613-0b81a2d7fb44-config\") pod \"logging-loki-query-frontend-ff66c4dc9-ssp8t\" (UID: \"d08579c3-9bb3-4b06-a613-0b81a2d7fb44\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.722977 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/d08579c3-9bb3-4b06-a613-0b81a2d7fb44-logging-loki-query-frontend-grpc\") 
pod \"logging-loki-query-frontend-ff66c4dc9-ssp8t\" (UID: \"d08579c3-9bb3-4b06-a613-0b81a2d7fb44\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.722998 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/727aa0fc-f2ea-4183-a168-24918669937b-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-zjf9g\" (UID: \"727aa0fc-f2ea-4183-a168-24918669937b\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.723014 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/d08579c3-9bb3-4b06-a613-0b81a2d7fb44-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-ssp8t\" (UID: \"d08579c3-9bb3-4b06-a613-0b81a2d7fb44\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.723030 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d08579c3-9bb3-4b06-a613-0b81a2d7fb44-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-ssp8t\" (UID: \"d08579c3-9bb3-4b06-a613-0b81a2d7fb44\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.723049 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/727aa0fc-f2ea-4183-a168-24918669937b-config\") pod \"logging-loki-querier-6dcbdf8bb8-zjf9g\" (UID: \"727aa0fc-f2ea-4183-a168-24918669937b\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 
13:36:34.723097 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brsj8\" (UniqueName: \"kubernetes.io/projected/d08579c3-9bb3-4b06-a613-0b81a2d7fb44-kube-api-access-brsj8\") pod \"logging-loki-query-frontend-ff66c4dc9-ssp8t\" (UID: \"d08579c3-9bb3-4b06-a613-0b81a2d7fb44\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.724298 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/727aa0fc-f2ea-4183-a168-24918669937b-logging-loki-ca-bundle\") pod \"logging-loki-querier-6dcbdf8bb8-zjf9g\" (UID: \"727aa0fc-f2ea-4183-a168-24918669937b\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.724831 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/727aa0fc-f2ea-4183-a168-24918669937b-config\") pod \"logging-loki-querier-6dcbdf8bb8-zjf9g\" (UID: \"727aa0fc-f2ea-4183-a168-24918669937b\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.727331 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-grpc\" (UniqueName: \"kubernetes.io/secret/727aa0fc-f2ea-4183-a168-24918669937b-logging-loki-querier-grpc\") pod \"logging-loki-querier-6dcbdf8bb8-zjf9g\" (UID: \"727aa0fc-f2ea-4183-a168-24918669937b\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.731119 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-querier-http\" (UniqueName: \"kubernetes.io/secret/727aa0fc-f2ea-4183-a168-24918669937b-logging-loki-querier-http\") pod \"logging-loki-querier-6dcbdf8bb8-zjf9g\" (UID: 
\"727aa0fc-f2ea-4183-a168-24918669937b\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.731150 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r"] Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.732265 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.734296 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-gateway-dockercfg-ntjhj" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.739875 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/727aa0fc-f2ea-4183-a168-24918669937b-logging-loki-s3\") pod \"logging-loki-querier-6dcbdf8bb8-zjf9g\" (UID: \"727aa0fc-f2ea-4183-a168-24918669937b\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.741917 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4d7w\" (UniqueName: \"kubernetes.io/projected/727aa0fc-f2ea-4183-a168-24918669937b-kube-api-access-b4d7w\") pod \"logging-loki-querier-6dcbdf8bb8-zjf9g\" (UID: \"727aa0fc-f2ea-4183-a168-24918669937b\") " pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.754225 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r"] Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.824232 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d08579c3-9bb3-4b06-a613-0b81a2d7fb44-logging-loki-ca-bundle\") pod 
\"logging-loki-query-frontend-ff66c4dc9-ssp8t\" (UID: \"d08579c3-9bb3-4b06-a613-0b81a2d7fb44\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.824619 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4e8002e-56ed-40a4-a768-9fd6a44d891c-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.824643 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ff158ae-7281-4d5c-95cf-ff14e136c414-logging-loki-ca-bundle\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.824671 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/9ff158ae-7281-4d5c-95cf-ff14e136c414-tls-secret\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.824696 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brsj8\" (UniqueName: \"kubernetes.io/projected/d08579c3-9bb3-4b06-a613-0b81a2d7fb44-kube-api-access-brsj8\") pod \"logging-loki-query-frontend-ff66c4dc9-ssp8t\" (UID: \"d08579c3-9bb3-4b06-a613-0b81a2d7fb44\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 
13:36:34.824711 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ff158ae-7281-4d5c-95cf-ff14e136c414-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.824736 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/d4e8002e-56ed-40a4-a768-9fd6a44d891c-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.824765 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/9ff158ae-7281-4d5c-95cf-ff14e136c414-lokistack-gateway\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.824786 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/9ff158ae-7281-4d5c-95cf-ff14e136c414-tenants\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.824806 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njjv4\" (UniqueName: 
\"kubernetes.io/projected/9ff158ae-7281-4d5c-95cf-ff14e136c414-kube-api-access-njjv4\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.824835 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/d4e8002e-56ed-40a4-a768-9fd6a44d891c-lokistack-gateway\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.824852 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/d4e8002e-56ed-40a4-a768-9fd6a44d891c-tls-secret\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.824884 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/9ff158ae-7281-4d5c-95cf-ff14e136c414-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.824903 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4e8002e-56ed-40a4-a768-9fd6a44d891c-logging-loki-ca-bundle\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " 
pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.824925 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d08579c3-9bb3-4b06-a613-0b81a2d7fb44-config\") pod \"logging-loki-query-frontend-ff66c4dc9-ssp8t\" (UID: \"d08579c3-9bb3-4b06-a613-0b81a2d7fb44\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.824939 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/d4e8002e-56ed-40a4-a768-9fd6a44d891c-rbac\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.824958 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/d4e8002e-56ed-40a4-a768-9fd6a44d891c-tenants\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.824975 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77fnl\" (UniqueName: \"kubernetes.io/projected/d4e8002e-56ed-40a4-a768-9fd6a44d891c-kube-api-access-77fnl\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.824993 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-grpc\" (UniqueName: 
\"kubernetes.io/secret/d08579c3-9bb3-4b06-a613-0b81a2d7fb44-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-ssp8t\" (UID: \"d08579c3-9bb3-4b06-a613-0b81a2d7fb44\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.825012 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/9ff158ae-7281-4d5c-95cf-ff14e136c414-rbac\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.825030 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/d08579c3-9bb3-4b06-a613-0b81a2d7fb44-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-ssp8t\" (UID: \"d08579c3-9bb3-4b06-a613-0b81a2d7fb44\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.830879 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d08579c3-9bb3-4b06-a613-0b81a2d7fb44-logging-loki-ca-bundle\") pod \"logging-loki-query-frontend-ff66c4dc9-ssp8t\" (UID: \"d08579c3-9bb3-4b06-a613-0b81a2d7fb44\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.830960 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-http\" (UniqueName: \"kubernetes.io/secret/d08579c3-9bb3-4b06-a613-0b81a2d7fb44-logging-loki-query-frontend-http\") pod \"logging-loki-query-frontend-ff66c4dc9-ssp8t\" (UID: \"d08579c3-9bb3-4b06-a613-0b81a2d7fb44\") " 
pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.833776 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d08579c3-9bb3-4b06-a613-0b81a2d7fb44-config\") pod \"logging-loki-query-frontend-ff66c4dc9-ssp8t\" (UID: \"d08579c3-9bb3-4b06-a613-0b81a2d7fb44\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.839688 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-query-frontend-grpc\" (UniqueName: \"kubernetes.io/secret/d08579c3-9bb3-4b06-a613-0b81a2d7fb44-logging-loki-query-frontend-grpc\") pod \"logging-loki-query-frontend-ff66c4dc9-ssp8t\" (UID: \"d08579c3-9bb3-4b06-a613-0b81a2d7fb44\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.848113 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brsj8\" (UniqueName: \"kubernetes.io/projected/d08579c3-9bb3-4b06-a613-0b81a2d7fb44-kube-api-access-brsj8\") pod \"logging-loki-query-frontend-ff66c4dc9-ssp8t\" (UID: \"d08579c3-9bb3-4b06-a613-0b81a2d7fb44\") " pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.858556 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.898967 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.926319 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4e8002e-56ed-40a4-a768-9fd6a44d891c-logging-loki-ca-bundle\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.926385 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/d4e8002e-56ed-40a4-a768-9fd6a44d891c-rbac\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.926404 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/d4e8002e-56ed-40a4-a768-9fd6a44d891c-tenants\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.926425 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77fnl\" (UniqueName: \"kubernetes.io/projected/d4e8002e-56ed-40a4-a768-9fd6a44d891c-kube-api-access-77fnl\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.926444 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/9ff158ae-7281-4d5c-95cf-ff14e136c414-rbac\") pod 
\"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.926466 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ff158ae-7281-4d5c-95cf-ff14e136c414-logging-loki-ca-bundle\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.926482 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4e8002e-56ed-40a4-a768-9fd6a44d891c-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.926510 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/9ff158ae-7281-4d5c-95cf-ff14e136c414-tls-secret\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.926533 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ff158ae-7281-4d5c-95cf-ff14e136c414-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.926553 4973 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/d4e8002e-56ed-40a4-a768-9fd6a44d891c-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.926582 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/9ff158ae-7281-4d5c-95cf-ff14e136c414-lokistack-gateway\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.926602 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/9ff158ae-7281-4d5c-95cf-ff14e136c414-tenants\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.926616 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njjv4\" (UniqueName: \"kubernetes.io/projected/9ff158ae-7281-4d5c-95cf-ff14e136c414-kube-api-access-njjv4\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.926645 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/d4e8002e-56ed-40a4-a768-9fd6a44d891c-lokistack-gateway\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 
crc kubenswrapper[4973]: I0320 13:36:34.926663 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/d4e8002e-56ed-40a4-a768-9fd6a44d891c-tls-secret\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.926694 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/9ff158ae-7281-4d5c-95cf-ff14e136c414-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.929849 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4e8002e-56ed-40a4-a768-9fd6a44d891c-logging-loki-ca-bundle\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.930637 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/d4e8002e-56ed-40a4-a768-9fd6a44d891c-rbac\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.937234 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ff158ae-7281-4d5c-95cf-ff14e136c414-logging-loki-ca-bundle\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " 
pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.937955 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rbac\" (UniqueName: \"kubernetes.io/configmap/9ff158ae-7281-4d5c-95cf-ff14e136c414-rbac\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.938422 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9ff158ae-7281-4d5c-95cf-ff14e136c414-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.939355 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/d4e8002e-56ed-40a4-a768-9fd6a44d891c-lokistack-gateway\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.943439 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: \"kubernetes.io/secret/9ff158ae-7281-4d5c-95cf-ff14e136c414-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.943781 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/9ff158ae-7281-4d5c-95cf-ff14e136c414-tenants\") pod 
\"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.944275 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d4e8002e-56ed-40a4-a768-9fd6a44d891c-logging-loki-gateway-ca-bundle\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.944326 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lokistack-gateway\" (UniqueName: \"kubernetes.io/configmap/9ff158ae-7281-4d5c-95cf-ff14e136c414-lokistack-gateway\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.947936 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/9ff158ae-7281-4d5c-95cf-ff14e136c414-tls-secret\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.949877 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tenants\" (UniqueName: \"kubernetes.io/secret/d4e8002e-56ed-40a4-a768-9fd6a44d891c-tenants\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.956050 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-gateway-client-http\" (UniqueName: 
\"kubernetes.io/secret/d4e8002e-56ed-40a4-a768-9fd6a44d891c-logging-loki-gateway-client-http\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.958976 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-secret\" (UniqueName: \"kubernetes.io/secret/d4e8002e-56ed-40a4-a768-9fd6a44d891c-tls-secret\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.971882 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njjv4\" (UniqueName: \"kubernetes.io/projected/9ff158ae-7281-4d5c-95cf-ff14e136c414-kube-api-access-njjv4\") pod \"logging-loki-gateway-fbc7bc644-ltv8r\" (UID: \"9ff158ae-7281-4d5c-95cf-ff14e136c414\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:34 crc kubenswrapper[4973]: I0320 13:36:34.973155 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77fnl\" (UniqueName: \"kubernetes.io/projected/d4e8002e-56ed-40a4-a768-9fd6a44d891c-kube-api-access-77fnl\") pod \"logging-loki-gateway-fbc7bc644-sw7l9\" (UID: \"d4e8002e-56ed-40a4-a768-9fd6a44d891c\") " pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.067657 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.075447 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.190888 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp"] Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.455762 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g"] Mar 20 13:36:35 crc kubenswrapper[4973]: W0320 13:36:35.459481 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod727aa0fc_f2ea_4183_a168_24918669937b.slice/crio-3b45e2e9d25857cf8c9ca73b6d4cfac4eff998762f90f80a14dcaa8b916c14cf WatchSource:0}: Error finding container 3b45e2e9d25857cf8c9ca73b6d4cfac4eff998762f90f80a14dcaa8b916c14cf: Status 404 returned error can't find the container with id 3b45e2e9d25857cf8c9ca73b6d4cfac4eff998762f90f80a14dcaa8b916c14cf Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.476866 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.477642 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.479778 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-http" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.480884 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-ingester-grpc" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.491130 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.546075 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.546944 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.549289 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/96545de2-ed06-4e4e-9102-37e56cbd6cdb-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.549295 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-http" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.549382 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96545de2-ed06-4e4e-9102-37e56cbd6cdb-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc 
kubenswrapper[4973]: I0320 13:36:35.549409 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ab04496e-b517-41ee-a601-b448031391e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab04496e-b517-41ee-a601-b448031391e5\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.549426 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktqp6\" (UniqueName: \"kubernetes.io/projected/96545de2-ed06-4e4e-9102-37e56cbd6cdb-kube-api-access-ktqp6\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.549455 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-bf5ede3a-a87c-43a2-9ec0-f381b2fbae3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bf5ede3a-a87c-43a2-9ec0-f381b2fbae3b\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.549426 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-compactor-grpc" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.549490 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96545de2-ed06-4e4e-9102-37e56cbd6cdb-config\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.549613 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/96545de2-ed06-4e4e-9102-37e56cbd6cdb-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.549766 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/96545de2-ed06-4e4e-9102-37e56cbd6cdb-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.557085 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 20 13:36:35 crc kubenswrapper[4973]: W0320 13:36:35.587247 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd08579c3_9bb3_4b06_a613_0b81a2d7fb44.slice/crio-1c2113654ca44b9dc2b9344d2fd3bcd073d7a2181c1c211cb1c3377ded555197 WatchSource:0}: Error finding container 1c2113654ca44b9dc2b9344d2fd3bcd073d7a2181c1c211cb1c3377ded555197: Status 404 returned error can't find the container with id 1c2113654ca44b9dc2b9344d2fd3bcd073d7a2181c1c211cb1c3377ded555197 Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.590036 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t"] Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.617920 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.621199 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.634856 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-grpc" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.635139 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"logging-loki-index-gateway-http" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.644448 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.657146 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/fef9a921-6e35-4927-87b4-2741b40a3ab8-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.657187 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-77a0386b-1117-49ee-965c-203bb8cab7f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77a0386b-1117-49ee-965c-203bb8cab7f6\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.657210 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96545de2-ed06-4e4e-9102-37e56cbd6cdb-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.657227 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ktqp6\" (UniqueName: \"kubernetes.io/projected/96545de2-ed06-4e4e-9102-37e56cbd6cdb-kube-api-access-ktqp6\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.657243 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ab04496e-b517-41ee-a601-b448031391e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab04496e-b517-41ee-a601-b448031391e5\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.657267 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fef9a921-6e35-4927-87b4-2741b40a3ab8-config\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.657289 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-bf5ede3a-a87c-43a2-9ec0-f381b2fbae3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bf5ede3a-a87c-43a2-9ec0-f381b2fbae3b\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.657386 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96545de2-ed06-4e4e-9102-37e56cbd6cdb-config\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.657413 4973 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s8ch\" (UniqueName: \"kubernetes.io/projected/fef9a921-6e35-4927-87b4-2741b40a3ab8-kube-api-access-7s8ch\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.657434 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/96545de2-ed06-4e4e-9102-37e56cbd6cdb-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.657462 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/fef9a921-6e35-4927-87b4-2741b40a3ab8-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.657501 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fef9a921-6e35-4927-87b4-2741b40a3ab8-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.657518 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/96545de2-ed06-4e4e-9102-37e56cbd6cdb-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " 
pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.657541 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/fef9a921-6e35-4927-87b4-2741b40a3ab8-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.657565 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/96545de2-ed06-4e4e-9102-37e56cbd6cdb-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.663548 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/96545de2-ed06-4e4e-9102-37e56cbd6cdb-logging-loki-ca-bundle\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.664802 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96545de2-ed06-4e4e-9102-37e56cbd6cdb-config\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.669256 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-grpc\" (UniqueName: \"kubernetes.io/secret/96545de2-ed06-4e4e-9102-37e56cbd6cdb-logging-loki-ingester-grpc\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " 
pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.669256 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ingester-http\" (UniqueName: \"kubernetes.io/secret/96545de2-ed06-4e4e-9102-37e56cbd6cdb-logging-loki-ingester-http\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.669416 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/96545de2-ed06-4e4e-9102-37e56cbd6cdb-logging-loki-s3\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.671413 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.671464 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-bf5ede3a-a87c-43a2-9ec0-f381b2fbae3b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bf5ede3a-a87c-43a2-9ec0-f381b2fbae3b\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ae49e500c360d4e319bfb7ee425c19fecf852e629884e9a50dec678b3bc4c63a/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.672113 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.672419 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ab04496e-b517-41ee-a601-b448031391e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab04496e-b517-41ee-a601-b448031391e5\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/67ea932f5711fcbd22df7971544ac3e94addf2c454f9d8a2e97744f055020e1a/globalmount\"" pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.672465 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9"] Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.688464 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktqp6\" (UniqueName: \"kubernetes.io/projected/96545de2-ed06-4e4e-9102-37e56cbd6cdb-kube-api-access-ktqp6\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.708404 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ab04496e-b517-41ee-a601-b448031391e5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ab04496e-b517-41ee-a601-b448031391e5\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.719257 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r"] Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.719290 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-bf5ede3a-a87c-43a2-9ec0-f381b2fbae3b\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-bf5ede3a-a87c-43a2-9ec0-f381b2fbae3b\") pod \"logging-loki-ingester-0\" (UID: \"96545de2-ed06-4e4e-9102-37e56cbd6cdb\") " pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: W0320 13:36:35.721995 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9ff158ae_7281_4d5c_95cf_ff14e136c414.slice/crio-1dd9ff6e7fd0f65469887fe5d352e04b58d57a697f36b5528c244fe0e3da2678 WatchSource:0}: Error finding container 1dd9ff6e7fd0f65469887fe5d352e04b58d57a697f36b5528c244fe0e3da2678: Status 404 returned error can't find the container with id 1dd9ff6e7fd0f65469887fe5d352e04b58d57a697f36b5528c244fe0e3da2678 Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.749297 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" event={"ID":"d4e8002e-56ed-40a4-a768-9fd6a44d891c","Type":"ContainerStarted","Data":"92f34c1f109102c14db6a30b1a55ddb8ea58c8fb572d20e526fc20635c5af456"} Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.750472 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" event={"ID":"9ff158ae-7281-4d5c-95cf-ff14e136c414","Type":"ContainerStarted","Data":"1dd9ff6e7fd0f65469887fe5d352e04b58d57a697f36b5528c244fe0e3da2678"} Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.751448 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" event={"ID":"870a7fc7-0aac-45df-857e-dba72c60f80a","Type":"ContainerStarted","Data":"29b722d9d72044b0b79f0b82d69884883e5a213103fb4c366534898a8db54f2b"} Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.752480 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" 
event={"ID":"d08579c3-9bb3-4b06-a613-0b81a2d7fb44","Type":"ContainerStarted","Data":"1c2113654ca44b9dc2b9344d2fd3bcd073d7a2181c1c211cb1c3377ded555197"} Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.753385 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" event={"ID":"727aa0fc-f2ea-4183-a168-24918669937b","Type":"ContainerStarted","Data":"3b45e2e9d25857cf8c9ca73b6d4cfac4eff998762f90f80a14dcaa8b916c14cf"} Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.759208 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fef9a921-6e35-4927-87b4-2741b40a3ab8-config\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.759305 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7fa5cfce-66c5-41cb-8963-ccca3ea48fa5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7fa5cfce-66c5-41cb-8963-ccca3ea48fa5\") pod \"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.759389 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28abf398-d023-4625-b8f5-42db7c452df8-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.759413 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: 
\"kubernetes.io/secret/fef9a921-6e35-4927-87b4-2741b40a3ab8-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.759483 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/28abf398-d023-4625-b8f5-42db7c452df8-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.759540 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-77a0386b-1117-49ee-965c-203bb8cab7f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77a0386b-1117-49ee-965c-203bb8cab7f6\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.759567 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/28abf398-d023-4625-b8f5-42db7c452df8-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.759626 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqvj5\" (UniqueName: \"kubernetes.io/projected/28abf398-d023-4625-b8f5-42db7c452df8-kube-api-access-mqvj5\") pod \"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc 
kubenswrapper[4973]: I0320 13:36:35.759651 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s8ch\" (UniqueName: \"kubernetes.io/projected/fef9a921-6e35-4927-87b4-2741b40a3ab8-kube-api-access-7s8ch\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.759675 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/fef9a921-6e35-4927-87b4-2741b40a3ab8-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.759704 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/28abf398-d023-4625-b8f5-42db7c452df8-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.759724 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28abf398-d023-4625-b8f5-42db7c452df8-config\") pod \"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.759757 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fef9a921-6e35-4927-87b4-2741b40a3ab8-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " 
pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.759785 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/fef9a921-6e35-4927-87b4-2741b40a3ab8-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.761111 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fef9a921-6e35-4927-87b4-2741b40a3ab8-logging-loki-ca-bundle\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.761214 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fef9a921-6e35-4927-87b4-2741b40a3ab8-config\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.762054 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.762083 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-77a0386b-1117-49ee-965c-203bb8cab7f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77a0386b-1117-49ee-965c-203bb8cab7f6\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/c3871c32113d45a1e4873db9b0e577e6ce0c922268312f7010ee95be8dd3fedd/globalmount\"" pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.766749 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-grpc\" (UniqueName: \"kubernetes.io/secret/fef9a921-6e35-4927-87b4-2741b40a3ab8-logging-loki-compactor-grpc\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.768921 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-compactor-http\" (UniqueName: \"kubernetes.io/secret/fef9a921-6e35-4927-87b4-2741b40a3ab8-logging-loki-compactor-http\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.768991 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/fef9a921-6e35-4927-87b4-2741b40a3ab8-logging-loki-s3\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.781690 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s8ch\" (UniqueName: 
\"kubernetes.io/projected/fef9a921-6e35-4927-87b4-2741b40a3ab8-kube-api-access-7s8ch\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.787539 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-77a0386b-1117-49ee-965c-203bb8cab7f6\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-77a0386b-1117-49ee-965c-203bb8cab7f6\") pod \"logging-loki-compactor-0\" (UID: \"fef9a921-6e35-4927-87b4-2741b40a3ab8\") " pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.794803 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.864866 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28abf398-d023-4625-b8f5-42db7c452df8-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.864926 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/28abf398-d023-4625-b8f5-42db7c452df8-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.864994 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/28abf398-d023-4625-b8f5-42db7c452df8-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" 
(UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.865176 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqvj5\" (UniqueName: \"kubernetes.io/projected/28abf398-d023-4625-b8f5-42db7c452df8-kube-api-access-mqvj5\") pod \"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.865640 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/28abf398-d023-4625-b8f5-42db7c452df8-logging-loki-s3\") pod \"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.865658 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28abf398-d023-4625-b8f5-42db7c452df8-config\") pod \"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.865811 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7fa5cfce-66c5-41cb-8963-ccca3ea48fa5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7fa5cfce-66c5-41cb-8963-ccca3ea48fa5\") pod \"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.866893 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28abf398-d023-4625-b8f5-42db7c452df8-config\") pod \"logging-loki-index-gateway-0\" 
(UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.867434 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.867456 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7fa5cfce-66c5-41cb-8963-ccca3ea48fa5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7fa5cfce-66c5-41cb-8963-ccca3ea48fa5\") pod \"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/83d2d33faf1d3795ebec9d02b2009decdac1fa7e8e83e396ddc6fc621bb7f96d/globalmount\"" pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.868004 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-grpc\" (UniqueName: \"kubernetes.io/secret/28abf398-d023-4625-b8f5-42db7c452df8-logging-loki-index-gateway-grpc\") pod \"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.868454 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28abf398-d023-4625-b8f5-42db7c452df8-logging-loki-ca-bundle\") pod \"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.869228 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-s3\" (UniqueName: \"kubernetes.io/secret/28abf398-d023-4625-b8f5-42db7c452df8-logging-loki-s3\") pod 
\"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.869399 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-loki-index-gateway-http\" (UniqueName: \"kubernetes.io/secret/28abf398-d023-4625-b8f5-42db7c452df8-logging-loki-index-gateway-http\") pod \"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.891501 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7fa5cfce-66c5-41cb-8963-ccca3ea48fa5\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7fa5cfce-66c5-41cb-8963-ccca3ea48fa5\") pod \"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.892819 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqvj5\" (UniqueName: \"kubernetes.io/projected/28abf398-d023-4625-b8f5-42db7c452df8-kube-api-access-mqvj5\") pod \"logging-loki-index-gateway-0\" (UID: \"28abf398-d023-4625-b8f5-42db7c452df8\") " pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.942933 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:35 crc kubenswrapper[4973]: I0320 13:36:35.955630 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:36 crc kubenswrapper[4973]: I0320 13:36:36.098622 4973 scope.go:117] "RemoveContainer" containerID="bb9c5df0c614ab19b805a339ea8af161e279182509f169ff90df7a81cf3ba345" Mar 20 13:36:36 crc kubenswrapper[4973]: W0320 13:36:36.258529 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod96545de2_ed06_4e4e_9102_37e56cbd6cdb.slice/crio-4024dc903815f5c384608b4edde70e79569d61ea27fd90f586869831eaed7a8b WatchSource:0}: Error finding container 4024dc903815f5c384608b4edde70e79569d61ea27fd90f586869831eaed7a8b: Status 404 returned error can't find the container with id 4024dc903815f5c384608b4edde70e79569d61ea27fd90f586869831eaed7a8b Mar 20 13:36:36 crc kubenswrapper[4973]: I0320 13:36:36.260041 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-ingester-0"] Mar 20 13:36:36 crc kubenswrapper[4973]: I0320 13:36:36.364635 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-index-gateway-0"] Mar 20 13:36:36 crc kubenswrapper[4973]: I0320 13:36:36.426742 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/logging-loki-compactor-0"] Mar 20 13:36:36 crc kubenswrapper[4973]: W0320 13:36:36.430722 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfef9a921_6e35_4927_87b4_2741b40a3ab8.slice/crio-4a58b16126b775cf8a1a56b846e14ad5362ccb51ea04164f86659e3b2242e1e1 WatchSource:0}: Error finding container 4a58b16126b775cf8a1a56b846e14ad5362ccb51ea04164f86659e3b2242e1e1: Status 404 returned error can't find the container with id 4a58b16126b775cf8a1a56b846e14ad5362ccb51ea04164f86659e3b2242e1e1 Mar 20 13:36:36 crc kubenswrapper[4973]: I0320 13:36:36.762663 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-logging/logging-loki-ingester-0" event={"ID":"96545de2-ed06-4e4e-9102-37e56cbd6cdb","Type":"ContainerStarted","Data":"4024dc903815f5c384608b4edde70e79569d61ea27fd90f586869831eaed7a8b"} Mar 20 13:36:36 crc kubenswrapper[4973]: I0320 13:36:36.764682 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"28abf398-d023-4625-b8f5-42db7c452df8","Type":"ContainerStarted","Data":"c190e4dc45dee7ee488184707330fa79c3fa0573427a6675bdd62fce50613fee"} Mar 20 13:36:36 crc kubenswrapper[4973]: I0320 13:36:36.765702 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"fef9a921-6e35-4927-87b4-2741b40a3ab8","Type":"ContainerStarted","Data":"4a58b16126b775cf8a1a56b846e14ad5362ccb51ea04164f86659e3b2242e1e1"} Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.795632 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" event={"ID":"d08579c3-9bb3-4b06-a613-0b81a2d7fb44","Type":"ContainerStarted","Data":"ccfd475905dd797d8e9c4053eade5558cf40684c29add407f53f36acf7e3cea3"} Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.797268 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.799956 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-compactor-0" event={"ID":"fef9a921-6e35-4927-87b4-2741b40a3ab8","Type":"ContainerStarted","Data":"4024373372fb272cef655d4f609640567c624d053abd6d5bd211816d0a388f85"} Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.800153 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.802736 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" event={"ID":"727aa0fc-f2ea-4183-a168-24918669937b","Type":"ContainerStarted","Data":"1215a7756896ce2cec5eac19a09b5c1e82364d1937d65cb616e67f47685577a4"} Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.802876 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.804536 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" event={"ID":"d4e8002e-56ed-40a4-a768-9fd6a44d891c","Type":"ContainerStarted","Data":"31840f65254f4e847c070aa89cf49944f4913f4380ff2eb1a16134f6ca5cff28"} Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.806017 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" event={"ID":"9ff158ae-7281-4d5c-95cf-ff14e136c414","Type":"ContainerStarted","Data":"0c71fd29e943011b8d1ab93ab7038e024b2b5a2a64b79e802b927d61fb69d188"} Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.807463 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-ingester-0" event={"ID":"96545de2-ed06-4e4e-9102-37e56cbd6cdb","Type":"ContainerStarted","Data":"934380265d5b6d5b3db2fa8b3094512ffe7118a89d9f777e83b476d67a8aec1c"} Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.807548 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.809637 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-index-gateway-0" event={"ID":"28abf398-d023-4625-b8f5-42db7c452df8","Type":"ContainerStarted","Data":"d68bb8fc9fd2ea9bf5b09d3ba3bef1411fe1032afcd73edaa76f2281a0ee1a06"} Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.809735 4973 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.811036 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" event={"ID":"870a7fc7-0aac-45df-857e-dba72c60f80a","Type":"ContainerStarted","Data":"6881e4183fa61612f042accc0a3f960cc9cd350c8c383b9f0ad5600a2f382393"} Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.811091 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.823839 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" podStartSLOduration=2.045753226 podStartE2EDuration="6.823819006s" podCreationTimestamp="2026-03-20 13:36:34 +0000 UTC" firstStartedPulling="2026-03-20 13:36:35.592644022 +0000 UTC m=+916.336313766" lastFinishedPulling="2026-03-20 13:36:40.370709802 +0000 UTC m=+921.114379546" observedRunningTime="2026-03-20 13:36:40.820748751 +0000 UTC m=+921.564418495" watchObservedRunningTime="2026-03-20 13:36:40.823819006 +0000 UTC m=+921.567488750" Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.845489 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-compactor-0" podStartSLOduration=2.964578117 podStartE2EDuration="6.8454745s" podCreationTimestamp="2026-03-20 13:36:34 +0000 UTC" firstStartedPulling="2026-03-20 13:36:36.437625956 +0000 UTC m=+917.181295700" lastFinishedPulling="2026-03-20 13:36:40.318522339 +0000 UTC m=+921.062192083" observedRunningTime="2026-03-20 13:36:40.843539175 +0000 UTC m=+921.587208939" watchObservedRunningTime="2026-03-20 13:36:40.8454745 +0000 UTC m=+921.589144244" Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.866939 4973 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-logging/logging-loki-index-gateway-0" podStartSLOduration=2.892303854 podStartE2EDuration="6.866923847s" podCreationTimestamp="2026-03-20 13:36:34 +0000 UTC" firstStartedPulling="2026-03-20 13:36:36.372209195 +0000 UTC m=+917.115878939" lastFinishedPulling="2026-03-20 13:36:40.346829188 +0000 UTC m=+921.090498932" observedRunningTime="2026-03-20 13:36:40.861483765 +0000 UTC m=+921.605153509" watchObservedRunningTime="2026-03-20 13:36:40.866923847 +0000 UTC m=+921.610593591" Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.886864 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" podStartSLOduration=2.093084915 podStartE2EDuration="6.886847401s" podCreationTimestamp="2026-03-20 13:36:34 +0000 UTC" firstStartedPulling="2026-03-20 13:36:35.461666466 +0000 UTC m=+916.205336210" lastFinishedPulling="2026-03-20 13:36:40.255428952 +0000 UTC m=+920.999098696" observedRunningTime="2026-03-20 13:36:40.882725436 +0000 UTC m=+921.626395190" watchObservedRunningTime="2026-03-20 13:36:40.886847401 +0000 UTC m=+921.630517145" Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.903684 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-ingester-0" podStartSLOduration=2.794000296 podStartE2EDuration="6.903665379s" podCreationTimestamp="2026-03-20 13:36:34 +0000 UTC" firstStartedPulling="2026-03-20 13:36:36.260949347 +0000 UTC m=+917.004619091" lastFinishedPulling="2026-03-20 13:36:40.37061443 +0000 UTC m=+921.114284174" observedRunningTime="2026-03-20 13:36:40.900838751 +0000 UTC m=+921.644508505" watchObservedRunningTime="2026-03-20 13:36:40.903665379 +0000 UTC m=+921.647335123" Mar 20 13:36:40 crc kubenswrapper[4973]: I0320 13:36:40.926985 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" 
podStartSLOduration=1.822408118 podStartE2EDuration="6.926966238s" podCreationTimestamp="2026-03-20 13:36:34 +0000 UTC" firstStartedPulling="2026-03-20 13:36:35.213390763 +0000 UTC m=+915.957060507" lastFinishedPulling="2026-03-20 13:36:40.317948883 +0000 UTC m=+921.061618627" observedRunningTime="2026-03-20 13:36:40.921050424 +0000 UTC m=+921.664720178" watchObservedRunningTime="2026-03-20 13:36:40.926966238 +0000 UTC m=+921.670635982" Mar 20 13:36:42 crc kubenswrapper[4973]: I0320 13:36:42.825715 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" event={"ID":"d4e8002e-56ed-40a4-a768-9fd6a44d891c","Type":"ContainerStarted","Data":"51faa7972254c4d122221c850a14988fa9949398d2343aa0f15ab7e7e087fc69"} Mar 20 13:36:42 crc kubenswrapper[4973]: I0320 13:36:42.826038 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:42 crc kubenswrapper[4973]: I0320 13:36:42.826064 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:42 crc kubenswrapper[4973]: I0320 13:36:42.828610 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" event={"ID":"9ff158ae-7281-4d5c-95cf-ff14e136c414","Type":"ContainerStarted","Data":"aa1cfe5e773c88c44fde4a3bf2d2696a86916344f99a1079e8501d358e2aa337"} Mar 20 13:36:42 crc kubenswrapper[4973]: I0320 13:36:42.828903 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:42 crc kubenswrapper[4973]: I0320 13:36:42.838111 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:42 crc kubenswrapper[4973]: I0320 13:36:42.840622 4973 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:42 crc kubenswrapper[4973]: I0320 13:36:42.841440 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" Mar 20 13:36:42 crc kubenswrapper[4973]: I0320 13:36:42.848766 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" podStartSLOduration=2.104326687 podStartE2EDuration="8.848747939s" podCreationTimestamp="2026-03-20 13:36:34 +0000 UTC" firstStartedPulling="2026-03-20 13:36:35.684564551 +0000 UTC m=+916.428234295" lastFinishedPulling="2026-03-20 13:36:42.428985803 +0000 UTC m=+923.172655547" observedRunningTime="2026-03-20 13:36:42.84484522 +0000 UTC m=+923.588514964" watchObservedRunningTime="2026-03-20 13:36:42.848747939 +0000 UTC m=+923.592417683" Mar 20 13:36:42 crc kubenswrapper[4973]: I0320 13:36:42.888794 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" podStartSLOduration=2.179770927 podStartE2EDuration="8.888779833s" podCreationTimestamp="2026-03-20 13:36:34 +0000 UTC" firstStartedPulling="2026-03-20 13:36:35.724291537 +0000 UTC m=+916.467961281" lastFinishedPulling="2026-03-20 13:36:42.433300443 +0000 UTC m=+923.176970187" observedRunningTime="2026-03-20 13:36:42.88687977 +0000 UTC m=+923.630549514" watchObservedRunningTime="2026-03-20 13:36:42.888779833 +0000 UTC m=+923.632449567" Mar 20 13:36:43 crc kubenswrapper[4973]: I0320 13:36:43.838153 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:43 crc kubenswrapper[4973]: I0320 13:36:43.848131 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" Mar 20 13:36:55 crc kubenswrapper[4973]: 
I0320 13:36:55.803187 4973 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Mar 20 13:36:55 crc kubenswrapper[4973]: I0320 13:36:55.803867 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="96545de2-ed06-4e4e-9102-37e56cbd6cdb" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 13:36:55 crc kubenswrapper[4973]: I0320 13:36:55.949563 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-compactor-0" Mar 20 13:36:55 crc kubenswrapper[4973]: I0320 13:36:55.964539 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-index-gateway-0" Mar 20 13:37:04 crc kubenswrapper[4973]: I0320 13:37:04.666000 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" Mar 20 13:37:04 crc kubenswrapper[4973]: I0320 13:37:04.874551 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" Mar 20 13:37:04 crc kubenswrapper[4973]: I0320 13:37:04.909760 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" Mar 20 13:37:05 crc kubenswrapper[4973]: I0320 13:37:05.806663 4973 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: this instance owns no tokens Mar 20 13:37:05 crc kubenswrapper[4973]: I0320 13:37:05.807240 4973 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="96545de2-ed06-4e4e-9102-37e56cbd6cdb" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 13:37:15 crc kubenswrapper[4973]: I0320 13:37:15.800618 4973 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Mar 20 13:37:15 crc kubenswrapper[4973]: I0320 13:37:15.801240 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="96545de2-ed06-4e4e-9102-37e56cbd6cdb" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 13:37:25 crc kubenswrapper[4973]: I0320 13:37:25.798528 4973 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=Ingester not ready: waiting for 15s after being ready Mar 20 13:37:25 crc kubenswrapper[4973]: I0320 13:37:25.799073 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="96545de2-ed06-4e4e-9102-37e56cbd6cdb" containerName="loki-ingester" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 13:37:35 crc kubenswrapper[4973]: I0320 13:37:35.800958 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-logging/logging-loki-ingester-0" Mar 20 13:37:43 crc kubenswrapper[4973]: I0320 13:37:43.320327 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Mar 20 13:37:43 crc kubenswrapper[4973]: I0320 13:37:43.321050 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.111853 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-499c4"] Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.113993 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.123278 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-vdbmb" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.123854 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.124885 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.125788 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-499c4"] Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.138232 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.138700 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.158088 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Mar 20 13:37:53 crc 
kubenswrapper[4973]: I0320 13:37:53.168498 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-499c4"] Mar 20 13:37:53 crc kubenswrapper[4973]: E0320 13:37:53.170216 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-4w9xs metrics sa-token tmp trusted-ca], unattached volumes=[], failed to process volumes=[collector-syslog-receiver collector-token config config-openshift-service-cacrt datadir entrypoint kube-api-access-4w9xs metrics sa-token tmp trusted-ca]: context canceled" pod="openshift-logging/collector-499c4" podUID="dd123f7c-f331-450b-9901-f4498ba7de60" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.294399 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/dd123f7c-f331-450b-9901-f4498ba7de60-datadir\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.294450 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-collector-token\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.294467 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-config\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.294485 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/dd123f7c-f331-450b-9901-f4498ba7de60-sa-token\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.294732 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-entrypoint\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.294811 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w9xs\" (UniqueName: \"kubernetes.io/projected/dd123f7c-f331-450b-9901-f4498ba7de60-kube-api-access-4w9xs\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.294836 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-config-openshift-service-cacrt\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.294886 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-metrics\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.294983 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dd123f7c-f331-450b-9901-f4498ba7de60-tmp\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.295045 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-trusted-ca\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.295072 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-collector-syslog-receiver\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.353249 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.360899 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.396635 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/dd123f7c-f331-450b-9901-f4498ba7de60-datadir\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.396696 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-collector-token\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.396725 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-config\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.396748 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/dd123f7c-f331-450b-9901-f4498ba7de60-sa-token\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.396757 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/dd123f7c-f331-450b-9901-f4498ba7de60-datadir\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.396834 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"entrypoint\" (UniqueName: \"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-entrypoint\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.396894 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w9xs\" (UniqueName: \"kubernetes.io/projected/dd123f7c-f331-450b-9901-f4498ba7de60-kube-api-access-4w9xs\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.396916 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-config-openshift-service-cacrt\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.396935 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-metrics\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.397016 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dd123f7c-f331-450b-9901-f4498ba7de60-tmp\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.397063 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-trusted-ca\") pod \"collector-499c4\" (UID: 
\"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.397086 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-collector-syslog-receiver\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: E0320 13:37:53.397297 4973 secret.go:188] Couldn't get secret openshift-logging/collector-syslog-receiver: secret "collector-syslog-receiver" not found Mar 20 13:37:53 crc kubenswrapper[4973]: E0320 13:37:53.397358 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-collector-syslog-receiver podName:dd123f7c-f331-450b-9901-f4498ba7de60 nodeName:}" failed. No retries permitted until 2026-03-20 13:37:53.897326527 +0000 UTC m=+994.640996271 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "collector-syslog-receiver" (UniqueName: "kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-collector-syslog-receiver") pod "collector-499c4" (UID: "dd123f7c-f331-450b-9901-f4498ba7de60") : secret "collector-syslog-receiver" not found Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.398085 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-config\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.398151 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-entrypoint\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.398912 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-config-openshift-service-cacrt\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.399295 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-trusted-ca\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.406143 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-metrics\") pod \"collector-499c4\" (UID: 
\"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.406963 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dd123f7c-f331-450b-9901-f4498ba7de60-tmp\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.413414 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-collector-token\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.413937 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/dd123f7c-f331-450b-9901-f4498ba7de60-sa-token\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.415644 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w9xs\" (UniqueName: \"kubernetes.io/projected/dd123f7c-f331-450b-9901-f4498ba7de60-kube-api-access-4w9xs\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.498235 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-entrypoint\") pod \"dd123f7c-f331-450b-9901-f4498ba7de60\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.498368 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/dd123f7c-f331-450b-9901-f4498ba7de60-sa-token\") pod \"dd123f7c-f331-450b-9901-f4498ba7de60\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.498432 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-config\") pod \"dd123f7c-f331-450b-9901-f4498ba7de60\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.498468 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-config-openshift-service-cacrt\") pod \"dd123f7c-f331-450b-9901-f4498ba7de60\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.498507 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/dd123f7c-f331-450b-9901-f4498ba7de60-datadir\") pod \"dd123f7c-f331-450b-9901-f4498ba7de60\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.498531 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dd123f7c-f331-450b-9901-f4498ba7de60-tmp\") pod \"dd123f7c-f331-450b-9901-f4498ba7de60\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.498551 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-metrics\") pod \"dd123f7c-f331-450b-9901-f4498ba7de60\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.498632 
4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-trusted-ca\") pod \"dd123f7c-f331-450b-9901-f4498ba7de60\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.498662 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-collector-token\") pod \"dd123f7c-f331-450b-9901-f4498ba7de60\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.498681 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w9xs\" (UniqueName: \"kubernetes.io/projected/dd123f7c-f331-450b-9901-f4498ba7de60-kube-api-access-4w9xs\") pod \"dd123f7c-f331-450b-9901-f4498ba7de60\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.498809 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-entrypoint" (OuterVolumeSpecName: "entrypoint") pod "dd123f7c-f331-450b-9901-f4498ba7de60" (UID: "dd123f7c-f331-450b-9901-f4498ba7de60"). InnerVolumeSpecName "entrypoint". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.499113 4973 reconciler_common.go:293] "Volume detached for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-entrypoint\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.499353 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "dd123f7c-f331-450b-9901-f4498ba7de60" (UID: "dd123f7c-f331-450b-9901-f4498ba7de60"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.499752 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-config-openshift-service-cacrt" (OuterVolumeSpecName: "config-openshift-service-cacrt") pod "dd123f7c-f331-450b-9901-f4498ba7de60" (UID: "dd123f7c-f331-450b-9901-f4498ba7de60"). InnerVolumeSpecName "config-openshift-service-cacrt". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.500074 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-config" (OuterVolumeSpecName: "config") pod "dd123f7c-f331-450b-9901-f4498ba7de60" (UID: "dd123f7c-f331-450b-9901-f4498ba7de60"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.500129 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dd123f7c-f331-450b-9901-f4498ba7de60-datadir" (OuterVolumeSpecName: "datadir") pod "dd123f7c-f331-450b-9901-f4498ba7de60" (UID: "dd123f7c-f331-450b-9901-f4498ba7de60"). 
InnerVolumeSpecName "datadir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.501058 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd123f7c-f331-450b-9901-f4498ba7de60-sa-token" (OuterVolumeSpecName: "sa-token") pod "dd123f7c-f331-450b-9901-f4498ba7de60" (UID: "dd123f7c-f331-450b-9901-f4498ba7de60"). InnerVolumeSpecName "sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.501686 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd123f7c-f331-450b-9901-f4498ba7de60-kube-api-access-4w9xs" (OuterVolumeSpecName: "kube-api-access-4w9xs") pod "dd123f7c-f331-450b-9901-f4498ba7de60" (UID: "dd123f7c-f331-450b-9901-f4498ba7de60"). InnerVolumeSpecName "kube-api-access-4w9xs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.502570 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-collector-token" (OuterVolumeSpecName: "collector-token") pod "dd123f7c-f331-450b-9901-f4498ba7de60" (UID: "dd123f7c-f331-450b-9901-f4498ba7de60"). InnerVolumeSpecName "collector-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.503136 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-metrics" (OuterVolumeSpecName: "metrics") pod "dd123f7c-f331-450b-9901-f4498ba7de60" (UID: "dd123f7c-f331-450b-9901-f4498ba7de60"). InnerVolumeSpecName "metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.505263 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd123f7c-f331-450b-9901-f4498ba7de60-tmp" (OuterVolumeSpecName: "tmp") pod "dd123f7c-f331-450b-9901-f4498ba7de60" (UID: "dd123f7c-f331-450b-9901-f4498ba7de60"). InnerVolumeSpecName "tmp". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.600808 4973 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.600838 4973 reconciler_common.go:293] "Volume detached for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-collector-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.600848 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w9xs\" (UniqueName: \"kubernetes.io/projected/dd123f7c-f331-450b-9901-f4498ba7de60-kube-api-access-4w9xs\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.600857 4973 reconciler_common.go:293] "Volume detached for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/dd123f7c-f331-450b-9901-f4498ba7de60-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.600865 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.600874 4973 reconciler_common.go:293] "Volume detached for volume \"config-openshift-service-cacrt\" (UniqueName: 
\"kubernetes.io/configmap/dd123f7c-f331-450b-9901-f4498ba7de60-config-openshift-service-cacrt\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.600883 4973 reconciler_common.go:293] "Volume detached for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/dd123f7c-f331-450b-9901-f4498ba7de60-datadir\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.600892 4973 reconciler_common.go:293] "Volume detached for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.600900 4973 reconciler_common.go:293] "Volume detached for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/dd123f7c-f331-450b-9901-f4498ba7de60-tmp\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.905142 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-collector-syslog-receiver\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:53 crc kubenswrapper[4973]: I0320 13:37:53.908454 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-collector-syslog-receiver\") pod \"collector-499c4\" (UID: \"dd123f7c-f331-450b-9901-f4498ba7de60\") " pod="openshift-logging/collector-499c4" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.006442 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-collector-syslog-receiver\") pod \"dd123f7c-f331-450b-9901-f4498ba7de60\" (UID: 
\"dd123f7c-f331-450b-9901-f4498ba7de60\") " Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.011273 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-collector-syslog-receiver" (OuterVolumeSpecName: "collector-syslog-receiver") pod "dd123f7c-f331-450b-9901-f4498ba7de60" (UID: "dd123f7c-f331-450b-9901-f4498ba7de60"). InnerVolumeSpecName "collector-syslog-receiver". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.108623 4973 reconciler_common.go:293] "Volume detached for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/dd123f7c-f331-450b-9901-f4498ba7de60-collector-syslog-receiver\") on node \"crc\" DevicePath \"\"" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.361496 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-logging/collector-499c4" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.437788 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-logging/collector-499c4"] Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.449291 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-logging/collector-499c4"] Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.459161 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-logging/collector-45jbl"] Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.461218 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.465222 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-syslog-receiver" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.465693 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-token" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.465968 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-config" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.466094 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-metrics" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.466261 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-logging"/"collector-dockercfg-vdbmb" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.474421 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-45jbl"] Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.475201 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-logging"/"collector-trustbundle" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.618303 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/03b9e8f1-4452-4677-991f-64048197690b-tmp\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.618386 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/03b9e8f1-4452-4677-991f-64048197690b-collector-token\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " 
pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.618418 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpznn\" (UniqueName: \"kubernetes.io/projected/03b9e8f1-4452-4677-991f-64048197690b-kube-api-access-cpznn\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.618507 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/03b9e8f1-4452-4677-991f-64048197690b-sa-token\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.618528 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/03b9e8f1-4452-4677-991f-64048197690b-config-openshift-service-cacrt\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.618548 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/03b9e8f1-4452-4677-991f-64048197690b-entrypoint\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.618721 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/03b9e8f1-4452-4677-991f-64048197690b-collector-syslog-receiver\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " 
pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.618927 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03b9e8f1-4452-4677-991f-64048197690b-trusted-ca\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.619042 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b9e8f1-4452-4677-991f-64048197690b-config\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.619130 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/03b9e8f1-4452-4677-991f-64048197690b-metrics\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.619175 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/03b9e8f1-4452-4677-991f-64048197690b-datadir\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.721355 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b9e8f1-4452-4677-991f-64048197690b-config\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.721657 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/03b9e8f1-4452-4677-991f-64048197690b-metrics\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.721753 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/03b9e8f1-4452-4677-991f-64048197690b-datadir\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.721849 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/03b9e8f1-4452-4677-991f-64048197690b-tmp\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.721917 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"datadir\" (UniqueName: \"kubernetes.io/host-path/03b9e8f1-4452-4677-991f-64048197690b-datadir\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.721932 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-token\" (UniqueName: \"kubernetes.io/secret/03b9e8f1-4452-4677-991f-64048197690b-collector-token\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.722204 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpznn\" (UniqueName: \"kubernetes.io/projected/03b9e8f1-4452-4677-991f-64048197690b-kube-api-access-cpznn\") pod \"collector-45jbl\" (UID: 
\"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.722506 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/03b9e8f1-4452-4677-991f-64048197690b-sa-token\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.722652 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/03b9e8f1-4452-4677-991f-64048197690b-config-openshift-service-cacrt\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.722720 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/03b9e8f1-4452-4677-991f-64048197690b-entrypoint\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.722759 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/03b9e8f1-4452-4677-991f-64048197690b-collector-syslog-receiver\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.722875 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03b9e8f1-4452-4677-991f-64048197690b-trusted-ca\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc 
kubenswrapper[4973]: I0320 13:37:54.723221 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/03b9e8f1-4452-4677-991f-64048197690b-config\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.723937 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"entrypoint\" (UniqueName: \"kubernetes.io/configmap/03b9e8f1-4452-4677-991f-64048197690b-entrypoint\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.724567 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/03b9e8f1-4452-4677-991f-64048197690b-trusted-ca\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.724681 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-openshift-service-cacrt\" (UniqueName: \"kubernetes.io/configmap/03b9e8f1-4452-4677-991f-64048197690b-config-openshift-service-cacrt\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.727050 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-syslog-receiver\" (UniqueName: \"kubernetes.io/secret/03b9e8f1-4452-4677-991f-64048197690b-collector-syslog-receiver\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.727643 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"collector-token\" (UniqueName: 
\"kubernetes.io/secret/03b9e8f1-4452-4677-991f-64048197690b-collector-token\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.733663 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/secret/03b9e8f1-4452-4677-991f-64048197690b-metrics\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.735555 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/03b9e8f1-4452-4677-991f-64048197690b-tmp\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.740656 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpznn\" (UniqueName: \"kubernetes.io/projected/03b9e8f1-4452-4677-991f-64048197690b-kube-api-access-cpznn\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.756959 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sa-token\" (UniqueName: \"kubernetes.io/projected/03b9e8f1-4452-4677-991f-64048197690b-sa-token\") pod \"collector-45jbl\" (UID: \"03b9e8f1-4452-4677-991f-64048197690b\") " pod="openshift-logging/collector-45jbl" Mar 20 13:37:54 crc kubenswrapper[4973]: I0320 13:37:54.780779 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-logging/collector-45jbl" Mar 20 13:37:55 crc kubenswrapper[4973]: I0320 13:37:55.235722 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-logging/collector-45jbl"] Mar 20 13:37:55 crc kubenswrapper[4973]: I0320 13:37:55.368042 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-45jbl" event={"ID":"03b9e8f1-4452-4677-991f-64048197690b","Type":"ContainerStarted","Data":"951c475d04ba027c57a66e45813c2438c03b828da5afee9d824a188721d2f977"} Mar 20 13:37:55 crc kubenswrapper[4973]: I0320 13:37:55.967746 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd123f7c-f331-450b-9901-f4498ba7de60" path="/var/lib/kubelet/pods/dd123f7c-f331-450b-9901-f4498ba7de60/volumes" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.140987 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566898-tm499"] Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.145219 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-tm499" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.147656 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.148269 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.148865 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-tm499"] Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.150417 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.336027 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz9c7\" (UniqueName: \"kubernetes.io/projected/7019dcc1-49da-40e5-ae40-80f09d83984d-kube-api-access-xz9c7\") pod \"auto-csr-approver-29566898-tm499\" (UID: \"7019dcc1-49da-40e5-ae40-80f09d83984d\") " pod="openshift-infra/auto-csr-approver-29566898-tm499" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.437839 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xz9c7\" (UniqueName: \"kubernetes.io/projected/7019dcc1-49da-40e5-ae40-80f09d83984d-kube-api-access-xz9c7\") pod \"auto-csr-approver-29566898-tm499\" (UID: \"7019dcc1-49da-40e5-ae40-80f09d83984d\") " pod="openshift-infra/auto-csr-approver-29566898-tm499" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.455985 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz9c7\" (UniqueName: \"kubernetes.io/projected/7019dcc1-49da-40e5-ae40-80f09d83984d-kube-api-access-xz9c7\") pod \"auto-csr-approver-29566898-tm499\" (UID: \"7019dcc1-49da-40e5-ae40-80f09d83984d\") " 
pod="openshift-infra/auto-csr-approver-29566898-tm499" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.471842 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-tm499" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.586297 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cvzcj"] Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.587677 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cvzcj" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.597152 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cvzcj"] Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.741799 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd7c44c9-f93f-4620-80a8-ca0c05c0629d-catalog-content\") pod \"certified-operators-cvzcj\" (UID: \"cd7c44c9-f93f-4620-80a8-ca0c05c0629d\") " pod="openshift-marketplace/certified-operators-cvzcj" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.742510 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd7c44c9-f93f-4620-80a8-ca0c05c0629d-utilities\") pod \"certified-operators-cvzcj\" (UID: \"cd7c44c9-f93f-4620-80a8-ca0c05c0629d\") " pod="openshift-marketplace/certified-operators-cvzcj" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.742567 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vb2h\" (UniqueName: \"kubernetes.io/projected/cd7c44c9-f93f-4620-80a8-ca0c05c0629d-kube-api-access-9vb2h\") pod \"certified-operators-cvzcj\" (UID: \"cd7c44c9-f93f-4620-80a8-ca0c05c0629d\") " 
pod="openshift-marketplace/certified-operators-cvzcj" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.844539 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd7c44c9-f93f-4620-80a8-ca0c05c0629d-catalog-content\") pod \"certified-operators-cvzcj\" (UID: \"cd7c44c9-f93f-4620-80a8-ca0c05c0629d\") " pod="openshift-marketplace/certified-operators-cvzcj" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.844902 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd7c44c9-f93f-4620-80a8-ca0c05c0629d-utilities\") pod \"certified-operators-cvzcj\" (UID: \"cd7c44c9-f93f-4620-80a8-ca0c05c0629d\") " pod="openshift-marketplace/certified-operators-cvzcj" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.844924 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vb2h\" (UniqueName: \"kubernetes.io/projected/cd7c44c9-f93f-4620-80a8-ca0c05c0629d-kube-api-access-9vb2h\") pod \"certified-operators-cvzcj\" (UID: \"cd7c44c9-f93f-4620-80a8-ca0c05c0629d\") " pod="openshift-marketplace/certified-operators-cvzcj" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.845056 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd7c44c9-f93f-4620-80a8-ca0c05c0629d-catalog-content\") pod \"certified-operators-cvzcj\" (UID: \"cd7c44c9-f93f-4620-80a8-ca0c05c0629d\") " pod="openshift-marketplace/certified-operators-cvzcj" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.845362 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd7c44c9-f93f-4620-80a8-ca0c05c0629d-utilities\") pod \"certified-operators-cvzcj\" (UID: \"cd7c44c9-f93f-4620-80a8-ca0c05c0629d\") " 
pod="openshift-marketplace/certified-operators-cvzcj" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.865136 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vb2h\" (UniqueName: \"kubernetes.io/projected/cd7c44c9-f93f-4620-80a8-ca0c05c0629d-kube-api-access-9vb2h\") pod \"certified-operators-cvzcj\" (UID: \"cd7c44c9-f93f-4620-80a8-ca0c05c0629d\") " pod="openshift-marketplace/certified-operators-cvzcj" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.934521 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cvzcj" Mar 20 13:38:00 crc kubenswrapper[4973]: I0320 13:38:00.963302 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-tm499"] Mar 20 13:38:01 crc kubenswrapper[4973]: I0320 13:38:01.263928 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cvzcj"] Mar 20 13:38:01 crc kubenswrapper[4973]: W0320 13:38:01.268350 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd7c44c9_f93f_4620_80a8_ca0c05c0629d.slice/crio-5c728c30e56a9ad1f781ab943c59f2a7063e4af79657eb1a40e424952d295c49 WatchSource:0}: Error finding container 5c728c30e56a9ad1f781ab943c59f2a7063e4af79657eb1a40e424952d295c49: Status 404 returned error can't find the container with id 5c728c30e56a9ad1f781ab943c59f2a7063e4af79657eb1a40e424952d295c49 Mar 20 13:38:01 crc kubenswrapper[4973]: I0320 13:38:01.413977 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-tm499" event={"ID":"7019dcc1-49da-40e5-ae40-80f09d83984d","Type":"ContainerStarted","Data":"872cc4e8b65532f3a84cfcda11eb2f84609a64ec9f72d36f9dbce154371f0aa7"} Mar 20 13:38:01 crc kubenswrapper[4973]: I0320 13:38:01.415175 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-cvzcj" event={"ID":"cd7c44c9-f93f-4620-80a8-ca0c05c0629d","Type":"ContainerStarted","Data":"e7b46243f3a03ef2cad834c13abd96ae5aab377e51026116c02bec096ddaaa8b"} Mar 20 13:38:01 crc kubenswrapper[4973]: I0320 13:38:01.415204 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvzcj" event={"ID":"cd7c44c9-f93f-4620-80a8-ca0c05c0629d","Type":"ContainerStarted","Data":"5c728c30e56a9ad1f781ab943c59f2a7063e4af79657eb1a40e424952d295c49"} Mar 20 13:38:02 crc kubenswrapper[4973]: I0320 13:38:02.426806 4973 generic.go:334] "Generic (PLEG): container finished" podID="cd7c44c9-f93f-4620-80a8-ca0c05c0629d" containerID="e7b46243f3a03ef2cad834c13abd96ae5aab377e51026116c02bec096ddaaa8b" exitCode=0 Mar 20 13:38:02 crc kubenswrapper[4973]: I0320 13:38:02.427173 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvzcj" event={"ID":"cd7c44c9-f93f-4620-80a8-ca0c05c0629d","Type":"ContainerDied","Data":"e7b46243f3a03ef2cad834c13abd96ae5aab377e51026116c02bec096ddaaa8b"} Mar 20 13:38:03 crc kubenswrapper[4973]: I0320 13:38:03.436872 4973 generic.go:334] "Generic (PLEG): container finished" podID="7019dcc1-49da-40e5-ae40-80f09d83984d" containerID="096f3595763b234e154ad2f6f4f5fa489d0bc31073cd4df710239496060c67e4" exitCode=0 Mar 20 13:38:03 crc kubenswrapper[4973]: I0320 13:38:03.436929 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-tm499" event={"ID":"7019dcc1-49da-40e5-ae40-80f09d83984d","Type":"ContainerDied","Data":"096f3595763b234e154ad2f6f4f5fa489d0bc31073cd4df710239496060c67e4"} Mar 20 13:38:04 crc kubenswrapper[4973]: I0320 13:38:04.444326 4973 generic.go:334] "Generic (PLEG): container finished" podID="cd7c44c9-f93f-4620-80a8-ca0c05c0629d" containerID="292b985b0718dd112cdee5a91e0302a5f3b54eb1b7d39f73a4e5acdd68bd9978" exitCode=0 Mar 20 13:38:04 crc kubenswrapper[4973]: I0320 
13:38:04.444435 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvzcj" event={"ID":"cd7c44c9-f93f-4620-80a8-ca0c05c0629d","Type":"ContainerDied","Data":"292b985b0718dd112cdee5a91e0302a5f3b54eb1b7d39f73a4e5acdd68bd9978"} Mar 20 13:38:04 crc kubenswrapper[4973]: I0320 13:38:04.759773 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-tm499" Mar 20 13:38:04 crc kubenswrapper[4973]: I0320 13:38:04.842934 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xz9c7\" (UniqueName: \"kubernetes.io/projected/7019dcc1-49da-40e5-ae40-80f09d83984d-kube-api-access-xz9c7\") pod \"7019dcc1-49da-40e5-ae40-80f09d83984d\" (UID: \"7019dcc1-49da-40e5-ae40-80f09d83984d\") " Mar 20 13:38:04 crc kubenswrapper[4973]: I0320 13:38:04.858255 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7019dcc1-49da-40e5-ae40-80f09d83984d-kube-api-access-xz9c7" (OuterVolumeSpecName: "kube-api-access-xz9c7") pod "7019dcc1-49da-40e5-ae40-80f09d83984d" (UID: "7019dcc1-49da-40e5-ae40-80f09d83984d"). InnerVolumeSpecName "kube-api-access-xz9c7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:38:04 crc kubenswrapper[4973]: I0320 13:38:04.944078 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xz9c7\" (UniqueName: \"kubernetes.io/projected/7019dcc1-49da-40e5-ae40-80f09d83984d-kube-api-access-xz9c7\") on node \"crc\" DevicePath \"\"" Mar 20 13:38:05 crc kubenswrapper[4973]: I0320 13:38:05.453175 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvzcj" event={"ID":"cd7c44c9-f93f-4620-80a8-ca0c05c0629d","Type":"ContainerStarted","Data":"8e63db6bc6a18f64e879a25407fe1d2b399779109ae54b6696cd7bc8672a99ad"} Mar 20 13:38:05 crc kubenswrapper[4973]: I0320 13:38:05.454437 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566898-tm499" event={"ID":"7019dcc1-49da-40e5-ae40-80f09d83984d","Type":"ContainerDied","Data":"872cc4e8b65532f3a84cfcda11eb2f84609a64ec9f72d36f9dbce154371f0aa7"} Mar 20 13:38:05 crc kubenswrapper[4973]: I0320 13:38:05.454798 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="872cc4e8b65532f3a84cfcda11eb2f84609a64ec9f72d36f9dbce154371f0aa7" Mar 20 13:38:05 crc kubenswrapper[4973]: I0320 13:38:05.454468 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566898-tm499"
Mar 20 13:38:05 crc kubenswrapper[4973]: I0320 13:38:05.476145 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cvzcj" podStartSLOduration=2.97098128 podStartE2EDuration="5.476123113s" podCreationTimestamp="2026-03-20 13:38:00 +0000 UTC" firstStartedPulling="2026-03-20 13:38:02.428819549 +0000 UTC m=+1003.172489303" lastFinishedPulling="2026-03-20 13:38:04.933961392 +0000 UTC m=+1005.677631136" observedRunningTime="2026-03-20 13:38:05.475722973 +0000 UTC m=+1006.219392717" watchObservedRunningTime="2026-03-20 13:38:05.476123113 +0000 UTC m=+1006.219792867"
Mar 20 13:38:05 crc kubenswrapper[4973]: I0320 13:38:05.830849 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-xxx6r"]
Mar 20 13:38:05 crc kubenswrapper[4973]: I0320 13:38:05.838059 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566892-xxx6r"]
Mar 20 13:38:05 crc kubenswrapper[4973]: I0320 13:38:05.958539 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51178de5-3e29-40cf-b4c4-05dcbed6ce8c" path="/var/lib/kubelet/pods/51178de5-3e29-40cf-b4c4-05dcbed6ce8c/volumes"
Mar 20 13:38:10 crc kubenswrapper[4973]: I0320 13:38:10.935090 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cvzcj"
Mar 20 13:38:10 crc kubenswrapper[4973]: I0320 13:38:10.935919 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cvzcj"
Mar 20 13:38:10 crc kubenswrapper[4973]: I0320 13:38:10.998352 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cvzcj"
Mar 20 13:38:11 crc kubenswrapper[4973]: I0320 13:38:11.575955 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cvzcj"
Mar 20 13:38:11 crc kubenswrapper[4973]: I0320 13:38:11.618800 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cvzcj"]
Mar 20 13:38:13 crc kubenswrapper[4973]: I0320 13:38:13.320481 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:38:13 crc kubenswrapper[4973]: I0320 13:38:13.320767 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:38:13 crc kubenswrapper[4973]: I0320 13:38:13.520581 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-cvzcj" podUID="cd7c44c9-f93f-4620-80a8-ca0c05c0629d" containerName="registry-server" containerID="cri-o://8e63db6bc6a18f64e879a25407fe1d2b399779109ae54b6696cd7bc8672a99ad" gracePeriod=2
Mar 20 13:38:14 crc kubenswrapper[4973]: I0320 13:38:14.540006 4973 generic.go:334] "Generic (PLEG): container finished" podID="cd7c44c9-f93f-4620-80a8-ca0c05c0629d" containerID="8e63db6bc6a18f64e879a25407fe1d2b399779109ae54b6696cd7bc8672a99ad" exitCode=0
Mar 20 13:38:14 crc kubenswrapper[4973]: I0320 13:38:14.540096 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvzcj" event={"ID":"cd7c44c9-f93f-4620-80a8-ca0c05c0629d","Type":"ContainerDied","Data":"8e63db6bc6a18f64e879a25407fe1d2b399779109ae54b6696cd7bc8672a99ad"}
Mar 20 13:38:14 crc kubenswrapper[4973]: I0320 13:38:14.723771 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cvzcj"
Mar 20 13:38:14 crc kubenswrapper[4973]: I0320 13:38:14.914561 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd7c44c9-f93f-4620-80a8-ca0c05c0629d-utilities\") pod \"cd7c44c9-f93f-4620-80a8-ca0c05c0629d\" (UID: \"cd7c44c9-f93f-4620-80a8-ca0c05c0629d\") "
Mar 20 13:38:14 crc kubenswrapper[4973]: I0320 13:38:14.914842 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vb2h\" (UniqueName: \"kubernetes.io/projected/cd7c44c9-f93f-4620-80a8-ca0c05c0629d-kube-api-access-9vb2h\") pod \"cd7c44c9-f93f-4620-80a8-ca0c05c0629d\" (UID: \"cd7c44c9-f93f-4620-80a8-ca0c05c0629d\") "
Mar 20 13:38:14 crc kubenswrapper[4973]: I0320 13:38:14.914963 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd7c44c9-f93f-4620-80a8-ca0c05c0629d-catalog-content\") pod \"cd7c44c9-f93f-4620-80a8-ca0c05c0629d\" (UID: \"cd7c44c9-f93f-4620-80a8-ca0c05c0629d\") "
Mar 20 13:38:14 crc kubenswrapper[4973]: I0320 13:38:14.916177 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd7c44c9-f93f-4620-80a8-ca0c05c0629d-utilities" (OuterVolumeSpecName: "utilities") pod "cd7c44c9-f93f-4620-80a8-ca0c05c0629d" (UID: "cd7c44c9-f93f-4620-80a8-ca0c05c0629d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:38:14 crc kubenswrapper[4973]: I0320 13:38:14.919560 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd7c44c9-f93f-4620-80a8-ca0c05c0629d-kube-api-access-9vb2h" (OuterVolumeSpecName: "kube-api-access-9vb2h") pod "cd7c44c9-f93f-4620-80a8-ca0c05c0629d" (UID: "cd7c44c9-f93f-4620-80a8-ca0c05c0629d"). InnerVolumeSpecName "kube-api-access-9vb2h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:38:15 crc kubenswrapper[4973]: I0320 13:38:15.017612 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd7c44c9-f93f-4620-80a8-ca0c05c0629d-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:38:15 crc kubenswrapper[4973]: I0320 13:38:15.017904 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vb2h\" (UniqueName: \"kubernetes.io/projected/cd7c44c9-f93f-4620-80a8-ca0c05c0629d-kube-api-access-9vb2h\") on node \"crc\" DevicePath \"\""
Mar 20 13:38:15 crc kubenswrapper[4973]: I0320 13:38:15.145864 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd7c44c9-f93f-4620-80a8-ca0c05c0629d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd7c44c9-f93f-4620-80a8-ca0c05c0629d" (UID: "cd7c44c9-f93f-4620-80a8-ca0c05c0629d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:38:15 crc kubenswrapper[4973]: I0320 13:38:15.220894 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd7c44c9-f93f-4620-80a8-ca0c05c0629d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:38:15 crc kubenswrapper[4973]: I0320 13:38:15.549006 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-logging/collector-45jbl" event={"ID":"03b9e8f1-4452-4677-991f-64048197690b","Type":"ContainerStarted","Data":"4038f10781771ce8d0905feb8e15630c6d32430df743ca32f4d1931971333766"}
Mar 20 13:38:15 crc kubenswrapper[4973]: I0320 13:38:15.551651 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cvzcj" event={"ID":"cd7c44c9-f93f-4620-80a8-ca0c05c0629d","Type":"ContainerDied","Data":"5c728c30e56a9ad1f781ab943c59f2a7063e4af79657eb1a40e424952d295c49"}
Mar 20 13:38:15 crc kubenswrapper[4973]: I0320 13:38:15.551698 4973 scope.go:117] "RemoveContainer" containerID="8e63db6bc6a18f64e879a25407fe1d2b399779109ae54b6696cd7bc8672a99ad"
Mar 20 13:38:15 crc kubenswrapper[4973]: I0320 13:38:15.551817 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cvzcj"
Mar 20 13:38:15 crc kubenswrapper[4973]: I0320 13:38:15.583331 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-logging/collector-45jbl" podStartSLOduration=2.314559849 podStartE2EDuration="21.583311773s" podCreationTimestamp="2026-03-20 13:37:54 +0000 UTC" firstStartedPulling="2026-03-20 13:37:55.242014552 +0000 UTC m=+995.985684296" lastFinishedPulling="2026-03-20 13:38:14.510766476 +0000 UTC m=+1015.254436220" observedRunningTime="2026-03-20 13:38:15.577768244 +0000 UTC m=+1016.321438008" watchObservedRunningTime="2026-03-20 13:38:15.583311773 +0000 UTC m=+1016.326981517"
Mar 20 13:38:15 crc kubenswrapper[4973]: I0320 13:38:15.583763 4973 scope.go:117] "RemoveContainer" containerID="292b985b0718dd112cdee5a91e0302a5f3b54eb1b7d39f73a4e5acdd68bd9978"
Mar 20 13:38:15 crc kubenswrapper[4973]: I0320 13:38:15.596666 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-cvzcj"]
Mar 20 13:38:15 crc kubenswrapper[4973]: I0320 13:38:15.606157 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-cvzcj"]
Mar 20 13:38:15 crc kubenswrapper[4973]: I0320 13:38:15.654638 4973 scope.go:117] "RemoveContainer" containerID="e7b46243f3a03ef2cad834c13abd96ae5aab377e51026116c02bec096ddaaa8b"
Mar 20 13:38:15 crc kubenswrapper[4973]: I0320 13:38:15.961915 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd7c44c9-f93f-4620-80a8-ca0c05c0629d" path="/var/lib/kubelet/pods/cd7c44c9-f93f-4620-80a8-ca0c05c0629d/volumes"
Mar 20 13:38:16 crc kubenswrapper[4973]: I0320 13:38:16.965082 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hprnw"]
Mar 20 13:38:16 crc kubenswrapper[4973]: E0320 13:38:16.965383 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd7c44c9-f93f-4620-80a8-ca0c05c0629d" containerName="extract-content"
Mar 20 13:38:16 crc kubenswrapper[4973]: I0320 13:38:16.965397 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd7c44c9-f93f-4620-80a8-ca0c05c0629d" containerName="extract-content"
Mar 20 13:38:16 crc kubenswrapper[4973]: E0320 13:38:16.965407 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7019dcc1-49da-40e5-ae40-80f09d83984d" containerName="oc"
Mar 20 13:38:16 crc kubenswrapper[4973]: I0320 13:38:16.965413 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="7019dcc1-49da-40e5-ae40-80f09d83984d" containerName="oc"
Mar 20 13:38:16 crc kubenswrapper[4973]: E0320 13:38:16.965429 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd7c44c9-f93f-4620-80a8-ca0c05c0629d" containerName="registry-server"
Mar 20 13:38:16 crc kubenswrapper[4973]: I0320 13:38:16.965436 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd7c44c9-f93f-4620-80a8-ca0c05c0629d" containerName="registry-server"
Mar 20 13:38:16 crc kubenswrapper[4973]: E0320 13:38:16.965445 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd7c44c9-f93f-4620-80a8-ca0c05c0629d" containerName="extract-utilities"
Mar 20 13:38:16 crc kubenswrapper[4973]: I0320 13:38:16.965452 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd7c44c9-f93f-4620-80a8-ca0c05c0629d" containerName="extract-utilities"
Mar 20 13:38:16 crc kubenswrapper[4973]: I0320 13:38:16.965568 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd7c44c9-f93f-4620-80a8-ca0c05c0629d" containerName="registry-server"
Mar 20 13:38:16 crc kubenswrapper[4973]: I0320 13:38:16.965584 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="7019dcc1-49da-40e5-ae40-80f09d83984d" containerName="oc"
Mar 20 13:38:16 crc kubenswrapper[4973]: I0320 13:38:16.968261 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hprnw"
Mar 20 13:38:16 crc kubenswrapper[4973]: I0320 13:38:16.979590 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hprnw"]
Mar 20 13:38:17 crc kubenswrapper[4973]: I0320 13:38:17.161942 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/554d90cf-d67f-4b4c-83e9-26a38c3cd840-catalog-content\") pod \"redhat-marketplace-hprnw\" (UID: \"554d90cf-d67f-4b4c-83e9-26a38c3cd840\") " pod="openshift-marketplace/redhat-marketplace-hprnw"
Mar 20 13:38:17 crc kubenswrapper[4973]: I0320 13:38:17.162737 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/554d90cf-d67f-4b4c-83e9-26a38c3cd840-utilities\") pod \"redhat-marketplace-hprnw\" (UID: \"554d90cf-d67f-4b4c-83e9-26a38c3cd840\") " pod="openshift-marketplace/redhat-marketplace-hprnw"
Mar 20 13:38:17 crc kubenswrapper[4973]: I0320 13:38:17.162852 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp5s8\" (UniqueName: \"kubernetes.io/projected/554d90cf-d67f-4b4c-83e9-26a38c3cd840-kube-api-access-jp5s8\") pod \"redhat-marketplace-hprnw\" (UID: \"554d90cf-d67f-4b4c-83e9-26a38c3cd840\") " pod="openshift-marketplace/redhat-marketplace-hprnw"
Mar 20 13:38:17 crc kubenswrapper[4973]: I0320 13:38:17.264225 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/554d90cf-d67f-4b4c-83e9-26a38c3cd840-catalog-content\") pod \"redhat-marketplace-hprnw\" (UID: \"554d90cf-d67f-4b4c-83e9-26a38c3cd840\") " pod="openshift-marketplace/redhat-marketplace-hprnw"
Mar 20 13:38:17 crc kubenswrapper[4973]: I0320 13:38:17.264361 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/554d90cf-d67f-4b4c-83e9-26a38c3cd840-utilities\") pod \"redhat-marketplace-hprnw\" (UID: \"554d90cf-d67f-4b4c-83e9-26a38c3cd840\") " pod="openshift-marketplace/redhat-marketplace-hprnw"
Mar 20 13:38:17 crc kubenswrapper[4973]: I0320 13:38:17.264429 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp5s8\" (UniqueName: \"kubernetes.io/projected/554d90cf-d67f-4b4c-83e9-26a38c3cd840-kube-api-access-jp5s8\") pod \"redhat-marketplace-hprnw\" (UID: \"554d90cf-d67f-4b4c-83e9-26a38c3cd840\") " pod="openshift-marketplace/redhat-marketplace-hprnw"
Mar 20 13:38:17 crc kubenswrapper[4973]: I0320 13:38:17.264720 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/554d90cf-d67f-4b4c-83e9-26a38c3cd840-catalog-content\") pod \"redhat-marketplace-hprnw\" (UID: \"554d90cf-d67f-4b4c-83e9-26a38c3cd840\") " pod="openshift-marketplace/redhat-marketplace-hprnw"
Mar 20 13:38:17 crc kubenswrapper[4973]: I0320 13:38:17.264834 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/554d90cf-d67f-4b4c-83e9-26a38c3cd840-utilities\") pod \"redhat-marketplace-hprnw\" (UID: \"554d90cf-d67f-4b4c-83e9-26a38c3cd840\") " pod="openshift-marketplace/redhat-marketplace-hprnw"
Mar 20 13:38:17 crc kubenswrapper[4973]: I0320 13:38:17.288518 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp5s8\" (UniqueName: \"kubernetes.io/projected/554d90cf-d67f-4b4c-83e9-26a38c3cd840-kube-api-access-jp5s8\") pod \"redhat-marketplace-hprnw\" (UID: \"554d90cf-d67f-4b4c-83e9-26a38c3cd840\") " pod="openshift-marketplace/redhat-marketplace-hprnw"
Mar 20 13:38:17 crc kubenswrapper[4973]: I0320 13:38:17.292292 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hprnw"
Mar 20 13:38:17 crc kubenswrapper[4973]: I0320 13:38:17.731063 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hprnw"]
Mar 20 13:38:17 crc kubenswrapper[4973]: W0320 13:38:17.737189 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod554d90cf_d67f_4b4c_83e9_26a38c3cd840.slice/crio-88320f4909669721d3381cdb5bd78cb9c9fb41eac774b8eddf84430d98dd71e5 WatchSource:0}: Error finding container 88320f4909669721d3381cdb5bd78cb9c9fb41eac774b8eddf84430d98dd71e5: Status 404 returned error can't find the container with id 88320f4909669721d3381cdb5bd78cb9c9fb41eac774b8eddf84430d98dd71e5
Mar 20 13:38:18 crc kubenswrapper[4973]: I0320 13:38:18.575959 4973 generic.go:334] "Generic (PLEG): container finished" podID="554d90cf-d67f-4b4c-83e9-26a38c3cd840" containerID="9496fc1c5e7f43d5a1160651050c79df6a746d79ece2aa077982a9d0c2105a35" exitCode=0
Mar 20 13:38:18 crc kubenswrapper[4973]: I0320 13:38:18.576012 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hprnw" event={"ID":"554d90cf-d67f-4b4c-83e9-26a38c3cd840","Type":"ContainerDied","Data":"9496fc1c5e7f43d5a1160651050c79df6a746d79ece2aa077982a9d0c2105a35"}
Mar 20 13:38:18 crc kubenswrapper[4973]: I0320 13:38:18.576053 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hprnw" event={"ID":"554d90cf-d67f-4b4c-83e9-26a38c3cd840","Type":"ContainerStarted","Data":"88320f4909669721d3381cdb5bd78cb9c9fb41eac774b8eddf84430d98dd71e5"}
Mar 20 13:38:19 crc kubenswrapper[4973]: I0320 13:38:19.584553 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hprnw" event={"ID":"554d90cf-d67f-4b4c-83e9-26a38c3cd840","Type":"ContainerStarted","Data":"14eef9fad4f96c3badb01c0221553b4caaccb17cb7e66e6ec1ebc08240f016cb"}
Mar 20 13:38:20 crc kubenswrapper[4973]: I0320 13:38:20.604712 4973 generic.go:334] "Generic (PLEG): container finished" podID="554d90cf-d67f-4b4c-83e9-26a38c3cd840" containerID="14eef9fad4f96c3badb01c0221553b4caaccb17cb7e66e6ec1ebc08240f016cb" exitCode=0
Mar 20 13:38:20 crc kubenswrapper[4973]: I0320 13:38:20.605217 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hprnw" event={"ID":"554d90cf-d67f-4b4c-83e9-26a38c3cd840","Type":"ContainerDied","Data":"14eef9fad4f96c3badb01c0221553b4caaccb17cb7e66e6ec1ebc08240f016cb"}
Mar 20 13:38:21 crc kubenswrapper[4973]: I0320 13:38:21.613558 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hprnw" event={"ID":"554d90cf-d67f-4b4c-83e9-26a38c3cd840","Type":"ContainerStarted","Data":"ca13b912c8290b57d9f66ce2d1b17c9cb43153aeda009df3849ba4d82012e43d"}
Mar 20 13:38:21 crc kubenswrapper[4973]: I0320 13:38:21.637443 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hprnw" podStartSLOduration=2.908259223 podStartE2EDuration="5.637421513s" podCreationTimestamp="2026-03-20 13:38:16 +0000 UTC" firstStartedPulling="2026-03-20 13:38:18.578592629 +0000 UTC m=+1019.322262373" lastFinishedPulling="2026-03-20 13:38:21.307754919 +0000 UTC m=+1022.051424663" observedRunningTime="2026-03-20 13:38:21.629892321 +0000 UTC m=+1022.373562085" watchObservedRunningTime="2026-03-20 13:38:21.637421513 +0000 UTC m=+1022.381091257"
Mar 20 13:38:27 crc kubenswrapper[4973]: I0320 13:38:27.293385 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hprnw"
Mar 20 13:38:27 crc kubenswrapper[4973]: I0320 13:38:27.294047 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hprnw"
Mar 20 13:38:27 crc kubenswrapper[4973]: I0320 13:38:27.335056 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hprnw"
Mar 20 13:38:27 crc kubenswrapper[4973]: I0320 13:38:27.702312 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hprnw"
Mar 20 13:38:27 crc kubenswrapper[4973]: I0320 13:38:27.747725 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hprnw"]
Mar 20 13:38:29 crc kubenswrapper[4973]: I0320 13:38:29.663059 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hprnw" podUID="554d90cf-d67f-4b4c-83e9-26a38c3cd840" containerName="registry-server" containerID="cri-o://ca13b912c8290b57d9f66ce2d1b17c9cb43153aeda009df3849ba4d82012e43d" gracePeriod=2
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.083779 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hprnw"
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.264066 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp5s8\" (UniqueName: \"kubernetes.io/projected/554d90cf-d67f-4b4c-83e9-26a38c3cd840-kube-api-access-jp5s8\") pod \"554d90cf-d67f-4b4c-83e9-26a38c3cd840\" (UID: \"554d90cf-d67f-4b4c-83e9-26a38c3cd840\") "
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.264140 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/554d90cf-d67f-4b4c-83e9-26a38c3cd840-catalog-content\") pod \"554d90cf-d67f-4b4c-83e9-26a38c3cd840\" (UID: \"554d90cf-d67f-4b4c-83e9-26a38c3cd840\") "
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.264326 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/554d90cf-d67f-4b4c-83e9-26a38c3cd840-utilities\") pod \"554d90cf-d67f-4b4c-83e9-26a38c3cd840\" (UID: \"554d90cf-d67f-4b4c-83e9-26a38c3cd840\") "
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.265146 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/554d90cf-d67f-4b4c-83e9-26a38c3cd840-utilities" (OuterVolumeSpecName: "utilities") pod "554d90cf-d67f-4b4c-83e9-26a38c3cd840" (UID: "554d90cf-d67f-4b4c-83e9-26a38c3cd840"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.269639 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/554d90cf-d67f-4b4c-83e9-26a38c3cd840-kube-api-access-jp5s8" (OuterVolumeSpecName: "kube-api-access-jp5s8") pod "554d90cf-d67f-4b4c-83e9-26a38c3cd840" (UID: "554d90cf-d67f-4b4c-83e9-26a38c3cd840"). InnerVolumeSpecName "kube-api-access-jp5s8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.366034 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/554d90cf-d67f-4b4c-83e9-26a38c3cd840-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.366083 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp5s8\" (UniqueName: \"kubernetes.io/projected/554d90cf-d67f-4b4c-83e9-26a38c3cd840-kube-api-access-jp5s8\") on node \"crc\" DevicePath \"\""
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.381070 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/554d90cf-d67f-4b4c-83e9-26a38c3cd840-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "554d90cf-d67f-4b4c-83e9-26a38c3cd840" (UID: "554d90cf-d67f-4b4c-83e9-26a38c3cd840"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.468214 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/554d90cf-d67f-4b4c-83e9-26a38c3cd840-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.671741 4973 generic.go:334] "Generic (PLEG): container finished" podID="554d90cf-d67f-4b4c-83e9-26a38c3cd840" containerID="ca13b912c8290b57d9f66ce2d1b17c9cb43153aeda009df3849ba4d82012e43d" exitCode=0
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.671803 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hprnw"
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.671825 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hprnw" event={"ID":"554d90cf-d67f-4b4c-83e9-26a38c3cd840","Type":"ContainerDied","Data":"ca13b912c8290b57d9f66ce2d1b17c9cb43153aeda009df3849ba4d82012e43d"}
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.672818 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hprnw" event={"ID":"554d90cf-d67f-4b4c-83e9-26a38c3cd840","Type":"ContainerDied","Data":"88320f4909669721d3381cdb5bd78cb9c9fb41eac774b8eddf84430d98dd71e5"}
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.672843 4973 scope.go:117] "RemoveContainer" containerID="ca13b912c8290b57d9f66ce2d1b17c9cb43153aeda009df3849ba4d82012e43d"
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.692973 4973 scope.go:117] "RemoveContainer" containerID="14eef9fad4f96c3badb01c0221553b4caaccb17cb7e66e6ec1ebc08240f016cb"
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.700002 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hprnw"]
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.713636 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hprnw"]
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.725454 4973 scope.go:117] "RemoveContainer" containerID="9496fc1c5e7f43d5a1160651050c79df6a746d79ece2aa077982a9d0c2105a35"
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.745697 4973 scope.go:117] "RemoveContainer" containerID="ca13b912c8290b57d9f66ce2d1b17c9cb43153aeda009df3849ba4d82012e43d"
Mar 20 13:38:30 crc kubenswrapper[4973]: E0320 13:38:30.746144 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca13b912c8290b57d9f66ce2d1b17c9cb43153aeda009df3849ba4d82012e43d\": container with ID starting with ca13b912c8290b57d9f66ce2d1b17c9cb43153aeda009df3849ba4d82012e43d not found: ID does not exist" containerID="ca13b912c8290b57d9f66ce2d1b17c9cb43153aeda009df3849ba4d82012e43d"
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.746178 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca13b912c8290b57d9f66ce2d1b17c9cb43153aeda009df3849ba4d82012e43d"} err="failed to get container status \"ca13b912c8290b57d9f66ce2d1b17c9cb43153aeda009df3849ba4d82012e43d\": rpc error: code = NotFound desc = could not find container \"ca13b912c8290b57d9f66ce2d1b17c9cb43153aeda009df3849ba4d82012e43d\": container with ID starting with ca13b912c8290b57d9f66ce2d1b17c9cb43153aeda009df3849ba4d82012e43d not found: ID does not exist"
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.746198 4973 scope.go:117] "RemoveContainer" containerID="14eef9fad4f96c3badb01c0221553b4caaccb17cb7e66e6ec1ebc08240f016cb"
Mar 20 13:38:30 crc kubenswrapper[4973]: E0320 13:38:30.748682 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14eef9fad4f96c3badb01c0221553b4caaccb17cb7e66e6ec1ebc08240f016cb\": container with ID starting with 14eef9fad4f96c3badb01c0221553b4caaccb17cb7e66e6ec1ebc08240f016cb not found: ID does not exist" containerID="14eef9fad4f96c3badb01c0221553b4caaccb17cb7e66e6ec1ebc08240f016cb"
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.748716 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14eef9fad4f96c3badb01c0221553b4caaccb17cb7e66e6ec1ebc08240f016cb"} err="failed to get container status \"14eef9fad4f96c3badb01c0221553b4caaccb17cb7e66e6ec1ebc08240f016cb\": rpc error: code = NotFound desc = could not find container \"14eef9fad4f96c3badb01c0221553b4caaccb17cb7e66e6ec1ebc08240f016cb\": container with ID starting with 14eef9fad4f96c3badb01c0221553b4caaccb17cb7e66e6ec1ebc08240f016cb not found: ID does not exist"
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.748742 4973 scope.go:117] "RemoveContainer" containerID="9496fc1c5e7f43d5a1160651050c79df6a746d79ece2aa077982a9d0c2105a35"
Mar 20 13:38:30 crc kubenswrapper[4973]: E0320 13:38:30.749636 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9496fc1c5e7f43d5a1160651050c79df6a746d79ece2aa077982a9d0c2105a35\": container with ID starting with 9496fc1c5e7f43d5a1160651050c79df6a746d79ece2aa077982a9d0c2105a35 not found: ID does not exist" containerID="9496fc1c5e7f43d5a1160651050c79df6a746d79ece2aa077982a9d0c2105a35"
Mar 20 13:38:30 crc kubenswrapper[4973]: I0320 13:38:30.749661 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9496fc1c5e7f43d5a1160651050c79df6a746d79ece2aa077982a9d0c2105a35"} err="failed to get container status \"9496fc1c5e7f43d5a1160651050c79df6a746d79ece2aa077982a9d0c2105a35\": rpc error: code = NotFound desc = could not find container \"9496fc1c5e7f43d5a1160651050c79df6a746d79ece2aa077982a9d0c2105a35\": container with ID starting with 9496fc1c5e7f43d5a1160651050c79df6a746d79ece2aa077982a9d0c2105a35 not found: ID does not exist"
Mar 20 13:38:31 crc kubenswrapper[4973]: I0320 13:38:31.810650 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q"]
Mar 20 13:38:31 crc kubenswrapper[4973]: E0320 13:38:31.812157 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="554d90cf-d67f-4b4c-83e9-26a38c3cd840" containerName="extract-utilities"
Mar 20 13:38:31 crc kubenswrapper[4973]: I0320 13:38:31.812276 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="554d90cf-d67f-4b4c-83e9-26a38c3cd840" containerName="extract-utilities"
Mar 20 13:38:31 crc kubenswrapper[4973]: E0320 13:38:31.812403 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="554d90cf-d67f-4b4c-83e9-26a38c3cd840" containerName="registry-server"
Mar 20 13:38:31 crc kubenswrapper[4973]: I0320 13:38:31.812478 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="554d90cf-d67f-4b4c-83e9-26a38c3cd840" containerName="registry-server"
Mar 20 13:38:31 crc kubenswrapper[4973]: E0320 13:38:31.812554 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="554d90cf-d67f-4b4c-83e9-26a38c3cd840" containerName="extract-content"
Mar 20 13:38:31 crc kubenswrapper[4973]: I0320 13:38:31.812626 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="554d90cf-d67f-4b4c-83e9-26a38c3cd840" containerName="extract-content"
Mar 20 13:38:31 crc kubenswrapper[4973]: I0320 13:38:31.812834 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="554d90cf-d67f-4b4c-83e9-26a38c3cd840" containerName="registry-server"
Mar 20 13:38:31 crc kubenswrapper[4973]: I0320 13:38:31.813867 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q"
Mar 20 13:38:31 crc kubenswrapper[4973]: I0320 13:38:31.816492 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc"
Mar 20 13:38:31 crc kubenswrapper[4973]: I0320 13:38:31.826304 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q"]
Mar 20 13:38:31 crc kubenswrapper[4973]: I0320 13:38:31.889025 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dfd8af36-fd3a-4466-b375-585baab50b83-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q\" (UID: \"dfd8af36-fd3a-4466-b375-585baab50b83\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q"
Mar 20 13:38:31 crc kubenswrapper[4973]: I0320 13:38:31.889124 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r6l27\" (UniqueName: \"kubernetes.io/projected/dfd8af36-fd3a-4466-b375-585baab50b83-kube-api-access-r6l27\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q\" (UID: \"dfd8af36-fd3a-4466-b375-585baab50b83\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q"
Mar 20 13:38:31 crc kubenswrapper[4973]: I0320 13:38:31.889174 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfd8af36-fd3a-4466-b375-585baab50b83-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q\" (UID: \"dfd8af36-fd3a-4466-b375-585baab50b83\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q"
Mar 20 13:38:31 crc kubenswrapper[4973]: I0320 13:38:31.960577 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="554d90cf-d67f-4b4c-83e9-26a38c3cd840" path="/var/lib/kubelet/pods/554d90cf-d67f-4b4c-83e9-26a38c3cd840/volumes"
Mar 20 13:38:31 crc kubenswrapper[4973]: I0320 13:38:31.991146 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfd8af36-fd3a-4466-b375-585baab50b83-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q\" (UID: \"dfd8af36-fd3a-4466-b375-585baab50b83\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q"
Mar 20 13:38:31 crc kubenswrapper[4973]: I0320 13:38:31.991262 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dfd8af36-fd3a-4466-b375-585baab50b83-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q\" (UID: \"dfd8af36-fd3a-4466-b375-585baab50b83\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q"
Mar 20 13:38:31 crc kubenswrapper[4973]: I0320 13:38:31.991378 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r6l27\" (UniqueName: \"kubernetes.io/projected/dfd8af36-fd3a-4466-b375-585baab50b83-kube-api-access-r6l27\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q\" (UID: \"dfd8af36-fd3a-4466-b375-585baab50b83\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q"
Mar 20 13:38:31 crc kubenswrapper[4973]: I0320 13:38:31.992118 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfd8af36-fd3a-4466-b375-585baab50b83-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q\" (UID: \"dfd8af36-fd3a-4466-b375-585baab50b83\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q"
Mar 20 13:38:31 crc kubenswrapper[4973]: I0320 13:38:31.992634 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dfd8af36-fd3a-4466-b375-585baab50b83-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q\" (UID: \"dfd8af36-fd3a-4466-b375-585baab50b83\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q"
Mar 20 13:38:32 crc kubenswrapper[4973]: I0320 13:38:32.014577 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r6l27\" (UniqueName: \"kubernetes.io/projected/dfd8af36-fd3a-4466-b375-585baab50b83-kube-api-access-r6l27\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q\" (UID: \"dfd8af36-fd3a-4466-b375-585baab50b83\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q"
Mar 20 13:38:32 crc kubenswrapper[4973]: I0320 13:38:32.130121 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q"
Mar 20 13:38:32 crc kubenswrapper[4973]: I0320 13:38:32.563722 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q"]
Mar 20 13:38:32 crc kubenswrapper[4973]: W0320 13:38:32.568496 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddfd8af36_fd3a_4466_b375_585baab50b83.slice/crio-77276669b7c15682620d587aa5c7262d6798372f04a330d92036cc297beb8aba WatchSource:0}: Error finding container 77276669b7c15682620d587aa5c7262d6798372f04a330d92036cc297beb8aba: Status 404 returned error can't find the container with id 77276669b7c15682620d587aa5c7262d6798372f04a330d92036cc297beb8aba
Mar 20 13:38:32 crc kubenswrapper[4973]: I0320 13:38:32.689748 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q" event={"ID":"dfd8af36-fd3a-4466-b375-585baab50b83","Type":"ContainerStarted","Data":"77276669b7c15682620d587aa5c7262d6798372f04a330d92036cc297beb8aba"}
Mar 20 13:38:33 crc kubenswrapper[4973]: I0320 13:38:33.698003 4973 generic.go:334] "Generic (PLEG): container finished" podID="dfd8af36-fd3a-4466-b375-585baab50b83" containerID="9a27b329b79b7a4254155d556faa781e50f3b7470de00d4d7c57976dc02d12ac" exitCode=0
Mar 20 13:38:33 crc kubenswrapper[4973]: I0320 13:38:33.698053 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q" event={"ID":"dfd8af36-fd3a-4466-b375-585baab50b83","Type":"ContainerDied","Data":"9a27b329b79b7a4254155d556faa781e50f3b7470de00d4d7c57976dc02d12ac"}
Mar 20 13:38:36 crc kubenswrapper[4973]: I0320 13:38:36.176491 4973 scope.go:117] "RemoveContainer" containerID="39fad7f5278031b5a1dfe03e2753d65d0f252c680e396f68ba1175f849a60bdd"
Mar 20 13:38:37 crc kubenswrapper[4973]: I0320 13:38:37.724863 4973 generic.go:334] "Generic (PLEG): container finished" podID="dfd8af36-fd3a-4466-b375-585baab50b83" containerID="6dbb44650790f8b6c89abdded1fe68e4c77e6f5854d55a72623e101c8f385563" exitCode=0
Mar 20 13:38:37 crc kubenswrapper[4973]: I0320 13:38:37.724910 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q" event={"ID":"dfd8af36-fd3a-4466-b375-585baab50b83","Type":"ContainerDied","Data":"6dbb44650790f8b6c89abdded1fe68e4c77e6f5854d55a72623e101c8f385563"}
Mar 20 13:38:39 crc kubenswrapper[4973]: I0320 13:38:39.742171 4973 generic.go:334] "Generic (PLEG): container finished" podID="dfd8af36-fd3a-4466-b375-585baab50b83" containerID="7f9ba00fca6a8c54305fb91677eaf7e0abd2989ccb8c3335c1aa88dfef07098a" exitCode=0
Mar 20 13:38:39 crc kubenswrapper[4973]: I0320 13:38:39.742231 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q" event={"ID":"dfd8af36-fd3a-4466-b375-585baab50b83","Type":"ContainerDied","Data":"7f9ba00fca6a8c54305fb91677eaf7e0abd2989ccb8c3335c1aa88dfef07098a"}
Mar 20 13:38:41 crc kubenswrapper[4973]: I0320 13:38:41.012134 4973 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q" Mar 20 13:38:41 crc kubenswrapper[4973]: I0320 13:38:41.110985 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dfd8af36-fd3a-4466-b375-585baab50b83-bundle\") pod \"dfd8af36-fd3a-4466-b375-585baab50b83\" (UID: \"dfd8af36-fd3a-4466-b375-585baab50b83\") " Mar 20 13:38:41 crc kubenswrapper[4973]: I0320 13:38:41.111206 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfd8af36-fd3a-4466-b375-585baab50b83-util\") pod \"dfd8af36-fd3a-4466-b375-585baab50b83\" (UID: \"dfd8af36-fd3a-4466-b375-585baab50b83\") " Mar 20 13:38:41 crc kubenswrapper[4973]: I0320 13:38:41.111254 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r6l27\" (UniqueName: \"kubernetes.io/projected/dfd8af36-fd3a-4466-b375-585baab50b83-kube-api-access-r6l27\") pod \"dfd8af36-fd3a-4466-b375-585baab50b83\" (UID: \"dfd8af36-fd3a-4466-b375-585baab50b83\") " Mar 20 13:38:41 crc kubenswrapper[4973]: I0320 13:38:41.111797 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfd8af36-fd3a-4466-b375-585baab50b83-bundle" (OuterVolumeSpecName: "bundle") pod "dfd8af36-fd3a-4466-b375-585baab50b83" (UID: "dfd8af36-fd3a-4466-b375-585baab50b83"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:38:41 crc kubenswrapper[4973]: I0320 13:38:41.119493 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfd8af36-fd3a-4466-b375-585baab50b83-kube-api-access-r6l27" (OuterVolumeSpecName: "kube-api-access-r6l27") pod "dfd8af36-fd3a-4466-b375-585baab50b83" (UID: "dfd8af36-fd3a-4466-b375-585baab50b83"). InnerVolumeSpecName "kube-api-access-r6l27". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:38:41 crc kubenswrapper[4973]: I0320 13:38:41.122179 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dfd8af36-fd3a-4466-b375-585baab50b83-util" (OuterVolumeSpecName: "util") pod "dfd8af36-fd3a-4466-b375-585baab50b83" (UID: "dfd8af36-fd3a-4466-b375-585baab50b83"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:38:41 crc kubenswrapper[4973]: I0320 13:38:41.212881 4973 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dfd8af36-fd3a-4466-b375-585baab50b83-util\") on node \"crc\" DevicePath \"\"" Mar 20 13:38:41 crc kubenswrapper[4973]: I0320 13:38:41.212938 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r6l27\" (UniqueName: \"kubernetes.io/projected/dfd8af36-fd3a-4466-b375-585baab50b83-kube-api-access-r6l27\") on node \"crc\" DevicePath \"\"" Mar 20 13:38:41 crc kubenswrapper[4973]: I0320 13:38:41.212949 4973 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dfd8af36-fd3a-4466-b375-585baab50b83-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:38:41 crc kubenswrapper[4973]: I0320 13:38:41.762946 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q" event={"ID":"dfd8af36-fd3a-4466-b375-585baab50b83","Type":"ContainerDied","Data":"77276669b7c15682620d587aa5c7262d6798372f04a330d92036cc297beb8aba"} Mar 20 13:38:41 crc kubenswrapper[4973]: I0320 13:38:41.762986 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77276669b7c15682620d587aa5c7262d6798372f04a330d92036cc297beb8aba" Mar 20 13:38:41 crc kubenswrapper[4973]: I0320 13:38:41.763067 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q" Mar 20 13:38:42 crc kubenswrapper[4973]: I0320 13:38:42.575418 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ln949"] Mar 20 13:38:42 crc kubenswrapper[4973]: E0320 13:38:42.577077 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd8af36-fd3a-4466-b375-585baab50b83" containerName="util" Mar 20 13:38:42 crc kubenswrapper[4973]: I0320 13:38:42.577166 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd8af36-fd3a-4466-b375-585baab50b83" containerName="util" Mar 20 13:38:42 crc kubenswrapper[4973]: E0320 13:38:42.577273 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd8af36-fd3a-4466-b375-585baab50b83" containerName="extract" Mar 20 13:38:42 crc kubenswrapper[4973]: I0320 13:38:42.577368 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd8af36-fd3a-4466-b375-585baab50b83" containerName="extract" Mar 20 13:38:42 crc kubenswrapper[4973]: E0320 13:38:42.577460 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfd8af36-fd3a-4466-b375-585baab50b83" containerName="pull" Mar 20 13:38:42 crc kubenswrapper[4973]: I0320 13:38:42.577524 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfd8af36-fd3a-4466-b375-585baab50b83" containerName="pull" Mar 20 13:38:42 crc kubenswrapper[4973]: I0320 13:38:42.577757 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfd8af36-fd3a-4466-b375-585baab50b83" containerName="extract" Mar 20 13:38:42 crc kubenswrapper[4973]: I0320 13:38:42.579069 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ln949" Mar 20 13:38:42 crc kubenswrapper[4973]: I0320 13:38:42.592375 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ln949"] Mar 20 13:38:42 crc kubenswrapper[4973]: I0320 13:38:42.741381 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a487bfd8-8ca2-44e3-8a45-41137a2c8cd7-utilities\") pod \"community-operators-ln949\" (UID: \"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7\") " pod="openshift-marketplace/community-operators-ln949" Mar 20 13:38:42 crc kubenswrapper[4973]: I0320 13:38:42.741440 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwtc6\" (UniqueName: \"kubernetes.io/projected/a487bfd8-8ca2-44e3-8a45-41137a2c8cd7-kube-api-access-nwtc6\") pod \"community-operators-ln949\" (UID: \"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7\") " pod="openshift-marketplace/community-operators-ln949" Mar 20 13:38:42 crc kubenswrapper[4973]: I0320 13:38:42.741484 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a487bfd8-8ca2-44e3-8a45-41137a2c8cd7-catalog-content\") pod \"community-operators-ln949\" (UID: \"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7\") " pod="openshift-marketplace/community-operators-ln949" Mar 20 13:38:42 crc kubenswrapper[4973]: I0320 13:38:42.843264 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a487bfd8-8ca2-44e3-8a45-41137a2c8cd7-utilities\") pod \"community-operators-ln949\" (UID: \"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7\") " pod="openshift-marketplace/community-operators-ln949" Mar 20 13:38:42 crc kubenswrapper[4973]: I0320 13:38:42.843315 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-nwtc6\" (UniqueName: \"kubernetes.io/projected/a487bfd8-8ca2-44e3-8a45-41137a2c8cd7-kube-api-access-nwtc6\") pod \"community-operators-ln949\" (UID: \"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7\") " pod="openshift-marketplace/community-operators-ln949" Mar 20 13:38:42 crc kubenswrapper[4973]: I0320 13:38:42.843380 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a487bfd8-8ca2-44e3-8a45-41137a2c8cd7-catalog-content\") pod \"community-operators-ln949\" (UID: \"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7\") " pod="openshift-marketplace/community-operators-ln949" Mar 20 13:38:42 crc kubenswrapper[4973]: I0320 13:38:42.843837 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a487bfd8-8ca2-44e3-8a45-41137a2c8cd7-utilities\") pod \"community-operators-ln949\" (UID: \"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7\") " pod="openshift-marketplace/community-operators-ln949" Mar 20 13:38:42 crc kubenswrapper[4973]: I0320 13:38:42.843933 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a487bfd8-8ca2-44e3-8a45-41137a2c8cd7-catalog-content\") pod \"community-operators-ln949\" (UID: \"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7\") " pod="openshift-marketplace/community-operators-ln949" Mar 20 13:38:42 crc kubenswrapper[4973]: I0320 13:38:42.863024 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwtc6\" (UniqueName: \"kubernetes.io/projected/a487bfd8-8ca2-44e3-8a45-41137a2c8cd7-kube-api-access-nwtc6\") pod \"community-operators-ln949\" (UID: \"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7\") " pod="openshift-marketplace/community-operators-ln949" Mar 20 13:38:42 crc kubenswrapper[4973]: I0320 13:38:42.904078 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ln949" Mar 20 13:38:43 crc kubenswrapper[4973]: I0320 13:38:43.321230 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:38:43 crc kubenswrapper[4973]: I0320 13:38:43.321608 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:38:43 crc kubenswrapper[4973]: I0320 13:38:43.321657 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 13:38:43 crc kubenswrapper[4973]: I0320 13:38:43.322403 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f898eb1e5a0799379b7d7bcd473943134d5addff2e02fbb8f8a3d4d7eb5c66a6"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:38:43 crc kubenswrapper[4973]: I0320 13:38:43.322486 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" containerID="cri-o://f898eb1e5a0799379b7d7bcd473943134d5addff2e02fbb8f8a3d4d7eb5c66a6" gracePeriod=600 Mar 20 13:38:43 crc kubenswrapper[4973]: I0320 13:38:43.430624 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/community-operators-ln949"] Mar 20 13:38:43 crc kubenswrapper[4973]: I0320 13:38:43.779153 4973 generic.go:334] "Generic (PLEG): container finished" podID="a487bfd8-8ca2-44e3-8a45-41137a2c8cd7" containerID="569388fdc02a181a3daeae452b70b44a4cffd5104d1b334a2cf160b2176f6b53" exitCode=0 Mar 20 13:38:43 crc kubenswrapper[4973]: I0320 13:38:43.779278 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ln949" event={"ID":"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7","Type":"ContainerDied","Data":"569388fdc02a181a3daeae452b70b44a4cffd5104d1b334a2cf160b2176f6b53"} Mar 20 13:38:43 crc kubenswrapper[4973]: I0320 13:38:43.779363 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ln949" event={"ID":"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7","Type":"ContainerStarted","Data":"df6bbe05bd9b64110cdae5297e1cbeba150b02f5ffdfc0c048ba7fadbb66cc2f"} Mar 20 13:38:43 crc kubenswrapper[4973]: I0320 13:38:43.781940 4973 generic.go:334] "Generic (PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="f898eb1e5a0799379b7d7bcd473943134d5addff2e02fbb8f8a3d4d7eb5c66a6" exitCode=0 Mar 20 13:38:43 crc kubenswrapper[4973]: I0320 13:38:43.781996 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"f898eb1e5a0799379b7d7bcd473943134d5addff2e02fbb8f8a3d4d7eb5c66a6"} Mar 20 13:38:43 crc kubenswrapper[4973]: I0320 13:38:43.782040 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"6ee146100b8d3ae20a6493daed451ee0c8c9f7d655dfaba5ce2b9446864d5f7d"} Mar 20 13:38:43 crc kubenswrapper[4973]: I0320 13:38:43.782063 4973 scope.go:117] "RemoveContainer" 
containerID="26d2f8d0ba44652122f03dbe7cb2777fe726b59947a9f579b0a01f84b56a0f0a" Mar 20 13:38:44 crc kubenswrapper[4973]: I0320 13:38:44.789740 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ln949" event={"ID":"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7","Type":"ContainerStarted","Data":"74706db3b8d59f0206f59123ba671a76a3add1d751330d23c8a96e4f0e650498"} Mar 20 13:38:45 crc kubenswrapper[4973]: I0320 13:38:45.806145 4973 generic.go:334] "Generic (PLEG): container finished" podID="a487bfd8-8ca2-44e3-8a45-41137a2c8cd7" containerID="74706db3b8d59f0206f59123ba671a76a3add1d751330d23c8a96e4f0e650498" exitCode=0 Mar 20 13:38:45 crc kubenswrapper[4973]: I0320 13:38:45.806320 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ln949" event={"ID":"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7","Type":"ContainerDied","Data":"74706db3b8d59f0206f59123ba671a76a3add1d751330d23c8a96e4f0e650498"} Mar 20 13:38:46 crc kubenswrapper[4973]: I0320 13:38:46.818561 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ln949" event={"ID":"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7","Type":"ContainerStarted","Data":"e471a4311f3fb57826354703a93b5d86a1be2728285f18f92d0a0a5c965c4e72"} Mar 20 13:38:46 crc kubenswrapper[4973]: I0320 13:38:46.842594 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ln949" podStartSLOduration=2.361354494 podStartE2EDuration="4.842576964s" podCreationTimestamp="2026-03-20 13:38:42 +0000 UTC" firstStartedPulling="2026-03-20 13:38:43.782513348 +0000 UTC m=+1044.526183082" lastFinishedPulling="2026-03-20 13:38:46.263735808 +0000 UTC m=+1047.007405552" observedRunningTime="2026-03-20 13:38:46.83605234 +0000 UTC m=+1047.579722084" watchObservedRunningTime="2026-03-20 13:38:46.842576964 +0000 UTC m=+1047.586246708" Mar 20 13:38:48 crc kubenswrapper[4973]: I0320 
13:38:48.230525 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-nvs2k"] Mar 20 13:38:48 crc kubenswrapper[4973]: I0320 13:38:48.236671 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-nvs2k" Mar 20 13:38:48 crc kubenswrapper[4973]: I0320 13:38:48.240705 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-mzxsz" Mar 20 13:38:48 crc kubenswrapper[4973]: I0320 13:38:48.240956 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 20 13:38:48 crc kubenswrapper[4973]: I0320 13:38:48.241069 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 20 13:38:48 crc kubenswrapper[4973]: I0320 13:38:48.243192 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-nvs2k"] Mar 20 13:38:48 crc kubenswrapper[4973]: I0320 13:38:48.336379 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f86ws\" (UniqueName: \"kubernetes.io/projected/8b93e9c4-b7b6-407a-b5c7-9fec8ed5a64a-kube-api-access-f86ws\") pod \"nmstate-operator-796d4cfff4-nvs2k\" (UID: \"8b93e9c4-b7b6-407a-b5c7-9fec8ed5a64a\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-nvs2k" Mar 20 13:38:48 crc kubenswrapper[4973]: I0320 13:38:48.437770 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f86ws\" (UniqueName: \"kubernetes.io/projected/8b93e9c4-b7b6-407a-b5c7-9fec8ed5a64a-kube-api-access-f86ws\") pod \"nmstate-operator-796d4cfff4-nvs2k\" (UID: \"8b93e9c4-b7b6-407a-b5c7-9fec8ed5a64a\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-nvs2k" Mar 20 13:38:48 crc kubenswrapper[4973]: I0320 13:38:48.477271 4973 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-f86ws\" (UniqueName: \"kubernetes.io/projected/8b93e9c4-b7b6-407a-b5c7-9fec8ed5a64a-kube-api-access-f86ws\") pod \"nmstate-operator-796d4cfff4-nvs2k\" (UID: \"8b93e9c4-b7b6-407a-b5c7-9fec8ed5a64a\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-nvs2k" Mar 20 13:38:48 crc kubenswrapper[4973]: I0320 13:38:48.559509 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-nvs2k" Mar 20 13:38:49 crc kubenswrapper[4973]: I0320 13:38:49.178098 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-nvs2k"] Mar 20 13:38:49 crc kubenswrapper[4973]: I0320 13:38:49.847602 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-nvs2k" event={"ID":"8b93e9c4-b7b6-407a-b5c7-9fec8ed5a64a","Type":"ContainerStarted","Data":"4b7db8c97502cde35168d0c9adfecacc5655084567bbc3159c1bb12cb397ca75"} Mar 20 13:38:51 crc kubenswrapper[4973]: I0320 13:38:51.867843 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-nvs2k" event={"ID":"8b93e9c4-b7b6-407a-b5c7-9fec8ed5a64a","Type":"ContainerStarted","Data":"5145c275018861a679e233eb90f4e07d5b30dbbea2e70d061d66b5603e573efd"} Mar 20 13:38:51 crc kubenswrapper[4973]: I0320 13:38:51.888953 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-nvs2k" podStartSLOduration=2.010567121 podStartE2EDuration="3.88893518s" podCreationTimestamp="2026-03-20 13:38:48 +0000 UTC" firstStartedPulling="2026-03-20 13:38:49.189520129 +0000 UTC m=+1049.933189873" lastFinishedPulling="2026-03-20 13:38:51.067888198 +0000 UTC m=+1051.811557932" observedRunningTime="2026-03-20 13:38:51.882930968 +0000 UTC m=+1052.626600742" watchObservedRunningTime="2026-03-20 13:38:51.88893518 +0000 UTC m=+1052.632604944" Mar 20 13:38:52 
crc kubenswrapper[4973]: I0320 13:38:52.802118 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-pmf2g"] Mar 20 13:38:52 crc kubenswrapper[4973]: I0320 13:38:52.803945 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pmf2g" Mar 20 13:38:52 crc kubenswrapper[4973]: I0320 13:38:52.806631 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-4rp2s" Mar 20 13:38:52 crc kubenswrapper[4973]: I0320 13:38:52.812146 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-pmf2g"] Mar 20 13:38:52 crc kubenswrapper[4973]: I0320 13:38:52.829227 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-9c8nv"] Mar 20 13:38:52 crc kubenswrapper[4973]: I0320 13:38:52.830409 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9c8nv" Mar 20 13:38:52 crc kubenswrapper[4973]: I0320 13:38:52.833706 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 20 13:38:52 crc kubenswrapper[4973]: I0320 13:38:52.882365 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-9c8nv"] Mar 20 13:38:52 crc kubenswrapper[4973]: I0320 13:38:52.904930 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ln949" Mar 20 13:38:52 crc kubenswrapper[4973]: I0320 13:38:52.905180 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ln949" Mar 20 13:38:52 crc kubenswrapper[4973]: I0320 13:38:52.912458 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-4z4zn"] Mar 20 13:38:52 crc 
kubenswrapper[4973]: I0320 13:38:52.913624 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-4z4zn" Mar 20 13:38:52 crc kubenswrapper[4973]: I0320 13:38:52.931218 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2437569f-a833-4666-a051-db0d4818cc5f-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-9c8nv\" (UID: \"2437569f-a833-4666-a051-db0d4818cc5f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9c8nv" Mar 20 13:38:52 crc kubenswrapper[4973]: I0320 13:38:52.931310 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk9c7\" (UniqueName: \"kubernetes.io/projected/0f84a71d-ca4b-44e1-9008-4a5f34dbaf9d-kube-api-access-jk9c7\") pod \"nmstate-metrics-9b8c8685d-pmf2g\" (UID: \"0f84a71d-ca4b-44e1-9008-4a5f34dbaf9d\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pmf2g" Mar 20 13:38:52 crc kubenswrapper[4973]: I0320 13:38:52.931377 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqt8x\" (UniqueName: \"kubernetes.io/projected/2437569f-a833-4666-a051-db0d4818cc5f-kube-api-access-zqt8x\") pod \"nmstate-webhook-5f558f5558-9c8nv\" (UID: \"2437569f-a833-4666-a051-db0d4818cc5f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9c8nv" Mar 20 13:38:52 crc kubenswrapper[4973]: I0320 13:38:52.985649 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ln949" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.022791 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-bqcmm"] Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.035105 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bqcmm" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.037046 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5f74ece1-d945-42b2-a93f-622bb0c63aa7-dbus-socket\") pod \"nmstate-handler-4z4zn\" (UID: \"5f74ece1-d945-42b2-a93f-622bb0c63aa7\") " pod="openshift-nmstate/nmstate-handler-4z4zn" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.037163 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk9c7\" (UniqueName: \"kubernetes.io/projected/0f84a71d-ca4b-44e1-9008-4a5f34dbaf9d-kube-api-access-jk9c7\") pod \"nmstate-metrics-9b8c8685d-pmf2g\" (UID: \"0f84a71d-ca4b-44e1-9008-4a5f34dbaf9d\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pmf2g" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.037333 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq76w\" (UniqueName: \"kubernetes.io/projected/5f74ece1-d945-42b2-a93f-622bb0c63aa7-kube-api-access-vq76w\") pod \"nmstate-handler-4z4zn\" (UID: \"5f74ece1-d945-42b2-a93f-622bb0c63aa7\") " pod="openshift-nmstate/nmstate-handler-4z4zn" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.046633 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqt8x\" (UniqueName: \"kubernetes.io/projected/2437569f-a833-4666-a051-db0d4818cc5f-kube-api-access-zqt8x\") pod \"nmstate-webhook-5f558f5558-9c8nv\" (UID: \"2437569f-a833-4666-a051-db0d4818cc5f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9c8nv" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.046789 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/5f74ece1-d945-42b2-a93f-622bb0c63aa7-nmstate-lock\") pod \"nmstate-handler-4z4zn\" (UID: \"5f74ece1-d945-42b2-a93f-622bb0c63aa7\") " pod="openshift-nmstate/nmstate-handler-4z4zn" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.047299 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2437569f-a833-4666-a051-db0d4818cc5f-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-9c8nv\" (UID: \"2437569f-a833-4666-a051-db0d4818cc5f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9c8nv" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.047588 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5f74ece1-d945-42b2-a93f-622bb0c63aa7-ovs-socket\") pod \"nmstate-handler-4z4zn\" (UID: \"5f74ece1-d945-42b2-a93f-622bb0c63aa7\") " pod="openshift-nmstate/nmstate-handler-4z4zn" Mar 20 13:38:53 crc kubenswrapper[4973]: E0320 13:38:53.064304 4973 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 20 13:38:53 crc kubenswrapper[4973]: E0320 13:38:53.064429 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2437569f-a833-4666-a051-db0d4818cc5f-tls-key-pair podName:2437569f-a833-4666-a051-db0d4818cc5f nodeName:}" failed. No retries permitted until 2026-03-20 13:38:53.56440519 +0000 UTC m=+1054.308074934 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/2437569f-a833-4666-a051-db0d4818cc5f-tls-key-pair") pod "nmstate-webhook-5f558f5558-9c8nv" (UID: "2437569f-a833-4666-a051-db0d4818cc5f") : secret "openshift-nmstate-webhook" not found Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.071985 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-mz627" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.072400 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.073070 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.145076 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-bqcmm"] Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.147069 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk9c7\" (UniqueName: \"kubernetes.io/projected/0f84a71d-ca4b-44e1-9008-4a5f34dbaf9d-kube-api-access-jk9c7\") pod \"nmstate-metrics-9b8c8685d-pmf2g\" (UID: \"0f84a71d-ca4b-44e1-9008-4a5f34dbaf9d\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pmf2g" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.152860 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqt8x\" (UniqueName: \"kubernetes.io/projected/2437569f-a833-4666-a051-db0d4818cc5f-kube-api-access-zqt8x\") pod \"nmstate-webhook-5f558f5558-9c8nv\" (UID: \"2437569f-a833-4666-a051-db0d4818cc5f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9c8nv" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.167146 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: 
\"kubernetes.io/host-path/5f74ece1-d945-42b2-a93f-622bb0c63aa7-nmstate-lock\") pod \"nmstate-handler-4z4zn\" (UID: \"5f74ece1-d945-42b2-a93f-622bb0c63aa7\") " pod="openshift-nmstate/nmstate-handler-4z4zn" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.167224 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5f74ece1-d945-42b2-a93f-622bb0c63aa7-ovs-socket\") pod \"nmstate-handler-4z4zn\" (UID: \"5f74ece1-d945-42b2-a93f-622bb0c63aa7\") " pod="openshift-nmstate/nmstate-handler-4z4zn" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.167270 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5f74ece1-d945-42b2-a93f-622bb0c63aa7-dbus-socket\") pod \"nmstate-handler-4z4zn\" (UID: \"5f74ece1-d945-42b2-a93f-622bb0c63aa7\") " pod="openshift-nmstate/nmstate-handler-4z4zn" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.167301 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/720b7a25-1a14-473b-af5e-0d3e764074ee-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-bqcmm\" (UID: \"720b7a25-1a14-473b-af5e-0d3e764074ee\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bqcmm" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.167365 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq76w\" (UniqueName: \"kubernetes.io/projected/5f74ece1-d945-42b2-a93f-622bb0c63aa7-kube-api-access-vq76w\") pod \"nmstate-handler-4z4zn\" (UID: \"5f74ece1-d945-42b2-a93f-622bb0c63aa7\") " pod="openshift-nmstate/nmstate-handler-4z4zn" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.167385 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5lq8c\" (UniqueName: 
\"kubernetes.io/projected/720b7a25-1a14-473b-af5e-0d3e764074ee-kube-api-access-5lq8c\") pod \"nmstate-console-plugin-86f58fcf4-bqcmm\" (UID: \"720b7a25-1a14-473b-af5e-0d3e764074ee\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bqcmm" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.167428 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/720b7a25-1a14-473b-af5e-0d3e764074ee-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-bqcmm\" (UID: \"720b7a25-1a14-473b-af5e-0d3e764074ee\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bqcmm" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.167522 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/5f74ece1-d945-42b2-a93f-622bb0c63aa7-nmstate-lock\") pod \"nmstate-handler-4z4zn\" (UID: \"5f74ece1-d945-42b2-a93f-622bb0c63aa7\") " pod="openshift-nmstate/nmstate-handler-4z4zn" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.167550 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/5f74ece1-d945-42b2-a93f-622bb0c63aa7-ovs-socket\") pod \"nmstate-handler-4z4zn\" (UID: \"5f74ece1-d945-42b2-a93f-622bb0c63aa7\") " pod="openshift-nmstate/nmstate-handler-4z4zn" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.167840 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/5f74ece1-d945-42b2-a93f-622bb0c63aa7-dbus-socket\") pod \"nmstate-handler-4z4zn\" (UID: \"5f74ece1-d945-42b2-a93f-622bb0c63aa7\") " pod="openshift-nmstate/nmstate-handler-4z4zn" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.209077 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq76w\" (UniqueName: 
\"kubernetes.io/projected/5f74ece1-d945-42b2-a93f-622bb0c63aa7-kube-api-access-vq76w\") pod \"nmstate-handler-4z4zn\" (UID: \"5f74ece1-d945-42b2-a93f-622bb0c63aa7\") " pod="openshift-nmstate/nmstate-handler-4z4zn" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.250214 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-4z4zn" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.269103 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/720b7a25-1a14-473b-af5e-0d3e764074ee-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-bqcmm\" (UID: \"720b7a25-1a14-473b-af5e-0d3e764074ee\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bqcmm" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.269450 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5lq8c\" (UniqueName: \"kubernetes.io/projected/720b7a25-1a14-473b-af5e-0d3e764074ee-kube-api-access-5lq8c\") pod \"nmstate-console-plugin-86f58fcf4-bqcmm\" (UID: \"720b7a25-1a14-473b-af5e-0d3e764074ee\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bqcmm" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.269501 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/720b7a25-1a14-473b-af5e-0d3e764074ee-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-bqcmm\" (UID: \"720b7a25-1a14-473b-af5e-0d3e764074ee\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bqcmm" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.270095 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/720b7a25-1a14-473b-af5e-0d3e764074ee-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-bqcmm\" (UID: \"720b7a25-1a14-473b-af5e-0d3e764074ee\") 
" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bqcmm" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.285397 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/720b7a25-1a14-473b-af5e-0d3e764074ee-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-bqcmm\" (UID: \"720b7a25-1a14-473b-af5e-0d3e764074ee\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bqcmm" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.298790 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7794c74589-2c6lq"] Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.299795 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.315203 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7794c74589-2c6lq"] Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.315930 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5lq8c\" (UniqueName: \"kubernetes.io/projected/720b7a25-1a14-473b-af5e-0d3e764074ee-kube-api-access-5lq8c\") pod \"nmstate-console-plugin-86f58fcf4-bqcmm\" (UID: \"720b7a25-1a14-473b-af5e-0d3e764074ee\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bqcmm" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.369986 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bqcmm" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.424559 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pmf2g" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.472388 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-oauth-serving-cert\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.472455 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a8f8e46-e4bd-440c-87dd-046f52b26e69-console-oauth-config\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.472538 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-console-config\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.472716 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxt5v\" (UniqueName: \"kubernetes.io/projected/4a8f8e46-e4bd-440c-87dd-046f52b26e69-kube-api-access-sxt5v\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.472786 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-trusted-ca-bundle\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.472923 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-service-ca\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.473022 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a8f8e46-e4bd-440c-87dd-046f52b26e69-console-serving-cert\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.574260 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a8f8e46-e4bd-440c-87dd-046f52b26e69-console-serving-cert\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.574328 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-oauth-serving-cert\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.574413 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/4a8f8e46-e4bd-440c-87dd-046f52b26e69-console-oauth-config\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.574436 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-console-config\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.574464 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxt5v\" (UniqueName: \"kubernetes.io/projected/4a8f8e46-e4bd-440c-87dd-046f52b26e69-kube-api-access-sxt5v\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.574488 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-trusted-ca-bundle\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.574513 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2437569f-a833-4666-a051-db0d4818cc5f-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-9c8nv\" (UID: \"2437569f-a833-4666-a051-db0d4818cc5f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9c8nv" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.574541 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-service-ca\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.575445 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-service-ca\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.579926 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a8f8e46-e4bd-440c-87dd-046f52b26e69-console-serving-cert\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.580589 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-oauth-serving-cert\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.584755 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a8f8e46-e4bd-440c-87dd-046f52b26e69-console-oauth-config\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.585263 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-console-config\") 
pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.586260 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-trusted-ca-bundle\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.590956 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/2437569f-a833-4666-a051-db0d4818cc5f-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-9c8nv\" (UID: \"2437569f-a833-4666-a051-db0d4818cc5f\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-9c8nv" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.631327 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxt5v\" (UniqueName: \"kubernetes.io/projected/4a8f8e46-e4bd-440c-87dd-046f52b26e69-kube-api-access-sxt5v\") pod \"console-7794c74589-2c6lq\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.658071 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.703543 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-bqcmm"] Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.775293 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9c8nv" Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.893845 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-4z4zn" event={"ID":"5f74ece1-d945-42b2-a93f-622bb0c63aa7","Type":"ContainerStarted","Data":"8af8a179b0e158393b2593e8f0093dfccd872544a859c033f3fe98f70f20da94"} Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.896492 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bqcmm" event={"ID":"720b7a25-1a14-473b-af5e-0d3e764074ee","Type":"ContainerStarted","Data":"2932c1909919448c506858e9017d121129fd33539f9582958571ba206ac0292e"} Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.937298 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-pmf2g"] Mar 20 13:38:53 crc kubenswrapper[4973]: W0320 13:38:53.958295 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f84a71d_ca4b_44e1_9008_4a5f34dbaf9d.slice/crio-733e6a1c5e9484abae430418af95e9ee19dd736c96eac77a6fa336116dbd97f4 WatchSource:0}: Error finding container 733e6a1c5e9484abae430418af95e9ee19dd736c96eac77a6fa336116dbd97f4: Status 404 returned error can't find the container with id 733e6a1c5e9484abae430418af95e9ee19dd736c96eac77a6fa336116dbd97f4 Mar 20 13:38:53 crc kubenswrapper[4973]: I0320 13:38:53.965074 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ln949" Mar 20 13:38:54 crc kubenswrapper[4973]: I0320 13:38:54.156243 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7794c74589-2c6lq"] Mar 20 13:38:54 crc kubenswrapper[4973]: I0320 13:38:54.290928 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-9c8nv"] Mar 20 13:38:54 crc 
kubenswrapper[4973]: I0320 13:38:54.904031 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9c8nv" event={"ID":"2437569f-a833-4666-a051-db0d4818cc5f","Type":"ContainerStarted","Data":"16f8dc5b62fa3733c9f6a2b95ef48b59dbfaacaba959af24e1233a43353117d3"} Mar 20 13:38:54 crc kubenswrapper[4973]: I0320 13:38:54.906192 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7794c74589-2c6lq" event={"ID":"4a8f8e46-e4bd-440c-87dd-046f52b26e69","Type":"ContainerStarted","Data":"0c11382906a33a4d1a09b8ce97b60d980b2fb98cdca388f8ba9c13611675de20"} Mar 20 13:38:54 crc kubenswrapper[4973]: I0320 13:38:54.906232 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7794c74589-2c6lq" event={"ID":"4a8f8e46-e4bd-440c-87dd-046f52b26e69","Type":"ContainerStarted","Data":"e28145cdfc9f3cc219150b38f20e11ad9d9b52fec6266a1bc807bb78a65e096c"} Mar 20 13:38:54 crc kubenswrapper[4973]: I0320 13:38:54.907628 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pmf2g" event={"ID":"0f84a71d-ca4b-44e1-9008-4a5f34dbaf9d","Type":"ContainerStarted","Data":"733e6a1c5e9484abae430418af95e9ee19dd736c96eac77a6fa336116dbd97f4"} Mar 20 13:38:54 crc kubenswrapper[4973]: I0320 13:38:54.928837 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7794c74589-2c6lq" podStartSLOduration=1.928818235 podStartE2EDuration="1.928818235s" podCreationTimestamp="2026-03-20 13:38:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:38:54.925171016 +0000 UTC m=+1055.668840760" watchObservedRunningTime="2026-03-20 13:38:54.928818235 +0000 UTC m=+1055.672487979" Mar 20 13:38:55 crc kubenswrapper[4973]: I0320 13:38:55.374134 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-ln949"] Mar 20 13:38:56 crc kubenswrapper[4973]: I0320 13:38:56.923840 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ln949" podUID="a487bfd8-8ca2-44e3-8a45-41137a2c8cd7" containerName="registry-server" containerID="cri-o://e471a4311f3fb57826354703a93b5d86a1be2728285f18f92d0a0a5c965c4e72" gracePeriod=2 Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.366642 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ln949" Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.563272 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a487bfd8-8ca2-44e3-8a45-41137a2c8cd7-catalog-content\") pod \"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7\" (UID: \"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7\") " Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.563324 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a487bfd8-8ca2-44e3-8a45-41137a2c8cd7-utilities\") pod \"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7\" (UID: \"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7\") " Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.563400 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nwtc6\" (UniqueName: \"kubernetes.io/projected/a487bfd8-8ca2-44e3-8a45-41137a2c8cd7-kube-api-access-nwtc6\") pod \"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7\" (UID: \"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7\") " Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.564722 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a487bfd8-8ca2-44e3-8a45-41137a2c8cd7-utilities" (OuterVolumeSpecName: "utilities") pod "a487bfd8-8ca2-44e3-8a45-41137a2c8cd7" (UID: 
"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.568904 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a487bfd8-8ca2-44e3-8a45-41137a2c8cd7-kube-api-access-nwtc6" (OuterVolumeSpecName: "kube-api-access-nwtc6") pod "a487bfd8-8ca2-44e3-8a45-41137a2c8cd7" (UID: "a487bfd8-8ca2-44e3-8a45-41137a2c8cd7"). InnerVolumeSpecName "kube-api-access-nwtc6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.614986 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a487bfd8-8ca2-44e3-8a45-41137a2c8cd7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a487bfd8-8ca2-44e3-8a45-41137a2c8cd7" (UID: "a487bfd8-8ca2-44e3-8a45-41137a2c8cd7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.665458 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a487bfd8-8ca2-44e3-8a45-41137a2c8cd7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.665492 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a487bfd8-8ca2-44e3-8a45-41137a2c8cd7-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.665504 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nwtc6\" (UniqueName: \"kubernetes.io/projected/a487bfd8-8ca2-44e3-8a45-41137a2c8cd7-kube-api-access-nwtc6\") on node \"crc\" DevicePath \"\"" Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.934857 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-webhook-5f558f5558-9c8nv" event={"ID":"2437569f-a833-4666-a051-db0d4818cc5f","Type":"ContainerStarted","Data":"e9555023ae7582f2c5add9537976d61d304b4fbbbbee6f8e70f0f4243ce68fbd"} Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.935039 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9c8nv" Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.936289 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-4z4zn" event={"ID":"5f74ece1-d945-42b2-a93f-622bb0c63aa7","Type":"ContainerStarted","Data":"f570988661b70bfbfce6d3c37bdc99278978e5767fbd406060499734014d296d"} Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.936385 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-4z4zn" Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.939430 4973 generic.go:334] "Generic (PLEG): container finished" podID="a487bfd8-8ca2-44e3-8a45-41137a2c8cd7" containerID="e471a4311f3fb57826354703a93b5d86a1be2728285f18f92d0a0a5c965c4e72" exitCode=0 Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.939482 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ln949" Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.939499 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ln949" event={"ID":"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7","Type":"ContainerDied","Data":"e471a4311f3fb57826354703a93b5d86a1be2728285f18f92d0a0a5c965c4e72"} Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.939526 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ln949" event={"ID":"a487bfd8-8ca2-44e3-8a45-41137a2c8cd7","Type":"ContainerDied","Data":"df6bbe05bd9b64110cdae5297e1cbeba150b02f5ffdfc0c048ba7fadbb66cc2f"} Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.939543 4973 scope.go:117] "RemoveContainer" containerID="e471a4311f3fb57826354703a93b5d86a1be2728285f18f92d0a0a5c965c4e72" Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.954553 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9c8nv" podStartSLOduration=3.300487317 podStartE2EDuration="5.954531099s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="2026-03-20 13:38:54.295918436 +0000 UTC m=+1055.039588180" lastFinishedPulling="2026-03-20 13:38:56.949962178 +0000 UTC m=+1057.693631962" observedRunningTime="2026-03-20 13:38:57.950043738 +0000 UTC m=+1058.693713502" watchObservedRunningTime="2026-03-20 13:38:57.954531099 +0000 UTC m=+1058.698200843" Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.965352 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pmf2g" event={"ID":"0f84a71d-ca4b-44e1-9008-4a5f34dbaf9d","Type":"ContainerStarted","Data":"007b0686601ae9b2adf3087a1caf4589f145a781770025c1fd697a629102605e"} Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.965672 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bqcmm" event={"ID":"720b7a25-1a14-473b-af5e-0d3e764074ee","Type":"ContainerStarted","Data":"573d0d9ef9a68eaa415ad200c2da6ac89d1a46d368f7b95c951e31a46b6870a5"} Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.990972 4973 scope.go:117] "RemoveContainer" containerID="74706db3b8d59f0206f59123ba671a76a3add1d751330d23c8a96e4f0e650498" Mar 20 13:38:57 crc kubenswrapper[4973]: I0320 13:38:57.993734 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-4z4zn" podStartSLOduration=2.333407474 podStartE2EDuration="5.993718042s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="2026-03-20 13:38:53.300976604 +0000 UTC m=+1054.044646348" lastFinishedPulling="2026-03-20 13:38:56.961287162 +0000 UTC m=+1057.704956916" observedRunningTime="2026-03-20 13:38:57.991599174 +0000 UTC m=+1058.735268918" watchObservedRunningTime="2026-03-20 13:38:57.993718042 +0000 UTC m=+1058.737387786" Mar 20 13:38:58 crc kubenswrapper[4973]: I0320 13:38:58.012471 4973 scope.go:117] "RemoveContainer" containerID="569388fdc02a181a3daeae452b70b44a4cffd5104d1b334a2cf160b2176f6b53" Mar 20 13:38:58 crc kubenswrapper[4973]: I0320 13:38:58.016005 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-bqcmm" podStartSLOduration=2.777972204 podStartE2EDuration="6.01599033s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="2026-03-20 13:38:53.710126063 +0000 UTC m=+1054.453795807" lastFinishedPulling="2026-03-20 13:38:56.948144179 +0000 UTC m=+1057.691813933" observedRunningTime="2026-03-20 13:38:58.013197835 +0000 UTC m=+1058.756867599" watchObservedRunningTime="2026-03-20 13:38:58.01599033 +0000 UTC m=+1058.759660074" Mar 20 13:38:58 crc kubenswrapper[4973]: I0320 13:38:58.042541 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-ln949"] Mar 20 13:38:58 crc kubenswrapper[4973]: I0320 13:38:58.048871 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ln949"] Mar 20 13:38:58 crc kubenswrapper[4973]: I0320 13:38:58.064038 4973 scope.go:117] "RemoveContainer" containerID="e471a4311f3fb57826354703a93b5d86a1be2728285f18f92d0a0a5c965c4e72" Mar 20 13:38:58 crc kubenswrapper[4973]: E0320 13:38:58.064386 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e471a4311f3fb57826354703a93b5d86a1be2728285f18f92d0a0a5c965c4e72\": container with ID starting with e471a4311f3fb57826354703a93b5d86a1be2728285f18f92d0a0a5c965c4e72 not found: ID does not exist" containerID="e471a4311f3fb57826354703a93b5d86a1be2728285f18f92d0a0a5c965c4e72" Mar 20 13:38:58 crc kubenswrapper[4973]: I0320 13:38:58.064424 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e471a4311f3fb57826354703a93b5d86a1be2728285f18f92d0a0a5c965c4e72"} err="failed to get container status \"e471a4311f3fb57826354703a93b5d86a1be2728285f18f92d0a0a5c965c4e72\": rpc error: code = NotFound desc = could not find container \"e471a4311f3fb57826354703a93b5d86a1be2728285f18f92d0a0a5c965c4e72\": container with ID starting with e471a4311f3fb57826354703a93b5d86a1be2728285f18f92d0a0a5c965c4e72 not found: ID does not exist" Mar 20 13:38:58 crc kubenswrapper[4973]: I0320 13:38:58.064445 4973 scope.go:117] "RemoveContainer" containerID="74706db3b8d59f0206f59123ba671a76a3add1d751330d23c8a96e4f0e650498" Mar 20 13:38:58 crc kubenswrapper[4973]: E0320 13:38:58.064714 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74706db3b8d59f0206f59123ba671a76a3add1d751330d23c8a96e4f0e650498\": container with ID starting with 
74706db3b8d59f0206f59123ba671a76a3add1d751330d23c8a96e4f0e650498 not found: ID does not exist" containerID="74706db3b8d59f0206f59123ba671a76a3add1d751330d23c8a96e4f0e650498" Mar 20 13:38:58 crc kubenswrapper[4973]: I0320 13:38:58.064736 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74706db3b8d59f0206f59123ba671a76a3add1d751330d23c8a96e4f0e650498"} err="failed to get container status \"74706db3b8d59f0206f59123ba671a76a3add1d751330d23c8a96e4f0e650498\": rpc error: code = NotFound desc = could not find container \"74706db3b8d59f0206f59123ba671a76a3add1d751330d23c8a96e4f0e650498\": container with ID starting with 74706db3b8d59f0206f59123ba671a76a3add1d751330d23c8a96e4f0e650498 not found: ID does not exist" Mar 20 13:38:58 crc kubenswrapper[4973]: I0320 13:38:58.064749 4973 scope.go:117] "RemoveContainer" containerID="569388fdc02a181a3daeae452b70b44a4cffd5104d1b334a2cf160b2176f6b53" Mar 20 13:38:58 crc kubenswrapper[4973]: E0320 13:38:58.064974 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"569388fdc02a181a3daeae452b70b44a4cffd5104d1b334a2cf160b2176f6b53\": container with ID starting with 569388fdc02a181a3daeae452b70b44a4cffd5104d1b334a2cf160b2176f6b53 not found: ID does not exist" containerID="569388fdc02a181a3daeae452b70b44a4cffd5104d1b334a2cf160b2176f6b53" Mar 20 13:38:58 crc kubenswrapper[4973]: I0320 13:38:58.064992 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"569388fdc02a181a3daeae452b70b44a4cffd5104d1b334a2cf160b2176f6b53"} err="failed to get container status \"569388fdc02a181a3daeae452b70b44a4cffd5104d1b334a2cf160b2176f6b53\": rpc error: code = NotFound desc = could not find container \"569388fdc02a181a3daeae452b70b44a4cffd5104d1b334a2cf160b2176f6b53\": container with ID starting with 569388fdc02a181a3daeae452b70b44a4cffd5104d1b334a2cf160b2176f6b53 not found: ID does not 
exist" Mar 20 13:38:59 crc kubenswrapper[4973]: I0320 13:38:59.967247 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a487bfd8-8ca2-44e3-8a45-41137a2c8cd7" path="/var/lib/kubelet/pods/a487bfd8-8ca2-44e3-8a45-41137a2c8cd7/volumes" Mar 20 13:39:00 crc kubenswrapper[4973]: I0320 13:39:00.987441 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pmf2g" event={"ID":"0f84a71d-ca4b-44e1-9008-4a5f34dbaf9d","Type":"ContainerStarted","Data":"45665d374d78a5be705806f8b373cff670fcff1e066196b8be0b33c6af8ee5fe"} Mar 20 13:39:01 crc kubenswrapper[4973]: I0320 13:39:01.006913 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-pmf2g" podStartSLOduration=3.148251218 podStartE2EDuration="9.006896109s" podCreationTimestamp="2026-03-20 13:38:52 +0000 UTC" firstStartedPulling="2026-03-20 13:38:53.975673865 +0000 UTC m=+1054.719343609" lastFinishedPulling="2026-03-20 13:38:59.834318756 +0000 UTC m=+1060.577988500" observedRunningTime="2026-03-20 13:39:01.003426946 +0000 UTC m=+1061.747096700" watchObservedRunningTime="2026-03-20 13:39:01.006896109 +0000 UTC m=+1061.750565853" Mar 20 13:39:03 crc kubenswrapper[4973]: I0320 13:39:03.286250 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-4z4zn" Mar 20 13:39:03 crc kubenswrapper[4973]: I0320 13:39:03.660027 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:39:03 crc kubenswrapper[4973]: I0320 13:39:03.660102 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:39:03 crc kubenswrapper[4973]: I0320 13:39:03.664690 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:39:04 crc kubenswrapper[4973]: 
I0320 13:39:04.019771 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:39:04 crc kubenswrapper[4973]: I0320 13:39:04.087152 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69866dbfb5-64wk8"] Mar 20 13:39:13 crc kubenswrapper[4973]: I0320 13:39:13.933789 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9c8nv" Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.136191 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-69866dbfb5-64wk8" podUID="a13fbd0b-b630-41bc-b997-4aebc4cac884" containerName="console" containerID="cri-o://3bb4d40594ddbc7ad4929b7b7fd18abcef7903fc40cb1aa527e342afd5e9e67a" gracePeriod=15 Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.619972 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69866dbfb5-64wk8_a13fbd0b-b630-41bc-b997-4aebc4cac884/console/0.log" Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.620053 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.716878 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-service-ca\") pod \"a13fbd0b-b630-41bc-b997-4aebc4cac884\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.716990 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a13fbd0b-b630-41bc-b997-4aebc4cac884-console-oauth-config\") pod \"a13fbd0b-b630-41bc-b997-4aebc4cac884\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.717015 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13fbd0b-b630-41bc-b997-4aebc4cac884-console-serving-cert\") pod \"a13fbd0b-b630-41bc-b997-4aebc4cac884\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.717144 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-oauth-serving-cert\") pod \"a13fbd0b-b630-41bc-b997-4aebc4cac884\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.717168 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxvz8\" (UniqueName: \"kubernetes.io/projected/a13fbd0b-b630-41bc-b997-4aebc4cac884-kube-api-access-zxvz8\") pod \"a13fbd0b-b630-41bc-b997-4aebc4cac884\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.717185 4973 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-console-config\") pod \"a13fbd0b-b630-41bc-b997-4aebc4cac884\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.717209 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-trusted-ca-bundle\") pod \"a13fbd0b-b630-41bc-b997-4aebc4cac884\" (UID: \"a13fbd0b-b630-41bc-b997-4aebc4cac884\") " Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.718362 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "a13fbd0b-b630-41bc-b997-4aebc4cac884" (UID: "a13fbd0b-b630-41bc-b997-4aebc4cac884"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.718445 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-service-ca" (OuterVolumeSpecName: "service-ca") pod "a13fbd0b-b630-41bc-b997-4aebc4cac884" (UID: "a13fbd0b-b630-41bc-b997-4aebc4cac884"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.718747 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "a13fbd0b-b630-41bc-b997-4aebc4cac884" (UID: "a13fbd0b-b630-41bc-b997-4aebc4cac884"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.719208 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-console-config" (OuterVolumeSpecName: "console-config") pod "a13fbd0b-b630-41bc-b997-4aebc4cac884" (UID: "a13fbd0b-b630-41bc-b997-4aebc4cac884"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.733676 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a13fbd0b-b630-41bc-b997-4aebc4cac884-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "a13fbd0b-b630-41bc-b997-4aebc4cac884" (UID: "a13fbd0b-b630-41bc-b997-4aebc4cac884"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.733879 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a13fbd0b-b630-41bc-b997-4aebc4cac884-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "a13fbd0b-b630-41bc-b997-4aebc4cac884" (UID: "a13fbd0b-b630-41bc-b997-4aebc4cac884"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.742130 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a13fbd0b-b630-41bc-b997-4aebc4cac884-kube-api-access-zxvz8" (OuterVolumeSpecName: "kube-api-access-zxvz8") pod "a13fbd0b-b630-41bc-b997-4aebc4cac884" (UID: "a13fbd0b-b630-41bc-b997-4aebc4cac884"). InnerVolumeSpecName "kube-api-access-zxvz8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.818772 4973 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13fbd0b-b630-41bc-b997-4aebc4cac884-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.818869 4973 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.818881 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxvz8\" (UniqueName: \"kubernetes.io/projected/a13fbd0b-b630-41bc-b997-4aebc4cac884-kube-api-access-zxvz8\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.818894 4973 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.818904 4973 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.818916 4973 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a13fbd0b-b630-41bc-b997-4aebc4cac884-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:29 crc kubenswrapper[4973]: I0320 13:39:29.818926 4973 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a13fbd0b-b630-41bc-b997-4aebc4cac884-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:30 crc 
kubenswrapper[4973]: I0320 13:39:30.214969 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69866dbfb5-64wk8_a13fbd0b-b630-41bc-b997-4aebc4cac884/console/0.log" Mar 20 13:39:30 crc kubenswrapper[4973]: I0320 13:39:30.215014 4973 generic.go:334] "Generic (PLEG): container finished" podID="a13fbd0b-b630-41bc-b997-4aebc4cac884" containerID="3bb4d40594ddbc7ad4929b7b7fd18abcef7903fc40cb1aa527e342afd5e9e67a" exitCode=2 Mar 20 13:39:30 crc kubenswrapper[4973]: I0320 13:39:30.215043 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69866dbfb5-64wk8" event={"ID":"a13fbd0b-b630-41bc-b997-4aebc4cac884","Type":"ContainerDied","Data":"3bb4d40594ddbc7ad4929b7b7fd18abcef7903fc40cb1aa527e342afd5e9e67a"} Mar 20 13:39:30 crc kubenswrapper[4973]: I0320 13:39:30.215069 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69866dbfb5-64wk8" event={"ID":"a13fbd0b-b630-41bc-b997-4aebc4cac884","Type":"ContainerDied","Data":"ffa9ce6e8bd053c3d7516143434ac206ede2b765935ea7d84fedb33b289c7220"} Mar 20 13:39:30 crc kubenswrapper[4973]: I0320 13:39:30.215086 4973 scope.go:117] "RemoveContainer" containerID="3bb4d40594ddbc7ad4929b7b7fd18abcef7903fc40cb1aa527e342afd5e9e67a" Mar 20 13:39:30 crc kubenswrapper[4973]: I0320 13:39:30.215192 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69866dbfb5-64wk8" Mar 20 13:39:30 crc kubenswrapper[4973]: I0320 13:39:30.242218 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69866dbfb5-64wk8"] Mar 20 13:39:30 crc kubenswrapper[4973]: I0320 13:39:30.245392 4973 scope.go:117] "RemoveContainer" containerID="3bb4d40594ddbc7ad4929b7b7fd18abcef7903fc40cb1aa527e342afd5e9e67a" Mar 20 13:39:30 crc kubenswrapper[4973]: E0320 13:39:30.245830 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bb4d40594ddbc7ad4929b7b7fd18abcef7903fc40cb1aa527e342afd5e9e67a\": container with ID starting with 3bb4d40594ddbc7ad4929b7b7fd18abcef7903fc40cb1aa527e342afd5e9e67a not found: ID does not exist" containerID="3bb4d40594ddbc7ad4929b7b7fd18abcef7903fc40cb1aa527e342afd5e9e67a" Mar 20 13:39:30 crc kubenswrapper[4973]: I0320 13:39:30.245867 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bb4d40594ddbc7ad4929b7b7fd18abcef7903fc40cb1aa527e342afd5e9e67a"} err="failed to get container status \"3bb4d40594ddbc7ad4929b7b7fd18abcef7903fc40cb1aa527e342afd5e9e67a\": rpc error: code = NotFound desc = could not find container \"3bb4d40594ddbc7ad4929b7b7fd18abcef7903fc40cb1aa527e342afd5e9e67a\": container with ID starting with 3bb4d40594ddbc7ad4929b7b7fd18abcef7903fc40cb1aa527e342afd5e9e67a not found: ID does not exist" Mar 20 13:39:30 crc kubenswrapper[4973]: I0320 13:39:30.248604 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-69866dbfb5-64wk8"] Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.158645 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9"] Mar 20 13:39:31 crc kubenswrapper[4973]: E0320 13:39:31.159358 4973 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a13fbd0b-b630-41bc-b997-4aebc4cac884" containerName="console" Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.159376 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a13fbd0b-b630-41bc-b997-4aebc4cac884" containerName="console" Mar 20 13:39:31 crc kubenswrapper[4973]: E0320 13:39:31.159398 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a487bfd8-8ca2-44e3-8a45-41137a2c8cd7" containerName="extract-content" Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.159407 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a487bfd8-8ca2-44e3-8a45-41137a2c8cd7" containerName="extract-content" Mar 20 13:39:31 crc kubenswrapper[4973]: E0320 13:39:31.159427 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a487bfd8-8ca2-44e3-8a45-41137a2c8cd7" containerName="extract-utilities" Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.159437 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a487bfd8-8ca2-44e3-8a45-41137a2c8cd7" containerName="extract-utilities" Mar 20 13:39:31 crc kubenswrapper[4973]: E0320 13:39:31.159465 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a487bfd8-8ca2-44e3-8a45-41137a2c8cd7" containerName="registry-server" Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.159473 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a487bfd8-8ca2-44e3-8a45-41137a2c8cd7" containerName="registry-server" Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.159644 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="a487bfd8-8ca2-44e3-8a45-41137a2c8cd7" containerName="registry-server" Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.159660 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="a13fbd0b-b630-41bc-b997-4aebc4cac884" containerName="console" Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.160957 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9" Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.165567 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9"] Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.169261 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.244818 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jr5q\" (UniqueName: \"kubernetes.io/projected/19b59a52-d780-491f-ab38-270ea519cddc-kube-api-access-6jr5q\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9\" (UID: \"19b59a52-d780-491f-ab38-270ea519cddc\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9" Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.245721 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19b59a52-d780-491f-ab38-270ea519cddc-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9\" (UID: \"19b59a52-d780-491f-ab38-270ea519cddc\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9" Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.245926 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19b59a52-d780-491f-ab38-270ea519cddc-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9\" (UID: \"19b59a52-d780-491f-ab38-270ea519cddc\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9" Mar 20 13:39:31 crc kubenswrapper[4973]: 
I0320 13:39:31.347563 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19b59a52-d780-491f-ab38-270ea519cddc-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9\" (UID: \"19b59a52-d780-491f-ab38-270ea519cddc\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9" Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.347992 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19b59a52-d780-491f-ab38-270ea519cddc-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9\" (UID: \"19b59a52-d780-491f-ab38-270ea519cddc\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9" Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.348099 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19b59a52-d780-491f-ab38-270ea519cddc-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9\" (UID: \"19b59a52-d780-491f-ab38-270ea519cddc\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9" Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.348253 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jr5q\" (UniqueName: \"kubernetes.io/projected/19b59a52-d780-491f-ab38-270ea519cddc-kube-api-access-6jr5q\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9\" (UID: \"19b59a52-d780-491f-ab38-270ea519cddc\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9" Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.348312 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/19b59a52-d780-491f-ab38-270ea519cddc-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9\" (UID: \"19b59a52-d780-491f-ab38-270ea519cddc\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9" Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.368191 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jr5q\" (UniqueName: \"kubernetes.io/projected/19b59a52-d780-491f-ab38-270ea519cddc-kube-api-access-6jr5q\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9\" (UID: \"19b59a52-d780-491f-ab38-270ea519cddc\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9" Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.482172 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9" Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.888506 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9"] Mar 20 13:39:31 crc kubenswrapper[4973]: I0320 13:39:31.959987 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a13fbd0b-b630-41bc-b997-4aebc4cac884" path="/var/lib/kubelet/pods/a13fbd0b-b630-41bc-b997-4aebc4cac884/volumes" Mar 20 13:39:32 crc kubenswrapper[4973]: I0320 13:39:32.242610 4973 generic.go:334] "Generic (PLEG): container finished" podID="19b59a52-d780-491f-ab38-270ea519cddc" containerID="afdc74e3dcaa1b973888cb3f5cd083375958d60064b1da447049ccf958f3cd4e" exitCode=0 Mar 20 13:39:32 crc kubenswrapper[4973]: I0320 13:39:32.242666 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9" 
event={"ID":"19b59a52-d780-491f-ab38-270ea519cddc","Type":"ContainerDied","Data":"afdc74e3dcaa1b973888cb3f5cd083375958d60064b1da447049ccf958f3cd4e"} Mar 20 13:39:32 crc kubenswrapper[4973]: I0320 13:39:32.242701 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9" event={"ID":"19b59a52-d780-491f-ab38-270ea519cddc","Type":"ContainerStarted","Data":"d4fa89d46001b7c53da7cc35a7823e2e2eafdcbdeea136b72c42e475a1176b95"} Mar 20 13:39:34 crc kubenswrapper[4973]: I0320 13:39:34.259402 4973 generic.go:334] "Generic (PLEG): container finished" podID="19b59a52-d780-491f-ab38-270ea519cddc" containerID="0ff966d60445e9635db083f4a282c65acf87ccb6e7f6b5dec46fc94a0b9088c0" exitCode=0 Mar 20 13:39:34 crc kubenswrapper[4973]: I0320 13:39:34.259456 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9" event={"ID":"19b59a52-d780-491f-ab38-270ea519cddc","Type":"ContainerDied","Data":"0ff966d60445e9635db083f4a282c65acf87ccb6e7f6b5dec46fc94a0b9088c0"} Mar 20 13:39:35 crc kubenswrapper[4973]: I0320 13:39:35.268938 4973 generic.go:334] "Generic (PLEG): container finished" podID="19b59a52-d780-491f-ab38-270ea519cddc" containerID="978088cb89fd508d872257093f6c9b4eb1aeb2d06e735cadf587b68f8f256cc0" exitCode=0 Mar 20 13:39:35 crc kubenswrapper[4973]: I0320 13:39:35.269282 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9" event={"ID":"19b59a52-d780-491f-ab38-270ea519cddc","Type":"ContainerDied","Data":"978088cb89fd508d872257093f6c9b4eb1aeb2d06e735cadf587b68f8f256cc0"} Mar 20 13:39:36 crc kubenswrapper[4973]: I0320 13:39:36.566168 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9" Mar 20 13:39:36 crc kubenswrapper[4973]: I0320 13:39:36.663365 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19b59a52-d780-491f-ab38-270ea519cddc-bundle\") pod \"19b59a52-d780-491f-ab38-270ea519cddc\" (UID: \"19b59a52-d780-491f-ab38-270ea519cddc\") " Mar 20 13:39:36 crc kubenswrapper[4973]: I0320 13:39:36.663434 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jr5q\" (UniqueName: \"kubernetes.io/projected/19b59a52-d780-491f-ab38-270ea519cddc-kube-api-access-6jr5q\") pod \"19b59a52-d780-491f-ab38-270ea519cddc\" (UID: \"19b59a52-d780-491f-ab38-270ea519cddc\") " Mar 20 13:39:36 crc kubenswrapper[4973]: I0320 13:39:36.663572 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19b59a52-d780-491f-ab38-270ea519cddc-util\") pod \"19b59a52-d780-491f-ab38-270ea519cddc\" (UID: \"19b59a52-d780-491f-ab38-270ea519cddc\") " Mar 20 13:39:36 crc kubenswrapper[4973]: I0320 13:39:36.664331 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19b59a52-d780-491f-ab38-270ea519cddc-bundle" (OuterVolumeSpecName: "bundle") pod "19b59a52-d780-491f-ab38-270ea519cddc" (UID: "19b59a52-d780-491f-ab38-270ea519cddc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:39:36 crc kubenswrapper[4973]: I0320 13:39:36.672501 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/19b59a52-d780-491f-ab38-270ea519cddc-kube-api-access-6jr5q" (OuterVolumeSpecName: "kube-api-access-6jr5q") pod "19b59a52-d780-491f-ab38-270ea519cddc" (UID: "19b59a52-d780-491f-ab38-270ea519cddc"). InnerVolumeSpecName "kube-api-access-6jr5q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:39:36 crc kubenswrapper[4973]: I0320 13:39:36.689388 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/19b59a52-d780-491f-ab38-270ea519cddc-util" (OuterVolumeSpecName: "util") pod "19b59a52-d780-491f-ab38-270ea519cddc" (UID: "19b59a52-d780-491f-ab38-270ea519cddc"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:39:36 crc kubenswrapper[4973]: I0320 13:39:36.765011 4973 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/19b59a52-d780-491f-ab38-270ea519cddc-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:36 crc kubenswrapper[4973]: I0320 13:39:36.765045 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jr5q\" (UniqueName: \"kubernetes.io/projected/19b59a52-d780-491f-ab38-270ea519cddc-kube-api-access-6jr5q\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:36 crc kubenswrapper[4973]: I0320 13:39:36.765059 4973 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/19b59a52-d780-491f-ab38-270ea519cddc-util\") on node \"crc\" DevicePath \"\"" Mar 20 13:39:37 crc kubenswrapper[4973]: I0320 13:39:37.283543 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9" event={"ID":"19b59a52-d780-491f-ab38-270ea519cddc","Type":"ContainerDied","Data":"d4fa89d46001b7c53da7cc35a7823e2e2eafdcbdeea136b72c42e475a1176b95"} Mar 20 13:39:37 crc kubenswrapper[4973]: I0320 13:39:37.283916 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4fa89d46001b7c53da7cc35a7823e2e2eafdcbdeea136b72c42e475a1176b95" Mar 20 13:39:37 crc kubenswrapper[4973]: I0320 13:39:37.283606 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.571682 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47"] Mar 20 13:39:45 crc kubenswrapper[4973]: E0320 13:39:45.572615 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b59a52-d780-491f-ab38-270ea519cddc" containerName="pull" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.572630 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b59a52-d780-491f-ab38-270ea519cddc" containerName="pull" Mar 20 13:39:45 crc kubenswrapper[4973]: E0320 13:39:45.572640 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b59a52-d780-491f-ab38-270ea519cddc" containerName="extract" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.572647 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b59a52-d780-491f-ab38-270ea519cddc" containerName="extract" Mar 20 13:39:45 crc kubenswrapper[4973]: E0320 13:39:45.572662 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="19b59a52-d780-491f-ab38-270ea519cddc" containerName="util" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.572670 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="19b59a52-d780-491f-ab38-270ea519cddc" containerName="util" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.572821 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="19b59a52-d780-491f-ab38-270ea519cddc" containerName="extract" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.573467 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.576435 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.576573 4973 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.576528 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.577996 4973 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.594326 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47"] Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.596029 4973 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-9r56g" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.624619 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpcwp\" (UniqueName: \"kubernetes.io/projected/487335bd-36f4-42e2-87e1-5acef7226919-kube-api-access-dpcwp\") pod \"metallb-operator-controller-manager-7f9db6bfb5-b8w47\" (UID: \"487335bd-36f4-42e2-87e1-5acef7226919\") " pod="metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.624663 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/487335bd-36f4-42e2-87e1-5acef7226919-apiservice-cert\") pod 
\"metallb-operator-controller-manager-7f9db6bfb5-b8w47\" (UID: \"487335bd-36f4-42e2-87e1-5acef7226919\") " pod="metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.624763 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/487335bd-36f4-42e2-87e1-5acef7226919-webhook-cert\") pod \"metallb-operator-controller-manager-7f9db6bfb5-b8w47\" (UID: \"487335bd-36f4-42e2-87e1-5acef7226919\") " pod="metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.726199 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/487335bd-36f4-42e2-87e1-5acef7226919-webhook-cert\") pod \"metallb-operator-controller-manager-7f9db6bfb5-b8w47\" (UID: \"487335bd-36f4-42e2-87e1-5acef7226919\") " pod="metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.726283 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpcwp\" (UniqueName: \"kubernetes.io/projected/487335bd-36f4-42e2-87e1-5acef7226919-kube-api-access-dpcwp\") pod \"metallb-operator-controller-manager-7f9db6bfb5-b8w47\" (UID: \"487335bd-36f4-42e2-87e1-5acef7226919\") " pod="metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.726318 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/487335bd-36f4-42e2-87e1-5acef7226919-apiservice-cert\") pod \"metallb-operator-controller-manager-7f9db6bfb5-b8w47\" (UID: \"487335bd-36f4-42e2-87e1-5acef7226919\") " pod="metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47" Mar 20 13:39:45 crc 
kubenswrapper[4973]: I0320 13:39:45.734331 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/487335bd-36f4-42e2-87e1-5acef7226919-apiservice-cert\") pod \"metallb-operator-controller-manager-7f9db6bfb5-b8w47\" (UID: \"487335bd-36f4-42e2-87e1-5acef7226919\") " pod="metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.740963 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/487335bd-36f4-42e2-87e1-5acef7226919-webhook-cert\") pod \"metallb-operator-controller-manager-7f9db6bfb5-b8w47\" (UID: \"487335bd-36f4-42e2-87e1-5acef7226919\") " pod="metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.755197 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpcwp\" (UniqueName: \"kubernetes.io/projected/487335bd-36f4-42e2-87e1-5acef7226919-kube-api-access-dpcwp\") pod \"metallb-operator-controller-manager-7f9db6bfb5-b8w47\" (UID: \"487335bd-36f4-42e2-87e1-5acef7226919\") " pod="metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.828037 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r"] Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.829838 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.831815 4973 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.831876 4973 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.831927 4973 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-nh99k" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.844796 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r"] Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.893552 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.929956 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d74cf88e-0824-45f2-92ff-3798ad77f943-apiservice-cert\") pod \"metallb-operator-webhook-server-68c6dd9858-4mw5r\" (UID: \"d74cf88e-0824-45f2-92ff-3798ad77f943\") " pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" Mar 20 13:39:45 crc kubenswrapper[4973]: I0320 13:39:45.930156 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjcnp\" (UniqueName: \"kubernetes.io/projected/d74cf88e-0824-45f2-92ff-3798ad77f943-kube-api-access-qjcnp\") pod \"metallb-operator-webhook-server-68c6dd9858-4mw5r\" (UID: \"d74cf88e-0824-45f2-92ff-3798ad77f943\") " pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" Mar 20 13:39:45 crc kubenswrapper[4973]: 
I0320 13:39:45.930356 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d74cf88e-0824-45f2-92ff-3798ad77f943-webhook-cert\") pod \"metallb-operator-webhook-server-68c6dd9858-4mw5r\" (UID: \"d74cf88e-0824-45f2-92ff-3798ad77f943\") " pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" Mar 20 13:39:46 crc kubenswrapper[4973]: I0320 13:39:46.032464 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d74cf88e-0824-45f2-92ff-3798ad77f943-apiservice-cert\") pod \"metallb-operator-webhook-server-68c6dd9858-4mw5r\" (UID: \"d74cf88e-0824-45f2-92ff-3798ad77f943\") " pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" Mar 20 13:39:46 crc kubenswrapper[4973]: I0320 13:39:46.032520 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qjcnp\" (UniqueName: \"kubernetes.io/projected/d74cf88e-0824-45f2-92ff-3798ad77f943-kube-api-access-qjcnp\") pod \"metallb-operator-webhook-server-68c6dd9858-4mw5r\" (UID: \"d74cf88e-0824-45f2-92ff-3798ad77f943\") " pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" Mar 20 13:39:46 crc kubenswrapper[4973]: I0320 13:39:46.032570 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d74cf88e-0824-45f2-92ff-3798ad77f943-webhook-cert\") pod \"metallb-operator-webhook-server-68c6dd9858-4mw5r\" (UID: \"d74cf88e-0824-45f2-92ff-3798ad77f943\") " pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" Mar 20 13:39:46 crc kubenswrapper[4973]: I0320 13:39:46.039243 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d74cf88e-0824-45f2-92ff-3798ad77f943-webhook-cert\") pod 
\"metallb-operator-webhook-server-68c6dd9858-4mw5r\" (UID: \"d74cf88e-0824-45f2-92ff-3798ad77f943\") " pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" Mar 20 13:39:46 crc kubenswrapper[4973]: I0320 13:39:46.046410 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d74cf88e-0824-45f2-92ff-3798ad77f943-apiservice-cert\") pod \"metallb-operator-webhook-server-68c6dd9858-4mw5r\" (UID: \"d74cf88e-0824-45f2-92ff-3798ad77f943\") " pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" Mar 20 13:39:46 crc kubenswrapper[4973]: I0320 13:39:46.057045 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjcnp\" (UniqueName: \"kubernetes.io/projected/d74cf88e-0824-45f2-92ff-3798ad77f943-kube-api-access-qjcnp\") pod \"metallb-operator-webhook-server-68c6dd9858-4mw5r\" (UID: \"d74cf88e-0824-45f2-92ff-3798ad77f943\") " pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" Mar 20 13:39:46 crc kubenswrapper[4973]: I0320 13:39:46.151180 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" Mar 20 13:39:46 crc kubenswrapper[4973]: I0320 13:39:46.454016 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47"] Mar 20 13:39:46 crc kubenswrapper[4973]: I0320 13:39:46.568727 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r"] Mar 20 13:39:46 crc kubenswrapper[4973]: W0320 13:39:46.583860 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd74cf88e_0824_45f2_92ff_3798ad77f943.slice/crio-5271b060a9eb0bd3ac366f9042149ccd33b16dac89783c84a1dc1562edf89420 WatchSource:0}: Error finding container 5271b060a9eb0bd3ac366f9042149ccd33b16dac89783c84a1dc1562edf89420: Status 404 returned error can't find the container with id 5271b060a9eb0bd3ac366f9042149ccd33b16dac89783c84a1dc1562edf89420 Mar 20 13:39:47 crc kubenswrapper[4973]: I0320 13:39:47.371928 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47" event={"ID":"487335bd-36f4-42e2-87e1-5acef7226919","Type":"ContainerStarted","Data":"d03cf3d77039d26452b0cee32ebd8c5d57e1977b0d2853188052f152f306f12e"} Mar 20 13:39:47 crc kubenswrapper[4973]: I0320 13:39:47.373001 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" event={"ID":"d74cf88e-0824-45f2-92ff-3798ad77f943","Type":"ContainerStarted","Data":"5271b060a9eb0bd3ac366f9042149ccd33b16dac89783c84a1dc1562edf89420"} Mar 20 13:39:53 crc kubenswrapper[4973]: I0320 13:39:53.423187 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" 
event={"ID":"d74cf88e-0824-45f2-92ff-3798ad77f943","Type":"ContainerStarted","Data":"06762ebb55337251c2176c98fc908dfe85ebf019d46b4dc4000dbf43275e0070"} Mar 20 13:39:53 crc kubenswrapper[4973]: I0320 13:39:53.423737 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" Mar 20 13:39:53 crc kubenswrapper[4973]: I0320 13:39:53.425829 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47" event={"ID":"487335bd-36f4-42e2-87e1-5acef7226919","Type":"ContainerStarted","Data":"469f9087a0882e289bf733e5ab6e0de0f542150cf8caef4fda3edc554d5b5e8c"} Mar 20 13:39:53 crc kubenswrapper[4973]: I0320 13:39:53.426070 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47" Mar 20 13:39:53 crc kubenswrapper[4973]: I0320 13:39:53.447203 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" podStartSLOduration=2.351136773 podStartE2EDuration="8.447183213s" podCreationTimestamp="2026-03-20 13:39:45 +0000 UTC" firstStartedPulling="2026-03-20 13:39:46.586181848 +0000 UTC m=+1107.329851592" lastFinishedPulling="2026-03-20 13:39:52.682228278 +0000 UTC m=+1113.425898032" observedRunningTime="2026-03-20 13:39:53.439086421 +0000 UTC m=+1114.182756165" watchObservedRunningTime="2026-03-20 13:39:53.447183213 +0000 UTC m=+1114.190852957" Mar 20 13:39:53 crc kubenswrapper[4973]: I0320 13:39:53.471769 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47" podStartSLOduration=2.303358745 podStartE2EDuration="8.471751447s" podCreationTimestamp="2026-03-20 13:39:45 +0000 UTC" firstStartedPulling="2026-03-20 13:39:46.495493134 +0000 UTC m=+1107.239162878" lastFinishedPulling="2026-03-20 
13:39:52.663885836 +0000 UTC m=+1113.407555580" observedRunningTime="2026-03-20 13:39:53.464463497 +0000 UTC m=+1114.208133261" watchObservedRunningTime="2026-03-20 13:39:53.471751447 +0000 UTC m=+1114.215421191" Mar 20 13:40:00 crc kubenswrapper[4973]: I0320 13:40:00.125823 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566900-mxl6j"] Mar 20 13:40:00 crc kubenswrapper[4973]: I0320 13:40:00.127636 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-mxl6j" Mar 20 13:40:00 crc kubenswrapper[4973]: I0320 13:40:00.129601 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:40:00 crc kubenswrapper[4973]: I0320 13:40:00.130215 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:40:00 crc kubenswrapper[4973]: I0320 13:40:00.130435 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 13:40:00 crc kubenswrapper[4973]: I0320 13:40:00.133825 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-mxl6j"] Mar 20 13:40:00 crc kubenswrapper[4973]: I0320 13:40:00.212288 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwcfn\" (UniqueName: \"kubernetes.io/projected/803bba01-75c9-4c14-80c0-0da407ad672d-kube-api-access-xwcfn\") pod \"auto-csr-approver-29566900-mxl6j\" (UID: \"803bba01-75c9-4c14-80c0-0da407ad672d\") " pod="openshift-infra/auto-csr-approver-29566900-mxl6j" Mar 20 13:40:00 crc kubenswrapper[4973]: I0320 13:40:00.314143 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwcfn\" (UniqueName: \"kubernetes.io/projected/803bba01-75c9-4c14-80c0-0da407ad672d-kube-api-access-xwcfn\") pod 
\"auto-csr-approver-29566900-mxl6j\" (UID: \"803bba01-75c9-4c14-80c0-0da407ad672d\") " pod="openshift-infra/auto-csr-approver-29566900-mxl6j" Mar 20 13:40:00 crc kubenswrapper[4973]: I0320 13:40:00.336370 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwcfn\" (UniqueName: \"kubernetes.io/projected/803bba01-75c9-4c14-80c0-0da407ad672d-kube-api-access-xwcfn\") pod \"auto-csr-approver-29566900-mxl6j\" (UID: \"803bba01-75c9-4c14-80c0-0da407ad672d\") " pod="openshift-infra/auto-csr-approver-29566900-mxl6j" Mar 20 13:40:00 crc kubenswrapper[4973]: I0320 13:40:00.463758 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-mxl6j" Mar 20 13:40:00 crc kubenswrapper[4973]: I0320 13:40:00.980198 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-mxl6j"] Mar 20 13:40:00 crc kubenswrapper[4973]: W0320 13:40:00.986586 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod803bba01_75c9_4c14_80c0_0da407ad672d.slice/crio-171effcc9e49fa8504a0da147d4468c790c44fd78d973c548879ffb92bcd3849 WatchSource:0}: Error finding container 171effcc9e49fa8504a0da147d4468c790c44fd78d973c548879ffb92bcd3849: Status 404 returned error can't find the container with id 171effcc9e49fa8504a0da147d4468c790c44fd78d973c548879ffb92bcd3849 Mar 20 13:40:00 crc kubenswrapper[4973]: I0320 13:40:00.989943 4973 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:40:01 crc kubenswrapper[4973]: I0320 13:40:01.475730 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566900-mxl6j" event={"ID":"803bba01-75c9-4c14-80c0-0da407ad672d","Type":"ContainerStarted","Data":"171effcc9e49fa8504a0da147d4468c790c44fd78d973c548879ffb92bcd3849"} Mar 20 13:40:03 crc kubenswrapper[4973]: I0320 
13:40:03.490757 4973 generic.go:334] "Generic (PLEG): container finished" podID="803bba01-75c9-4c14-80c0-0da407ad672d" containerID="e83600963fce5240b7bf8aef16bc7dfa70c017aa2cb389f72dff2641d37d5b18" exitCode=0 Mar 20 13:40:03 crc kubenswrapper[4973]: I0320 13:40:03.490971 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566900-mxl6j" event={"ID":"803bba01-75c9-4c14-80c0-0da407ad672d","Type":"ContainerDied","Data":"e83600963fce5240b7bf8aef16bc7dfa70c017aa2cb389f72dff2641d37d5b18"} Mar 20 13:40:05 crc kubenswrapper[4973]: I0320 13:40:05.022244 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-mxl6j" Mar 20 13:40:05 crc kubenswrapper[4973]: I0320 13:40:05.085537 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwcfn\" (UniqueName: \"kubernetes.io/projected/803bba01-75c9-4c14-80c0-0da407ad672d-kube-api-access-xwcfn\") pod \"803bba01-75c9-4c14-80c0-0da407ad672d\" (UID: \"803bba01-75c9-4c14-80c0-0da407ad672d\") " Mar 20 13:40:05 crc kubenswrapper[4973]: I0320 13:40:05.091161 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/803bba01-75c9-4c14-80c0-0da407ad672d-kube-api-access-xwcfn" (OuterVolumeSpecName: "kube-api-access-xwcfn") pod "803bba01-75c9-4c14-80c0-0da407ad672d" (UID: "803bba01-75c9-4c14-80c0-0da407ad672d"). InnerVolumeSpecName "kube-api-access-xwcfn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:05 crc kubenswrapper[4973]: I0320 13:40:05.188105 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwcfn\" (UniqueName: \"kubernetes.io/projected/803bba01-75c9-4c14-80c0-0da407ad672d-kube-api-access-xwcfn\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:05 crc kubenswrapper[4973]: I0320 13:40:05.511386 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566900-mxl6j" event={"ID":"803bba01-75c9-4c14-80c0-0da407ad672d","Type":"ContainerDied","Data":"171effcc9e49fa8504a0da147d4468c790c44fd78d973c548879ffb92bcd3849"} Mar 20 13:40:05 crc kubenswrapper[4973]: I0320 13:40:05.511424 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="171effcc9e49fa8504a0da147d4468c790c44fd78d973c548879ffb92bcd3849" Mar 20 13:40:05 crc kubenswrapper[4973]: I0320 13:40:05.511454 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566900-mxl6j" Mar 20 13:40:06 crc kubenswrapper[4973]: I0320 13:40:06.082506 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-d67xc"] Mar 20 13:40:06 crc kubenswrapper[4973]: I0320 13:40:06.090278 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566894-d67xc"] Mar 20 13:40:06 crc kubenswrapper[4973]: I0320 13:40:06.156680 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" Mar 20 13:40:07 crc kubenswrapper[4973]: I0320 13:40:07.959107 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7634907-5a5c-4483-be73-3b057ec837ad" path="/var/lib/kubelet/pods/a7634907-5a5c-4483-be73-3b057ec837ad/volumes" Mar 20 13:40:25 crc kubenswrapper[4973]: I0320 13:40:25.895561 4973 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.567472 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-cgmjd"] Mar 20 13:40:26 crc kubenswrapper[4973]: E0320 13:40:26.567774 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="803bba01-75c9-4c14-80c0-0da407ad672d" containerName="oc" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.567791 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="803bba01-75c9-4c14-80c0-0da407ad672d" containerName="oc" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.567934 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="803bba01-75c9-4c14-80c0-0da407ad672d" containerName="oc" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.570817 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.572522 4973 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.573633 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.574005 4973 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-vlw2n" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.575372 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjhq2"] Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.576549 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjhq2" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.577948 4973 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.587457 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjhq2"] Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.640255 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-frr-conf\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.640666 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg225\" (UniqueName: \"kubernetes.io/projected/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-kube-api-access-xg225\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.640734 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-metrics-certs\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.640759 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-frr-sockets\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc 
kubenswrapper[4973]: I0320 13:40:26.640826 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-frr-startup\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.640861 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-metrics\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.640890 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-58mqk\" (UniqueName: \"kubernetes.io/projected/1221336e-652c-45b4-bd66-43e96cf2c643-kube-api-access-58mqk\") pod \"frr-k8s-webhook-server-bcc4b6f68-rjhq2\" (UID: \"1221336e-652c-45b4-bd66-43e96cf2c643\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjhq2" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.640916 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-reloader\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.640941 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1221336e-652c-45b4-bd66-43e96cf2c643-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-rjhq2\" (UID: \"1221336e-652c-45b4-bd66-43e96cf2c643\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjhq2" Mar 20 13:40:26 crc kubenswrapper[4973]: 
I0320 13:40:26.679747 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-5npj7"] Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.684810 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-5npj7" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.686726 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.686727 4973 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.687143 4973 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.687145 4973 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-98rt8" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.697294 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-k2z22"] Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.698534 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-k2z22" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.699854 4973 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.726966 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-k2z22"] Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.741878 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-frr-sockets\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.741962 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-frr-startup\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.741994 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-metrics\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.742031 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-58mqk\" (UniqueName: \"kubernetes.io/projected/1221336e-652c-45b4-bd66-43e96cf2c643-kube-api-access-58mqk\") pod \"frr-k8s-webhook-server-bcc4b6f68-rjhq2\" (UID: \"1221336e-652c-45b4-bd66-43e96cf2c643\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjhq2" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.742068 4973 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/549c9d4d-d8f5-441e-9e2b-faa4bc5bd589-cert\") pod \"controller-7bb4cc7c98-k2z22\" (UID: \"549c9d4d-d8f5-441e-9e2b-faa4bc5bd589\") " pod="metallb-system/controller-7bb4cc7c98-k2z22" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.742092 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-reloader\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.742115 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2d4aa91f-29e1-4129-b67e-493c83865a51-memberlist\") pod \"speaker-5npj7\" (UID: \"2d4aa91f-29e1-4129-b67e-493c83865a51\") " pod="metallb-system/speaker-5npj7" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.742145 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1221336e-652c-45b4-bd66-43e96cf2c643-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-rjhq2\" (UID: \"1221336e-652c-45b4-bd66-43e96cf2c643\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjhq2" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.742183 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z6qr\" (UniqueName: \"kubernetes.io/projected/549c9d4d-d8f5-441e-9e2b-faa4bc5bd589-kube-api-access-2z6qr\") pod \"controller-7bb4cc7c98-k2z22\" (UID: \"549c9d4d-d8f5-441e-9e2b-faa4bc5bd589\") " pod="metallb-system/controller-7bb4cc7c98-k2z22" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.742214 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-frr-conf\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.742271 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg225\" (UniqueName: \"kubernetes.io/projected/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-kube-api-access-xg225\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.742307 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2d4aa91f-29e1-4129-b67e-493c83865a51-metallb-excludel2\") pod \"speaker-5npj7\" (UID: \"2d4aa91f-29e1-4129-b67e-493c83865a51\") " pod="metallb-system/speaker-5npj7" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.742358 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/549c9d4d-d8f5-441e-9e2b-faa4bc5bd589-metrics-certs\") pod \"controller-7bb4cc7c98-k2z22\" (UID: \"549c9d4d-d8f5-441e-9e2b-faa4bc5bd589\") " pod="metallb-system/controller-7bb4cc7c98-k2z22" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.742383 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2d4aa91f-29e1-4129-b67e-493c83865a51-metrics-certs\") pod \"speaker-5npj7\" (UID: \"2d4aa91f-29e1-4129-b67e-493c83865a51\") " pod="metallb-system/speaker-5npj7" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.742412 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wtmq\" (UniqueName: 
\"kubernetes.io/projected/2d4aa91f-29e1-4129-b67e-493c83865a51-kube-api-access-9wtmq\") pod \"speaker-5npj7\" (UID: \"2d4aa91f-29e1-4129-b67e-493c83865a51\") " pod="metallb-system/speaker-5npj7" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.742437 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-metrics-certs\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.742507 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-frr-sockets\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.742877 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-metrics\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.743117 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-frr-startup\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.743469 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-reloader\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 
13:40:26.743687 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-frr-conf\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.752190 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-metrics-certs\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.756947 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1221336e-652c-45b4-bd66-43e96cf2c643-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-rjhq2\" (UID: \"1221336e-652c-45b4-bd66-43e96cf2c643\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjhq2" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.760294 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg225\" (UniqueName: \"kubernetes.io/projected/75fc4720-ae9c-4ae5-8e4c-7c9a800f5478-kube-api-access-xg225\") pod \"frr-k8s-cgmjd\" (UID: \"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478\") " pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.763008 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-58mqk\" (UniqueName: \"kubernetes.io/projected/1221336e-652c-45b4-bd66-43e96cf2c643-kube-api-access-58mqk\") pod \"frr-k8s-webhook-server-bcc4b6f68-rjhq2\" (UID: \"1221336e-652c-45b4-bd66-43e96cf2c643\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjhq2" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.843557 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/549c9d4d-d8f5-441e-9e2b-faa4bc5bd589-cert\") pod \"controller-7bb4cc7c98-k2z22\" (UID: \"549c9d4d-d8f5-441e-9e2b-faa4bc5bd589\") " pod="metallb-system/controller-7bb4cc7c98-k2z22" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.843600 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2d4aa91f-29e1-4129-b67e-493c83865a51-memberlist\") pod \"speaker-5npj7\" (UID: \"2d4aa91f-29e1-4129-b67e-493c83865a51\") " pod="metallb-system/speaker-5npj7" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.843639 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2z6qr\" (UniqueName: \"kubernetes.io/projected/549c9d4d-d8f5-441e-9e2b-faa4bc5bd589-kube-api-access-2z6qr\") pod \"controller-7bb4cc7c98-k2z22\" (UID: \"549c9d4d-d8f5-441e-9e2b-faa4bc5bd589\") " pod="metallb-system/controller-7bb4cc7c98-k2z22" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.843687 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2d4aa91f-29e1-4129-b67e-493c83865a51-metallb-excludel2\") pod \"speaker-5npj7\" (UID: \"2d4aa91f-29e1-4129-b67e-493c83865a51\") " pod="metallb-system/speaker-5npj7" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.843713 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/549c9d4d-d8f5-441e-9e2b-faa4bc5bd589-metrics-certs\") pod \"controller-7bb4cc7c98-k2z22\" (UID: \"549c9d4d-d8f5-441e-9e2b-faa4bc5bd589\") " pod="metallb-system/controller-7bb4cc7c98-k2z22" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.843729 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2d4aa91f-29e1-4129-b67e-493c83865a51-metrics-certs\") pod \"speaker-5npj7\" (UID: 
\"2d4aa91f-29e1-4129-b67e-493c83865a51\") " pod="metallb-system/speaker-5npj7" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.843748 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wtmq\" (UniqueName: \"kubernetes.io/projected/2d4aa91f-29e1-4129-b67e-493c83865a51-kube-api-access-9wtmq\") pod \"speaker-5npj7\" (UID: \"2d4aa91f-29e1-4129-b67e-493c83865a51\") " pod="metallb-system/speaker-5npj7" Mar 20 13:40:26 crc kubenswrapper[4973]: E0320 13:40:26.843793 4973 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 13:40:26 crc kubenswrapper[4973]: E0320 13:40:26.843872 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d4aa91f-29e1-4129-b67e-493c83865a51-memberlist podName:2d4aa91f-29e1-4129-b67e-493c83865a51 nodeName:}" failed. No retries permitted until 2026-03-20 13:40:27.343852842 +0000 UTC m=+1148.087522586 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2d4aa91f-29e1-4129-b67e-493c83865a51-memberlist") pod "speaker-5npj7" (UID: "2d4aa91f-29e1-4129-b67e-493c83865a51") : secret "metallb-memberlist" not found Mar 20 13:40:26 crc kubenswrapper[4973]: E0320 13:40:26.844237 4973 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 20 13:40:26 crc kubenswrapper[4973]: E0320 13:40:26.844299 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d4aa91f-29e1-4129-b67e-493c83865a51-metrics-certs podName:2d4aa91f-29e1-4129-b67e-493c83865a51 nodeName:}" failed. No retries permitted until 2026-03-20 13:40:27.344278894 +0000 UTC m=+1148.087948728 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2d4aa91f-29e1-4129-b67e-493c83865a51-metrics-certs") pod "speaker-5npj7" (UID: "2d4aa91f-29e1-4129-b67e-493c83865a51") : secret "speaker-certs-secret" not found Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.844733 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/2d4aa91f-29e1-4129-b67e-493c83865a51-metallb-excludel2\") pod \"speaker-5npj7\" (UID: \"2d4aa91f-29e1-4129-b67e-493c83865a51\") " pod="metallb-system/speaker-5npj7" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.845504 4973 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.847283 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/549c9d4d-d8f5-441e-9e2b-faa4bc5bd589-metrics-certs\") pod \"controller-7bb4cc7c98-k2z22\" (UID: \"549c9d4d-d8f5-441e-9e2b-faa4bc5bd589\") " pod="metallb-system/controller-7bb4cc7c98-k2z22" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.859403 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/549c9d4d-d8f5-441e-9e2b-faa4bc5bd589-cert\") pod \"controller-7bb4cc7c98-k2z22\" (UID: \"549c9d4d-d8f5-441e-9e2b-faa4bc5bd589\") " pod="metallb-system/controller-7bb4cc7c98-k2z22" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.862324 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z6qr\" (UniqueName: \"kubernetes.io/projected/549c9d4d-d8f5-441e-9e2b-faa4bc5bd589-kube-api-access-2z6qr\") pod \"controller-7bb4cc7c98-k2z22\" (UID: \"549c9d4d-d8f5-441e-9e2b-faa4bc5bd589\") " pod="metallb-system/controller-7bb4cc7c98-k2z22" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.862834 4973 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wtmq\" (UniqueName: \"kubernetes.io/projected/2d4aa91f-29e1-4129-b67e-493c83865a51-kube-api-access-9wtmq\") pod \"speaker-5npj7\" (UID: \"2d4aa91f-29e1-4129-b67e-493c83865a51\") " pod="metallb-system/speaker-5npj7" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.891585 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:26 crc kubenswrapper[4973]: I0320 13:40:26.902056 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjhq2" Mar 20 13:40:27 crc kubenswrapper[4973]: I0320 13:40:27.019931 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-k2z22" Mar 20 13:40:27 crc kubenswrapper[4973]: I0320 13:40:27.350067 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2d4aa91f-29e1-4129-b67e-493c83865a51-metrics-certs\") pod \"speaker-5npj7\" (UID: \"2d4aa91f-29e1-4129-b67e-493c83865a51\") " pod="metallb-system/speaker-5npj7" Mar 20 13:40:27 crc kubenswrapper[4973]: I0320 13:40:27.350158 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2d4aa91f-29e1-4129-b67e-493c83865a51-memberlist\") pod \"speaker-5npj7\" (UID: \"2d4aa91f-29e1-4129-b67e-493c83865a51\") " pod="metallb-system/speaker-5npj7" Mar 20 13:40:27 crc kubenswrapper[4973]: E0320 13:40:27.350273 4973 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 13:40:27 crc kubenswrapper[4973]: E0320 13:40:27.350316 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d4aa91f-29e1-4129-b67e-493c83865a51-memberlist podName:2d4aa91f-29e1-4129-b67e-493c83865a51 
nodeName:}" failed. No retries permitted until 2026-03-20 13:40:28.350303726 +0000 UTC m=+1149.093973470 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/2d4aa91f-29e1-4129-b67e-493c83865a51-memberlist") pod "speaker-5npj7" (UID: "2d4aa91f-29e1-4129-b67e-493c83865a51") : secret "metallb-memberlist" not found Mar 20 13:40:27 crc kubenswrapper[4973]: I0320 13:40:27.356765 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2d4aa91f-29e1-4129-b67e-493c83865a51-metrics-certs\") pod \"speaker-5npj7\" (UID: \"2d4aa91f-29e1-4129-b67e-493c83865a51\") " pod="metallb-system/speaker-5npj7" Mar 20 13:40:27 crc kubenswrapper[4973]: I0320 13:40:27.470668 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjhq2"] Mar 20 13:40:27 crc kubenswrapper[4973]: I0320 13:40:27.543105 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-k2z22"] Mar 20 13:40:27 crc kubenswrapper[4973]: W0320 13:40:27.545011 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod549c9d4d_d8f5_441e_9e2b_faa4bc5bd589.slice/crio-3e9fc6d2b98eed91b9747649615feafa18f779a3c53acaa6cd537c258c609d6b WatchSource:0}: Error finding container 3e9fc6d2b98eed91b9747649615feafa18f779a3c53acaa6cd537c258c609d6b: Status 404 returned error can't find the container with id 3e9fc6d2b98eed91b9747649615feafa18f779a3c53acaa6cd537c258c609d6b Mar 20 13:40:27 crc kubenswrapper[4973]: I0320 13:40:27.658251 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-k2z22" event={"ID":"549c9d4d-d8f5-441e-9e2b-faa4bc5bd589","Type":"ContainerStarted","Data":"3e9fc6d2b98eed91b9747649615feafa18f779a3c53acaa6cd537c258c609d6b"} Mar 20 13:40:27 crc kubenswrapper[4973]: I0320 13:40:27.661516 
4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjhq2" event={"ID":"1221336e-652c-45b4-bd66-43e96cf2c643","Type":"ContainerStarted","Data":"b0d8fa9d0c2bd260edcd7b10428965a9b0fa5e1d86f7c65f5245642ee8b5e73f"} Mar 20 13:40:27 crc kubenswrapper[4973]: I0320 13:40:27.662928 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgmjd" event={"ID":"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478","Type":"ContainerStarted","Data":"6160c539726d5df371d3afc455b77c4640acb456bdc7f5f82852f63ba6e920a7"} Mar 20 13:40:28 crc kubenswrapper[4973]: I0320 13:40:28.364738 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2d4aa91f-29e1-4129-b67e-493c83865a51-memberlist\") pod \"speaker-5npj7\" (UID: \"2d4aa91f-29e1-4129-b67e-493c83865a51\") " pod="metallb-system/speaker-5npj7" Mar 20 13:40:28 crc kubenswrapper[4973]: I0320 13:40:28.378132 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/2d4aa91f-29e1-4129-b67e-493c83865a51-memberlist\") pod \"speaker-5npj7\" (UID: \"2d4aa91f-29e1-4129-b67e-493c83865a51\") " pod="metallb-system/speaker-5npj7" Mar 20 13:40:28 crc kubenswrapper[4973]: I0320 13:40:28.499809 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-5npj7" Mar 20 13:40:28 crc kubenswrapper[4973]: I0320 13:40:28.672115 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-k2z22" event={"ID":"549c9d4d-d8f5-441e-9e2b-faa4bc5bd589","Type":"ContainerStarted","Data":"353d48b610fde95c11b4fc4067f5f4e65e82055955427a0857e59cf19a7398b5"} Mar 20 13:40:28 crc kubenswrapper[4973]: I0320 13:40:28.672677 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-k2z22" Mar 20 13:40:28 crc kubenswrapper[4973]: I0320 13:40:28.672696 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-k2z22" event={"ID":"549c9d4d-d8f5-441e-9e2b-faa4bc5bd589","Type":"ContainerStarted","Data":"c5bf28c0652119437f1938409db6e60bf54195179b17645a144583423b11c87e"} Mar 20 13:40:28 crc kubenswrapper[4973]: I0320 13:40:28.677283 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5npj7" event={"ID":"2d4aa91f-29e1-4129-b67e-493c83865a51","Type":"ContainerStarted","Data":"cce1a977ef5db134cc6f4d8f941ac0bdde023e102fc1d538634377190d2933cc"} Mar 20 13:40:28 crc kubenswrapper[4973]: I0320 13:40:28.703794 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-k2z22" podStartSLOduration=2.703775884 podStartE2EDuration="2.703775884s" podCreationTimestamp="2026-03-20 13:40:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:40:28.70216404 +0000 UTC m=+1149.445833784" watchObservedRunningTime="2026-03-20 13:40:28.703775884 +0000 UTC m=+1149.447445628" Mar 20 13:40:29 crc kubenswrapper[4973]: I0320 13:40:29.694924 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5npj7" 
event={"ID":"2d4aa91f-29e1-4129-b67e-493c83865a51","Type":"ContainerStarted","Data":"2d807f6ef625bdd6aaf1b0a66a62e5c00250ec6c29b5d00576e8e2c4cdadf175"} Mar 20 13:40:29 crc kubenswrapper[4973]: I0320 13:40:29.695221 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-5npj7" event={"ID":"2d4aa91f-29e1-4129-b67e-493c83865a51","Type":"ContainerStarted","Data":"e44b0051e51ea1f58b5c6f0a86b98c834f17a2ffeeb607d0954b207e1aa756b4"} Mar 20 13:40:29 crc kubenswrapper[4973]: I0320 13:40:29.719922 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-5npj7" podStartSLOduration=3.719903681 podStartE2EDuration="3.719903681s" podCreationTimestamp="2026-03-20 13:40:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:40:29.714633076 +0000 UTC m=+1150.458302830" watchObservedRunningTime="2026-03-20 13:40:29.719903681 +0000 UTC m=+1150.463573425" Mar 20 13:40:30 crc kubenswrapper[4973]: I0320 13:40:30.701233 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-5npj7" Mar 20 13:40:36 crc kubenswrapper[4973]: I0320 13:40:36.756937 4973 generic.go:334] "Generic (PLEG): container finished" podID="75fc4720-ae9c-4ae5-8e4c-7c9a800f5478" containerID="19936c838044786cce6e4ce7d28d83a65a7352b67f39009dc3080eba70c2ab54" exitCode=0 Mar 20 13:40:36 crc kubenswrapper[4973]: I0320 13:40:36.756988 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgmjd" event={"ID":"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478","Type":"ContainerDied","Data":"19936c838044786cce6e4ce7d28d83a65a7352b67f39009dc3080eba70c2ab54"} Mar 20 13:40:36 crc kubenswrapper[4973]: I0320 13:40:36.767226 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjhq2" 
event={"ID":"1221336e-652c-45b4-bd66-43e96cf2c643","Type":"ContainerStarted","Data":"35523775a8511fc6d56c1c8f9899beabafa74989050232a9ca3111d5d8881d61"} Mar 20 13:40:36 crc kubenswrapper[4973]: I0320 13:40:36.767459 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjhq2" Mar 20 13:40:36 crc kubenswrapper[4973]: I0320 13:40:36.809769 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjhq2" podStartSLOduration=2.413672208 podStartE2EDuration="10.809747246s" podCreationTimestamp="2026-03-20 13:40:26 +0000 UTC" firstStartedPulling="2026-03-20 13:40:27.472885765 +0000 UTC m=+1148.216555519" lastFinishedPulling="2026-03-20 13:40:35.868960813 +0000 UTC m=+1156.612630557" observedRunningTime="2026-03-20 13:40:36.803975368 +0000 UTC m=+1157.547645132" watchObservedRunningTime="2026-03-20 13:40:36.809747246 +0000 UTC m=+1157.553416990" Mar 20 13:40:37 crc kubenswrapper[4973]: I0320 13:40:37.024406 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-k2z22" Mar 20 13:40:37 crc kubenswrapper[4973]: I0320 13:40:37.027843 4973 scope.go:117] "RemoveContainer" containerID="db323a7526b36b7015171f6383383884f9410ee0c819b4f4fde67e961295e664" Mar 20 13:40:37 crc kubenswrapper[4973]: I0320 13:40:37.783691 4973 generic.go:334] "Generic (PLEG): container finished" podID="75fc4720-ae9c-4ae5-8e4c-7c9a800f5478" containerID="b934faf8ddab2cd70ad271508972eaffc8bb76448cb2a97a5addc8d6a08921d9" exitCode=0 Mar 20 13:40:37 crc kubenswrapper[4973]: I0320 13:40:37.785097 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgmjd" event={"ID":"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478","Type":"ContainerDied","Data":"b934faf8ddab2cd70ad271508972eaffc8bb76448cb2a97a5addc8d6a08921d9"} Mar 20 13:40:38 crc kubenswrapper[4973]: I0320 13:40:38.505619 4973 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-5npj7" Mar 20 13:40:38 crc kubenswrapper[4973]: I0320 13:40:38.792478 4973 generic.go:334] "Generic (PLEG): container finished" podID="75fc4720-ae9c-4ae5-8e4c-7c9a800f5478" containerID="98658f5cacd0df95aa3c5b3f3d458572a45dca9665c1537988566ffb35bf15e8" exitCode=0 Mar 20 13:40:38 crc kubenswrapper[4973]: I0320 13:40:38.792768 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgmjd" event={"ID":"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478","Type":"ContainerDied","Data":"98658f5cacd0df95aa3c5b3f3d458572a45dca9665c1537988566ffb35bf15e8"} Mar 20 13:40:39 crc kubenswrapper[4973]: I0320 13:40:39.804024 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgmjd" event={"ID":"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478","Type":"ContainerStarted","Data":"39f15c0ece2b14a4ab24ab287c72c423472aa4abdd65c2841d43f6e44e8044de"} Mar 20 13:40:39 crc kubenswrapper[4973]: I0320 13:40:39.804631 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgmjd" event={"ID":"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478","Type":"ContainerStarted","Data":"f5d092668db435f0232884678a243ebba547a3a1e13a9215c3ef5af6acc7527f"} Mar 20 13:40:39 crc kubenswrapper[4973]: I0320 13:40:39.804645 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgmjd" event={"ID":"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478","Type":"ContainerStarted","Data":"4cf8d2ea77c057930604a2eadf3c4f770f90119505fe08d361b1a71de52111c1"} Mar 20 13:40:39 crc kubenswrapper[4973]: I0320 13:40:39.804653 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgmjd" event={"ID":"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478","Type":"ContainerStarted","Data":"3da8094c7961b1d5508014f85a570e7a748f3540826fec30ffabc2a2bad449ed"} Mar 20 13:40:39 crc kubenswrapper[4973]: I0320 13:40:39.804664 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-cgmjd" event={"ID":"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478","Type":"ContainerStarted","Data":"11b749fb9936efbd97454f537fa2454823d199516b5d00a9c6c531f3fa411696"} Mar 20 13:40:40 crc kubenswrapper[4973]: I0320 13:40:40.831804 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgmjd" event={"ID":"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478","Type":"ContainerStarted","Data":"aa492457e5ff3228e4a627349dafef991861750bdb0d97aae5f654926eca90ff"} Mar 20 13:40:40 crc kubenswrapper[4973]: I0320 13:40:40.832729 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:40 crc kubenswrapper[4973]: I0320 13:40:40.856628 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-cgmjd" podStartSLOduration=6.074862016 podStartE2EDuration="14.85661276s" podCreationTimestamp="2026-03-20 13:40:26 +0000 UTC" firstStartedPulling="2026-03-20 13:40:27.061061573 +0000 UTC m=+1147.804731317" lastFinishedPulling="2026-03-20 13:40:35.842812317 +0000 UTC m=+1156.586482061" observedRunningTime="2026-03-20 13:40:40.855001085 +0000 UTC m=+1161.598670849" watchObservedRunningTime="2026-03-20 13:40:40.85661276 +0000 UTC m=+1161.600282504" Mar 20 13:40:41 crc kubenswrapper[4973]: I0320 13:40:41.111628 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-xrf97"] Mar 20 13:40:41 crc kubenswrapper[4973]: I0320 13:40:41.113194 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xrf97" Mar 20 13:40:41 crc kubenswrapper[4973]: I0320 13:40:41.125867 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 13:40:41 crc kubenswrapper[4973]: I0320 13:40:41.126145 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 13:40:41 crc kubenswrapper[4973]: I0320 13:40:41.126269 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-w4qhf" Mar 20 13:40:41 crc kubenswrapper[4973]: I0320 13:40:41.137639 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xrf97"] Mar 20 13:40:41 crc kubenswrapper[4973]: I0320 13:40:41.291301 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghhkt\" (UniqueName: \"kubernetes.io/projected/e5c6dfb6-89c6-4de2-9359-aa64d8d86285-kube-api-access-ghhkt\") pod \"openstack-operator-index-xrf97\" (UID: \"e5c6dfb6-89c6-4de2-9359-aa64d8d86285\") " pod="openstack-operators/openstack-operator-index-xrf97" Mar 20 13:40:41 crc kubenswrapper[4973]: I0320 13:40:41.393312 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghhkt\" (UniqueName: \"kubernetes.io/projected/e5c6dfb6-89c6-4de2-9359-aa64d8d86285-kube-api-access-ghhkt\") pod \"openstack-operator-index-xrf97\" (UID: \"e5c6dfb6-89c6-4de2-9359-aa64d8d86285\") " pod="openstack-operators/openstack-operator-index-xrf97" Mar 20 13:40:41 crc kubenswrapper[4973]: I0320 13:40:41.413539 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghhkt\" (UniqueName: \"kubernetes.io/projected/e5c6dfb6-89c6-4de2-9359-aa64d8d86285-kube-api-access-ghhkt\") pod \"openstack-operator-index-xrf97\" (UID: 
\"e5c6dfb6-89c6-4de2-9359-aa64d8d86285\") " pod="openstack-operators/openstack-operator-index-xrf97" Mar 20 13:40:41 crc kubenswrapper[4973]: I0320 13:40:41.449452 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-xrf97" Mar 20 13:40:41 crc kubenswrapper[4973]: I0320 13:40:41.891966 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:41 crc kubenswrapper[4973]: I0320 13:40:41.942732 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-xrf97"] Mar 20 13:40:41 crc kubenswrapper[4973]: I0320 13:40:41.945837 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:40:42 crc kubenswrapper[4973]: I0320 13:40:42.845686 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xrf97" event={"ID":"e5c6dfb6-89c6-4de2-9359-aa64d8d86285","Type":"ContainerStarted","Data":"0f9befe0968cad6e4814e123d2a8bc411fbd02a9f09a3466f50719e041e430f8"} Mar 20 13:40:43 crc kubenswrapper[4973]: I0320 13:40:43.321034 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:40:43 crc kubenswrapper[4973]: I0320 13:40:43.321106 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:40:44 crc kubenswrapper[4973]: I0320 13:40:44.487274 4973 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack-operators/openstack-operator-index-xrf97"] Mar 20 13:40:44 crc kubenswrapper[4973]: I0320 13:40:44.860377 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xrf97" event={"ID":"e5c6dfb6-89c6-4de2-9359-aa64d8d86285","Type":"ContainerStarted","Data":"31b34f391116d7f7a7af8317e3b5382ce60b93de3a63967c8b72fd031fd5a4bb"} Mar 20 13:40:44 crc kubenswrapper[4973]: I0320 13:40:44.860493 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-xrf97" podUID="e5c6dfb6-89c6-4de2-9359-aa64d8d86285" containerName="registry-server" containerID="cri-o://31b34f391116d7f7a7af8317e3b5382ce60b93de3a63967c8b72fd031fd5a4bb" gracePeriod=2 Mar 20 13:40:44 crc kubenswrapper[4973]: I0320 13:40:44.875709 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-xrf97" podStartSLOduration=1.211019472 podStartE2EDuration="3.875688731s" podCreationTimestamp="2026-03-20 13:40:41 +0000 UTC" firstStartedPulling="2026-03-20 13:40:41.961260821 +0000 UTC m=+1162.704930585" lastFinishedPulling="2026-03-20 13:40:44.6259301 +0000 UTC m=+1165.369599844" observedRunningTime="2026-03-20 13:40:44.872314749 +0000 UTC m=+1165.615984503" watchObservedRunningTime="2026-03-20 13:40:44.875688731 +0000 UTC m=+1165.619358495" Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.096865 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-kwvd2"] Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.098195 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-kwvd2" Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.105064 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kwvd2"] Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.159389 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5zxf\" (UniqueName: \"kubernetes.io/projected/f580709c-eab2-41f5-96b4-2e32cf02cdcb-kube-api-access-b5zxf\") pod \"openstack-operator-index-kwvd2\" (UID: \"f580709c-eab2-41f5-96b4-2e32cf02cdcb\") " pod="openstack-operators/openstack-operator-index-kwvd2" Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.260498 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5zxf\" (UniqueName: \"kubernetes.io/projected/f580709c-eab2-41f5-96b4-2e32cf02cdcb-kube-api-access-b5zxf\") pod \"openstack-operator-index-kwvd2\" (UID: \"f580709c-eab2-41f5-96b4-2e32cf02cdcb\") " pod="openstack-operators/openstack-operator-index-kwvd2" Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.280852 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5zxf\" (UniqueName: \"kubernetes.io/projected/f580709c-eab2-41f5-96b4-2e32cf02cdcb-kube-api-access-b5zxf\") pod \"openstack-operator-index-kwvd2\" (UID: \"f580709c-eab2-41f5-96b4-2e32cf02cdcb\") " pod="openstack-operators/openstack-operator-index-kwvd2" Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.338612 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-xrf97" Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.415319 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-kwvd2" Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.463527 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ghhkt\" (UniqueName: \"kubernetes.io/projected/e5c6dfb6-89c6-4de2-9359-aa64d8d86285-kube-api-access-ghhkt\") pod \"e5c6dfb6-89c6-4de2-9359-aa64d8d86285\" (UID: \"e5c6dfb6-89c6-4de2-9359-aa64d8d86285\") " Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.467462 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5c6dfb6-89c6-4de2-9359-aa64d8d86285-kube-api-access-ghhkt" (OuterVolumeSpecName: "kube-api-access-ghhkt") pod "e5c6dfb6-89c6-4de2-9359-aa64d8d86285" (UID: "e5c6dfb6-89c6-4de2-9359-aa64d8d86285"). InnerVolumeSpecName "kube-api-access-ghhkt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.565409 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ghhkt\" (UniqueName: \"kubernetes.io/projected/e5c6dfb6-89c6-4de2-9359-aa64d8d86285-kube-api-access-ghhkt\") on node \"crc\" DevicePath \"\"" Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.829530 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-kwvd2"] Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.868487 4973 generic.go:334] "Generic (PLEG): container finished" podID="e5c6dfb6-89c6-4de2-9359-aa64d8d86285" containerID="31b34f391116d7f7a7af8317e3b5382ce60b93de3a63967c8b72fd031fd5a4bb" exitCode=0 Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.868532 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-xrf97" Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.868565 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xrf97" event={"ID":"e5c6dfb6-89c6-4de2-9359-aa64d8d86285","Type":"ContainerDied","Data":"31b34f391116d7f7a7af8317e3b5382ce60b93de3a63967c8b72fd031fd5a4bb"} Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.868594 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-xrf97" event={"ID":"e5c6dfb6-89c6-4de2-9359-aa64d8d86285","Type":"ContainerDied","Data":"0f9befe0968cad6e4814e123d2a8bc411fbd02a9f09a3466f50719e041e430f8"} Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.868614 4973 scope.go:117] "RemoveContainer" containerID="31b34f391116d7f7a7af8317e3b5382ce60b93de3a63967c8b72fd031fd5a4bb" Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.874624 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kwvd2" event={"ID":"f580709c-eab2-41f5-96b4-2e32cf02cdcb","Type":"ContainerStarted","Data":"e64550ed62617bf3237ec50b70d85d4e2a37d867f90a02ae7eb6b5894d671ef8"} Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.894531 4973 scope.go:117] "RemoveContainer" containerID="31b34f391116d7f7a7af8317e3b5382ce60b93de3a63967c8b72fd031fd5a4bb" Mar 20 13:40:45 crc kubenswrapper[4973]: E0320 13:40:45.895503 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31b34f391116d7f7a7af8317e3b5382ce60b93de3a63967c8b72fd031fd5a4bb\": container with ID starting with 31b34f391116d7f7a7af8317e3b5382ce60b93de3a63967c8b72fd031fd5a4bb not found: ID does not exist" containerID="31b34f391116d7f7a7af8317e3b5382ce60b93de3a63967c8b72fd031fd5a4bb" Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.895544 4973 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"31b34f391116d7f7a7af8317e3b5382ce60b93de3a63967c8b72fd031fd5a4bb"} err="failed to get container status \"31b34f391116d7f7a7af8317e3b5382ce60b93de3a63967c8b72fd031fd5a4bb\": rpc error: code = NotFound desc = could not find container \"31b34f391116d7f7a7af8317e3b5382ce60b93de3a63967c8b72fd031fd5a4bb\": container with ID starting with 31b34f391116d7f7a7af8317e3b5382ce60b93de3a63967c8b72fd031fd5a4bb not found: ID does not exist" Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.916110 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-xrf97"] Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.925455 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-xrf97"] Mar 20 13:40:45 crc kubenswrapper[4973]: I0320 13:40:45.962565 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5c6dfb6-89c6-4de2-9359-aa64d8d86285" path="/var/lib/kubelet/pods/e5c6dfb6-89c6-4de2-9359-aa64d8d86285/volumes" Mar 20 13:40:46 crc kubenswrapper[4973]: I0320 13:40:46.883747 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-kwvd2" event={"ID":"f580709c-eab2-41f5-96b4-2e32cf02cdcb","Type":"ContainerStarted","Data":"2d2fa2cac4a517357807e0e195c982ba4c6ece0f0a9223b60434729c39b95583"} Mar 20 13:40:46 crc kubenswrapper[4973]: I0320 13:40:46.906218 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-kwvd2" podStartSLOduration=1.844825505 podStartE2EDuration="1.906187726s" podCreationTimestamp="2026-03-20 13:40:45 +0000 UTC" firstStartedPulling="2026-03-20 13:40:45.837564652 +0000 UTC m=+1166.581234396" lastFinishedPulling="2026-03-20 13:40:45.898926873 +0000 UTC m=+1166.642596617" observedRunningTime="2026-03-20 13:40:46.899478172 +0000 UTC m=+1167.643147936" watchObservedRunningTime="2026-03-20 
13:40:46.906187726 +0000 UTC m=+1167.649857480" Mar 20 13:40:46 crc kubenswrapper[4973]: I0320 13:40:46.911815 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjhq2" Mar 20 13:40:55 crc kubenswrapper[4973]: I0320 13:40:55.415941 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-kwvd2" Mar 20 13:40:55 crc kubenswrapper[4973]: I0320 13:40:55.417588 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-kwvd2" Mar 20 13:40:55 crc kubenswrapper[4973]: I0320 13:40:55.453276 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-kwvd2" Mar 20 13:40:55 crc kubenswrapper[4973]: I0320 13:40:55.986411 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-kwvd2" Mar 20 13:40:56 crc kubenswrapper[4973]: I0320 13:40:56.895621 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-cgmjd" Mar 20 13:41:10 crc kubenswrapper[4973]: I0320 13:41:10.139075 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b"] Mar 20 13:41:10 crc kubenswrapper[4973]: E0320 13:41:10.139856 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5c6dfb6-89c6-4de2-9359-aa64d8d86285" containerName="registry-server" Mar 20 13:41:10 crc kubenswrapper[4973]: I0320 13:41:10.139870 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5c6dfb6-89c6-4de2-9359-aa64d8d86285" containerName="registry-server" Mar 20 13:41:10 crc kubenswrapper[4973]: I0320 13:41:10.140038 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5c6dfb6-89c6-4de2-9359-aa64d8d86285" containerName="registry-server" Mar 
20 13:41:10 crc kubenswrapper[4973]: I0320 13:41:10.141388 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b" Mar 20 13:41:10 crc kubenswrapper[4973]: I0320 13:41:10.144643 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-6llcc" Mar 20 13:41:10 crc kubenswrapper[4973]: I0320 13:41:10.161223 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b"] Mar 20 13:41:10 crc kubenswrapper[4973]: I0320 13:41:10.198794 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/112512e7-8063-4695-9c7e-c4bdb94bd796-bundle\") pod \"3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b\" (UID: \"112512e7-8063-4695-9c7e-c4bdb94bd796\") " pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b" Mar 20 13:41:10 crc kubenswrapper[4973]: I0320 13:41:10.198935 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72nsb\" (UniqueName: \"kubernetes.io/projected/112512e7-8063-4695-9c7e-c4bdb94bd796-kube-api-access-72nsb\") pod \"3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b\" (UID: \"112512e7-8063-4695-9c7e-c4bdb94bd796\") " pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b" Mar 20 13:41:10 crc kubenswrapper[4973]: I0320 13:41:10.198976 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/112512e7-8063-4695-9c7e-c4bdb94bd796-util\") pod \"3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b\" (UID: \"112512e7-8063-4695-9c7e-c4bdb94bd796\") " 
pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b" Mar 20 13:41:10 crc kubenswrapper[4973]: I0320 13:41:10.299938 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72nsb\" (UniqueName: \"kubernetes.io/projected/112512e7-8063-4695-9c7e-c4bdb94bd796-kube-api-access-72nsb\") pod \"3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b\" (UID: \"112512e7-8063-4695-9c7e-c4bdb94bd796\") " pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b" Mar 20 13:41:10 crc kubenswrapper[4973]: I0320 13:41:10.300010 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/112512e7-8063-4695-9c7e-c4bdb94bd796-util\") pod \"3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b\" (UID: \"112512e7-8063-4695-9c7e-c4bdb94bd796\") " pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b" Mar 20 13:41:10 crc kubenswrapper[4973]: I0320 13:41:10.300045 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/112512e7-8063-4695-9c7e-c4bdb94bd796-bundle\") pod \"3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b\" (UID: \"112512e7-8063-4695-9c7e-c4bdb94bd796\") " pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b" Mar 20 13:41:10 crc kubenswrapper[4973]: I0320 13:41:10.301521 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/112512e7-8063-4695-9c7e-c4bdb94bd796-bundle\") pod \"3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b\" (UID: \"112512e7-8063-4695-9c7e-c4bdb94bd796\") " pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b" Mar 20 13:41:10 crc kubenswrapper[4973]: I0320 13:41:10.301585 4973 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/112512e7-8063-4695-9c7e-c4bdb94bd796-util\") pod \"3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b\" (UID: \"112512e7-8063-4695-9c7e-c4bdb94bd796\") " pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b" Mar 20 13:41:10 crc kubenswrapper[4973]: I0320 13:41:10.317041 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72nsb\" (UniqueName: \"kubernetes.io/projected/112512e7-8063-4695-9c7e-c4bdb94bd796-kube-api-access-72nsb\") pod \"3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b\" (UID: \"112512e7-8063-4695-9c7e-c4bdb94bd796\") " pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b" Mar 20 13:41:10 crc kubenswrapper[4973]: I0320 13:41:10.473557 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b" Mar 20 13:41:10 crc kubenswrapper[4973]: I0320 13:41:10.966353 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b"] Mar 20 13:41:11 crc kubenswrapper[4973]: I0320 13:41:11.112831 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b" event={"ID":"112512e7-8063-4695-9c7e-c4bdb94bd796","Type":"ContainerStarted","Data":"91fa7de618da15d48ce601eb8b7e5265ca622e8b5e6b00965fefe7c1f9ff5165"} Mar 20 13:41:12 crc kubenswrapper[4973]: I0320 13:41:12.123507 4973 generic.go:334] "Generic (PLEG): container finished" podID="112512e7-8063-4695-9c7e-c4bdb94bd796" containerID="3feb7d529158a54767543f047da62456b1e2f1ecdf2c28fdd4c4619264270459" exitCode=0 Mar 20 13:41:12 crc kubenswrapper[4973]: I0320 13:41:12.123623 4973 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b" event={"ID":"112512e7-8063-4695-9c7e-c4bdb94bd796","Type":"ContainerDied","Data":"3feb7d529158a54767543f047da62456b1e2f1ecdf2c28fdd4c4619264270459"} Mar 20 13:41:13 crc kubenswrapper[4973]: I0320 13:41:13.321221 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:41:13 crc kubenswrapper[4973]: I0320 13:41:13.321295 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:41:15 crc kubenswrapper[4973]: I0320 13:41:15.158866 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b" event={"ID":"112512e7-8063-4695-9c7e-c4bdb94bd796","Type":"ContainerStarted","Data":"572232200c3949c625dc2ebb767ed5d995f65facf7978683795e5566bc456647"} Mar 20 13:41:16 crc kubenswrapper[4973]: I0320 13:41:16.167926 4973 generic.go:334] "Generic (PLEG): container finished" podID="112512e7-8063-4695-9c7e-c4bdb94bd796" containerID="572232200c3949c625dc2ebb767ed5d995f65facf7978683795e5566bc456647" exitCode=0 Mar 20 13:41:16 crc kubenswrapper[4973]: I0320 13:41:16.168144 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b" event={"ID":"112512e7-8063-4695-9c7e-c4bdb94bd796","Type":"ContainerDied","Data":"572232200c3949c625dc2ebb767ed5d995f65facf7978683795e5566bc456647"} Mar 20 13:41:17 
crc kubenswrapper[4973]: I0320 13:41:17.178164 4973 generic.go:334] "Generic (PLEG): container finished" podID="112512e7-8063-4695-9c7e-c4bdb94bd796" containerID="d4e7a41e8bc4c10427ff5786e769115205f776ed030c29f8fc88e8f8372106d6" exitCode=0 Mar 20 13:41:17 crc kubenswrapper[4973]: I0320 13:41:17.178495 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b" event={"ID":"112512e7-8063-4695-9c7e-c4bdb94bd796","Type":"ContainerDied","Data":"d4e7a41e8bc4c10427ff5786e769115205f776ed030c29f8fc88e8f8372106d6"} Mar 20 13:41:18 crc kubenswrapper[4973]: I0320 13:41:18.529731 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b" Mar 20 13:41:18 crc kubenswrapper[4973]: I0320 13:41:18.655476 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72nsb\" (UniqueName: \"kubernetes.io/projected/112512e7-8063-4695-9c7e-c4bdb94bd796-kube-api-access-72nsb\") pod \"112512e7-8063-4695-9c7e-c4bdb94bd796\" (UID: \"112512e7-8063-4695-9c7e-c4bdb94bd796\") " Mar 20 13:41:18 crc kubenswrapper[4973]: I0320 13:41:18.655698 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/112512e7-8063-4695-9c7e-c4bdb94bd796-bundle\") pod \"112512e7-8063-4695-9c7e-c4bdb94bd796\" (UID: \"112512e7-8063-4695-9c7e-c4bdb94bd796\") " Mar 20 13:41:18 crc kubenswrapper[4973]: I0320 13:41:18.655754 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/112512e7-8063-4695-9c7e-c4bdb94bd796-util\") pod \"112512e7-8063-4695-9c7e-c4bdb94bd796\" (UID: \"112512e7-8063-4695-9c7e-c4bdb94bd796\") " Mar 20 13:41:18 crc kubenswrapper[4973]: I0320 13:41:18.657452 4973 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/empty-dir/112512e7-8063-4695-9c7e-c4bdb94bd796-bundle" (OuterVolumeSpecName: "bundle") pod "112512e7-8063-4695-9c7e-c4bdb94bd796" (UID: "112512e7-8063-4695-9c7e-c4bdb94bd796"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:41:18 crc kubenswrapper[4973]: I0320 13:41:18.661945 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/112512e7-8063-4695-9c7e-c4bdb94bd796-kube-api-access-72nsb" (OuterVolumeSpecName: "kube-api-access-72nsb") pod "112512e7-8063-4695-9c7e-c4bdb94bd796" (UID: "112512e7-8063-4695-9c7e-c4bdb94bd796"). InnerVolumeSpecName "kube-api-access-72nsb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:41:18 crc kubenswrapper[4973]: I0320 13:41:18.671410 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/112512e7-8063-4695-9c7e-c4bdb94bd796-util" (OuterVolumeSpecName: "util") pod "112512e7-8063-4695-9c7e-c4bdb94bd796" (UID: "112512e7-8063-4695-9c7e-c4bdb94bd796"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:41:18 crc kubenswrapper[4973]: I0320 13:41:18.758267 4973 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/112512e7-8063-4695-9c7e-c4bdb94bd796-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:18 crc kubenswrapper[4973]: I0320 13:41:18.758487 4973 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/112512e7-8063-4695-9c7e-c4bdb94bd796-util\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:18 crc kubenswrapper[4973]: I0320 13:41:18.758505 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72nsb\" (UniqueName: \"kubernetes.io/projected/112512e7-8063-4695-9c7e-c4bdb94bd796-kube-api-access-72nsb\") on node \"crc\" DevicePath \"\"" Mar 20 13:41:19 crc kubenswrapper[4973]: I0320 13:41:19.196743 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b" event={"ID":"112512e7-8063-4695-9c7e-c4bdb94bd796","Type":"ContainerDied","Data":"91fa7de618da15d48ce601eb8b7e5265ca622e8b5e6b00965fefe7c1f9ff5165"} Mar 20 13:41:19 crc kubenswrapper[4973]: I0320 13:41:19.197021 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91fa7de618da15d48ce601eb8b7e5265ca622e8b5e6b00965fefe7c1f9ff5165" Mar 20 13:41:19 crc kubenswrapper[4973]: I0320 13:41:19.196887 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b" Mar 20 13:41:22 crc kubenswrapper[4973]: I0320 13:41:22.528928 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f7459b8bf-nf8rm"] Mar 20 13:41:22 crc kubenswrapper[4973]: E0320 13:41:22.529878 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112512e7-8063-4695-9c7e-c4bdb94bd796" containerName="extract" Mar 20 13:41:22 crc kubenswrapper[4973]: I0320 13:41:22.529898 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="112512e7-8063-4695-9c7e-c4bdb94bd796" containerName="extract" Mar 20 13:41:22 crc kubenswrapper[4973]: E0320 13:41:22.529924 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112512e7-8063-4695-9c7e-c4bdb94bd796" containerName="util" Mar 20 13:41:22 crc kubenswrapper[4973]: I0320 13:41:22.529932 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="112512e7-8063-4695-9c7e-c4bdb94bd796" containerName="util" Mar 20 13:41:22 crc kubenswrapper[4973]: E0320 13:41:22.529949 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="112512e7-8063-4695-9c7e-c4bdb94bd796" containerName="pull" Mar 20 13:41:22 crc kubenswrapper[4973]: I0320 13:41:22.529956 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="112512e7-8063-4695-9c7e-c4bdb94bd796" containerName="pull" Mar 20 13:41:22 crc kubenswrapper[4973]: I0320 13:41:22.530194 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="112512e7-8063-4695-9c7e-c4bdb94bd796" containerName="extract" Mar 20 13:41:22 crc kubenswrapper[4973]: I0320 13:41:22.530939 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-nf8rm" Mar 20 13:41:22 crc kubenswrapper[4973]: I0320 13:41:22.533287 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-n8kfm" Mar 20 13:41:22 crc kubenswrapper[4973]: I0320 13:41:22.558548 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f7459b8bf-nf8rm"] Mar 20 13:41:22 crc kubenswrapper[4973]: I0320 13:41:22.640112 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgfk2\" (UniqueName: \"kubernetes.io/projected/bad13d41-c3be-4f23-b40f-f621e669ef5b-kube-api-access-pgfk2\") pod \"openstack-operator-controller-init-6f7459b8bf-nf8rm\" (UID: \"bad13d41-c3be-4f23-b40f-f621e669ef5b\") " pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-nf8rm" Mar 20 13:41:22 crc kubenswrapper[4973]: I0320 13:41:22.741717 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgfk2\" (UniqueName: \"kubernetes.io/projected/bad13d41-c3be-4f23-b40f-f621e669ef5b-kube-api-access-pgfk2\") pod \"openstack-operator-controller-init-6f7459b8bf-nf8rm\" (UID: \"bad13d41-c3be-4f23-b40f-f621e669ef5b\") " pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-nf8rm" Mar 20 13:41:22 crc kubenswrapper[4973]: I0320 13:41:22.763274 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgfk2\" (UniqueName: \"kubernetes.io/projected/bad13d41-c3be-4f23-b40f-f621e669ef5b-kube-api-access-pgfk2\") pod \"openstack-operator-controller-init-6f7459b8bf-nf8rm\" (UID: \"bad13d41-c3be-4f23-b40f-f621e669ef5b\") " pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-nf8rm" Mar 20 13:41:22 crc kubenswrapper[4973]: I0320 13:41:22.853799 4973 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-nf8rm" Mar 20 13:41:23 crc kubenswrapper[4973]: I0320 13:41:23.281577 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f7459b8bf-nf8rm"] Mar 20 13:41:23 crc kubenswrapper[4973]: W0320 13:41:23.295568 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbad13d41_c3be_4f23_b40f_f621e669ef5b.slice/crio-3ed4c90d6d0cc4909dd45b8f11c624b5463bc11a3e424776d0b70064a809740f WatchSource:0}: Error finding container 3ed4c90d6d0cc4909dd45b8f11c624b5463bc11a3e424776d0b70064a809740f: Status 404 returned error can't find the container with id 3ed4c90d6d0cc4909dd45b8f11c624b5463bc11a3e424776d0b70064a809740f Mar 20 13:41:24 crc kubenswrapper[4973]: I0320 13:41:24.248178 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-nf8rm" event={"ID":"bad13d41-c3be-4f23-b40f-f621e669ef5b","Type":"ContainerStarted","Data":"3ed4c90d6d0cc4909dd45b8f11c624b5463bc11a3e424776d0b70064a809740f"} Mar 20 13:41:28 crc kubenswrapper[4973]: I0320 13:41:28.286519 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-nf8rm" event={"ID":"bad13d41-c3be-4f23-b40f-f621e669ef5b","Type":"ContainerStarted","Data":"8ed9a03f1ec110acb254929f12975366f9e3cb2aabfe1aa951448e8814ab072a"} Mar 20 13:41:28 crc kubenswrapper[4973]: I0320 13:41:28.286967 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-nf8rm" Mar 20 13:41:28 crc kubenswrapper[4973]: I0320 13:41:28.334569 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-nf8rm" podStartSLOduration=1.602527184 
podStartE2EDuration="6.334540067s" podCreationTimestamp="2026-03-20 13:41:22 +0000 UTC" firstStartedPulling="2026-03-20 13:41:23.297553909 +0000 UTC m=+1204.041223663" lastFinishedPulling="2026-03-20 13:41:28.029566802 +0000 UTC m=+1208.773236546" observedRunningTime="2026-03-20 13:41:28.323957847 +0000 UTC m=+1209.067627591" watchObservedRunningTime="2026-03-20 13:41:28.334540067 +0000 UTC m=+1209.078209831" Mar 20 13:41:42 crc kubenswrapper[4973]: I0320 13:41:42.856712 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-nf8rm" Mar 20 13:41:43 crc kubenswrapper[4973]: I0320 13:41:43.320879 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:41:43 crc kubenswrapper[4973]: I0320 13:41:43.320932 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:41:43 crc kubenswrapper[4973]: I0320 13:41:43.320970 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 13:41:43 crc kubenswrapper[4973]: I0320 13:41:43.321650 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6ee146100b8d3ae20a6493daed451ee0c8c9f7d655dfaba5ce2b9446864d5f7d"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, 
will be restarted" Mar 20 13:41:43 crc kubenswrapper[4973]: I0320 13:41:43.321703 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" containerID="cri-o://6ee146100b8d3ae20a6493daed451ee0c8c9f7d655dfaba5ce2b9446864d5f7d" gracePeriod=600 Mar 20 13:41:44 crc kubenswrapper[4973]: I0320 13:41:44.416712 4973 generic.go:334] "Generic (PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="6ee146100b8d3ae20a6493daed451ee0c8c9f7d655dfaba5ce2b9446864d5f7d" exitCode=0 Mar 20 13:41:44 crc kubenswrapper[4973]: I0320 13:41:44.416815 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"6ee146100b8d3ae20a6493daed451ee0c8c9f7d655dfaba5ce2b9446864d5f7d"} Mar 20 13:41:44 crc kubenswrapper[4973]: I0320 13:41:44.418225 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"96fa5ee864c868aadb2da3b33886e9ad0c244086b57b687cf9a31b416fea8562"} Mar 20 13:41:44 crc kubenswrapper[4973]: I0320 13:41:44.418325 4973 scope.go:117] "RemoveContainer" containerID="f898eb1e5a0799379b7d7bcd473943134d5addff2e02fbb8f8a3d4d7eb5c66a6" Mar 20 13:42:00 crc kubenswrapper[4973]: I0320 13:42:00.126081 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566902-mjw5k"] Mar 20 13:42:00 crc kubenswrapper[4973]: I0320 13:42:00.127494 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-mjw5k" Mar 20 13:42:00 crc kubenswrapper[4973]: I0320 13:42:00.130257 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:42:00 crc kubenswrapper[4973]: I0320 13:42:00.130330 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:42:00 crc kubenswrapper[4973]: I0320 13:42:00.130527 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 13:42:00 crc kubenswrapper[4973]: I0320 13:42:00.137265 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-mjw5k"] Mar 20 13:42:00 crc kubenswrapper[4973]: I0320 13:42:00.242199 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bx97t\" (UniqueName: \"kubernetes.io/projected/4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d-kube-api-access-bx97t\") pod \"auto-csr-approver-29566902-mjw5k\" (UID: \"4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d\") " pod="openshift-infra/auto-csr-approver-29566902-mjw5k" Mar 20 13:42:00 crc kubenswrapper[4973]: I0320 13:42:00.344013 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bx97t\" (UniqueName: \"kubernetes.io/projected/4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d-kube-api-access-bx97t\") pod \"auto-csr-approver-29566902-mjw5k\" (UID: \"4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d\") " pod="openshift-infra/auto-csr-approver-29566902-mjw5k" Mar 20 13:42:00 crc kubenswrapper[4973]: I0320 13:42:00.404034 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bx97t\" (UniqueName: \"kubernetes.io/projected/4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d-kube-api-access-bx97t\") pod \"auto-csr-approver-29566902-mjw5k\" (UID: \"4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d\") " 
pod="openshift-infra/auto-csr-approver-29566902-mjw5k" Mar 20 13:42:00 crc kubenswrapper[4973]: I0320 13:42:00.449082 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-mjw5k" Mar 20 13:42:00 crc kubenswrapper[4973]: I0320 13:42:00.942806 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-mjw5k"] Mar 20 13:42:01 crc kubenswrapper[4973]: I0320 13:42:01.580885 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-mjw5k" event={"ID":"4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d","Type":"ContainerStarted","Data":"2b74767c7d487c97fc96246ba2594ab7655d525354e2d30d5389abbbf3385067"} Mar 20 13:42:02 crc kubenswrapper[4973]: I0320 13:42:02.595427 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-mjw5k" event={"ID":"4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d","Type":"ContainerStarted","Data":"413934feb0e3a7b1e8afa46eb5e13abbcfd0f3044499ba98c2be17557158c20d"} Mar 20 13:42:02 crc kubenswrapper[4973]: I0320 13:42:02.619020 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566902-mjw5k" podStartSLOduration=1.6483191879999999 podStartE2EDuration="2.618996458s" podCreationTimestamp="2026-03-20 13:42:00 +0000 UTC" firstStartedPulling="2026-03-20 13:42:00.954678872 +0000 UTC m=+1241.698348616" lastFinishedPulling="2026-03-20 13:42:01.925356142 +0000 UTC m=+1242.669025886" observedRunningTime="2026-03-20 13:42:02.613911886 +0000 UTC m=+1243.357581620" watchObservedRunningTime="2026-03-20 13:42:02.618996458 +0000 UTC m=+1243.362666202" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.027016 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-w7228"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.028110 4973 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-w7228" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.030730 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-xzfmc" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.048808 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-pzl6t"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.050134 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pzl6t" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.060052 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-756nm" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.061488 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-w7228"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.075074 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-pzl6t"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.096656 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-2sflb"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.097979 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2sflb" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.105390 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-rm7q7"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.106844 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-rm7q7" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.109239 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-f2lt5" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.109497 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-qsfkq" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.111748 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wngq2\" (UniqueName: \"kubernetes.io/projected/9d106cd3-cadb-4cc7-b237-f05294c67dcd-kube-api-access-wngq2\") pod \"barbican-operator-controller-manager-59bc569d95-w7228\" (UID: \"9d106cd3-cadb-4cc7-b237-f05294c67dcd\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-w7228" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.113492 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdhmj\" (UniqueName: \"kubernetes.io/projected/dbb02721-66ce-44b6-bffe-59851197efa8-kube-api-access-cdhmj\") pod \"cinder-operator-controller-manager-8d58dc466-pzl6t\" (UID: \"dbb02721-66ce-44b6-bffe-59851197efa8\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pzl6t" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.135435 4973 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-z4dzx"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.136596 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-z4dzx" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.139136 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-q2st4" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.167364 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-2sflb"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.195537 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-z4dzx"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.213804 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-rm7q7"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.215043 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wngq2\" (UniqueName: \"kubernetes.io/projected/9d106cd3-cadb-4cc7-b237-f05294c67dcd-kube-api-access-wngq2\") pod \"barbican-operator-controller-manager-59bc569d95-w7228\" (UID: \"9d106cd3-cadb-4cc7-b237-f05294c67dcd\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-w7228" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.215246 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdhmj\" (UniqueName: \"kubernetes.io/projected/dbb02721-66ce-44b6-bffe-59851197efa8-kube-api-access-cdhmj\") pod \"cinder-operator-controller-manager-8d58dc466-pzl6t\" (UID: \"dbb02721-66ce-44b6-bffe-59851197efa8\") " 
pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pzl6t" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.215369 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96886\" (UniqueName: \"kubernetes.io/projected/96f5f9d5-bb8c-497c-bfbb-8fd46342ce69-kube-api-access-96886\") pod \"designate-operator-controller-manager-588d4d986b-2sflb\" (UID: \"96f5f9d5-bb8c-497c-bfbb-8fd46342ce69\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2sflb" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.215467 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbq4j\" (UniqueName: \"kubernetes.io/projected/7260cd47-ce83-44db-951d-757908bf5953-kube-api-access-kbq4j\") pod \"glance-operator-controller-manager-79df6bcc97-rm7q7\" (UID: \"7260cd47-ce83-44db-951d-757908bf5953\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-rm7q7" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.215564 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x569\" (UniqueName: \"kubernetes.io/projected/3233d229-1d2f-4c90-b76a-f27ca914f0ad-kube-api-access-5x569\") pod \"heat-operator-controller-manager-67dd5f86f5-z4dzx\" (UID: \"3233d229-1d2f-4c90-b76a-f27ca914f0ad\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-z4dzx" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.238349 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-zmqsw"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.239555 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zmqsw" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.244599 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdhmj\" (UniqueName: \"kubernetes.io/projected/dbb02721-66ce-44b6-bffe-59851197efa8-kube-api-access-cdhmj\") pod \"cinder-operator-controller-manager-8d58dc466-pzl6t\" (UID: \"dbb02721-66ce-44b6-bffe-59851197efa8\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pzl6t" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.244626 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wngq2\" (UniqueName: \"kubernetes.io/projected/9d106cd3-cadb-4cc7-b237-f05294c67dcd-kube-api-access-wngq2\") pod \"barbican-operator-controller-manager-59bc569d95-w7228\" (UID: \"9d106cd3-cadb-4cc7-b237-f05294c67dcd\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-w7228" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.247109 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-ffwvh" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.254582 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-zmqsw"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.277414 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.278614 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.281230 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.281431 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-jhg9k" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.317033 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96886\" (UniqueName: \"kubernetes.io/projected/96f5f9d5-bb8c-497c-bfbb-8fd46342ce69-kube-api-access-96886\") pod \"designate-operator-controller-manager-588d4d986b-2sflb\" (UID: \"96f5f9d5-bb8c-497c-bfbb-8fd46342ce69\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2sflb" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.317082 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbq4j\" (UniqueName: \"kubernetes.io/projected/7260cd47-ce83-44db-951d-757908bf5953-kube-api-access-kbq4j\") pod \"glance-operator-controller-manager-79df6bcc97-rm7q7\" (UID: \"7260cd47-ce83-44db-951d-757908bf5953\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-rm7q7" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.317113 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82zxv\" (UniqueName: \"kubernetes.io/projected/84e8fd4a-d562-4e92-adfc-479867cf9d3a-kube-api-access-82zxv\") pod \"horizon-operator-controller-manager-8464cc45fb-zmqsw\" (UID: \"84e8fd4a-d562-4e92-adfc-479867cf9d3a\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zmqsw" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.317170 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5x569\" (UniqueName: \"kubernetes.io/projected/3233d229-1d2f-4c90-b76a-f27ca914f0ad-kube-api-access-5x569\") pod \"heat-operator-controller-manager-67dd5f86f5-z4dzx\" (UID: \"3233d229-1d2f-4c90-b76a-f27ca914f0ad\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-z4dzx" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.338400 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.341865 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x569\" (UniqueName: \"kubernetes.io/projected/3233d229-1d2f-4c90-b76a-f27ca914f0ad-kube-api-access-5x569\") pod \"heat-operator-controller-manager-67dd5f86f5-z4dzx\" (UID: \"3233d229-1d2f-4c90-b76a-f27ca914f0ad\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-z4dzx" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.345740 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96886\" (UniqueName: \"kubernetes.io/projected/96f5f9d5-bb8c-497c-bfbb-8fd46342ce69-kube-api-access-96886\") pod \"designate-operator-controller-manager-588d4d986b-2sflb\" (UID: \"96f5f9d5-bb8c-497c-bfbb-8fd46342ce69\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2sflb" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.354892 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-qxlrn"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.356024 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qxlrn" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.360109 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-8klwq" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.360557 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-w7228" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.363896 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbq4j\" (UniqueName: \"kubernetes.io/projected/7260cd47-ce83-44db-951d-757908bf5953-kube-api-access-kbq4j\") pod \"glance-operator-controller-manager-79df6bcc97-rm7q7\" (UID: \"7260cd47-ce83-44db-951d-757908bf5953\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-rm7q7" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.375131 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-jn8rq"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.376397 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-jn8rq" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.378360 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-p755d" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.389772 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-qxlrn"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.401375 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pzl6t" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.409599 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-jn8rq"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.420177 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-592gb\" (UniqueName: \"kubernetes.io/projected/c5689d29-0b82-4482-9444-5da20e2da57a-kube-api-access-592gb\") pod \"infra-operator-controller-manager-7b9c774f96-27jg7\" (UID: \"c5689d29-0b82-4482-9444-5da20e2da57a\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.420572 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndrrp\" (UniqueName: \"kubernetes.io/projected/13c558f8-2e66-49c3-b184-7fdbbf4ff6b1-kube-api-access-ndrrp\") pod \"ironic-operator-controller-manager-6f787dddc9-qxlrn\" (UID: \"13c558f8-2e66-49c3-b184-7fdbbf4ff6b1\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qxlrn" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.420680 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5689d29-0b82-4482-9444-5da20e2da57a-cert\") pod \"infra-operator-controller-manager-7b9c774f96-27jg7\" (UID: \"c5689d29-0b82-4482-9444-5da20e2da57a\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.420720 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxttv\" (UniqueName: \"kubernetes.io/projected/c0abcba1-e57c-4d90-a8cb-61989da15e87-kube-api-access-wxttv\") pod 
\"keystone-operator-controller-manager-768b96df4c-jn8rq\" (UID: \"c0abcba1-e57c-4d90-a8cb-61989da15e87\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-jn8rq" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.420784 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82zxv\" (UniqueName: \"kubernetes.io/projected/84e8fd4a-d562-4e92-adfc-479867cf9d3a-kube-api-access-82zxv\") pod \"horizon-operator-controller-manager-8464cc45fb-zmqsw\" (UID: \"84e8fd4a-d562-4e92-adfc-479867cf9d3a\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zmqsw" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.421151 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-gs46s"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.422283 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gs46s" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.438419 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-gs46s"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.453207 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2sflb" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.462473 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-s84pg" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.466652 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-crjg2"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.468089 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-crjg2" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.468844 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-rm7q7" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.475186 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82zxv\" (UniqueName: \"kubernetes.io/projected/84e8fd4a-d562-4e92-adfc-479867cf9d3a-kube-api-access-82zxv\") pod \"horizon-operator-controller-manager-8464cc45fb-zmqsw\" (UID: \"84e8fd4a-d562-4e92-adfc-479867cf9d3a\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zmqsw" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.475574 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-jdmkw" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.482225 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-z4dzx" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.493573 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-crjg2"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.519862 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-kzx4f"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.522113 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-kzx4f" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.523414 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fxdm\" (UniqueName: \"kubernetes.io/projected/97aab498-21c1-476f-a64b-a526745fc64a-kube-api-access-4fxdm\") pod \"mariadb-operator-controller-manager-67ccfc9778-crjg2\" (UID: \"97aab498-21c1-476f-a64b-a526745fc64a\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-crjg2" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.523485 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5689d29-0b82-4482-9444-5da20e2da57a-cert\") pod \"infra-operator-controller-manager-7b9c774f96-27jg7\" (UID: \"c5689d29-0b82-4482-9444-5da20e2da57a\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.523525 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxttv\" (UniqueName: \"kubernetes.io/projected/c0abcba1-e57c-4d90-a8cb-61989da15e87-kube-api-access-wxttv\") pod \"keystone-operator-controller-manager-768b96df4c-jn8rq\" (UID: \"c0abcba1-e57c-4d90-a8cb-61989da15e87\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-jn8rq" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.523548 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-694zz\" (UniqueName: \"kubernetes.io/projected/530d31a0-48a0-4d06-9b03-c9c205312bdc-kube-api-access-694zz\") pod \"manila-operator-controller-manager-55f864c847-gs46s\" (UID: \"530d31a0-48a0-4d06-9b03-c9c205312bdc\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-gs46s" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 
13:42:03.523615 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-592gb\" (UniqueName: \"kubernetes.io/projected/c5689d29-0b82-4482-9444-5da20e2da57a-kube-api-access-592gb\") pod \"infra-operator-controller-manager-7b9c774f96-27jg7\" (UID: \"c5689d29-0b82-4482-9444-5da20e2da57a\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.523641 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndrrp\" (UniqueName: \"kubernetes.io/projected/13c558f8-2e66-49c3-b184-7fdbbf4ff6b1-kube-api-access-ndrrp\") pod \"ironic-operator-controller-manager-6f787dddc9-qxlrn\" (UID: \"13c558f8-2e66-49c3-b184-7fdbbf4ff6b1\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qxlrn" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.523924 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-btrql" Mar 20 13:42:03 crc kubenswrapper[4973]: E0320 13:42:03.524054 4973 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:03 crc kubenswrapper[4973]: E0320 13:42:03.524099 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5689d29-0b82-4482-9444-5da20e2da57a-cert podName:c5689d29-0b82-4482-9444-5da20e2da57a nodeName:}" failed. No retries permitted until 2026-03-20 13:42:04.024085072 +0000 UTC m=+1244.767754816 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c5689d29-0b82-4482-9444-5da20e2da57a-cert") pod "infra-operator-controller-manager-7b9c774f96-27jg7" (UID: "c5689d29-0b82-4482-9444-5da20e2da57a") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.547515 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-592gb\" (UniqueName: \"kubernetes.io/projected/c5689d29-0b82-4482-9444-5da20e2da57a-kube-api-access-592gb\") pod \"infra-operator-controller-manager-7b9c774f96-27jg7\" (UID: \"c5689d29-0b82-4482-9444-5da20e2da57a\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.547882 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxttv\" (UniqueName: \"kubernetes.io/projected/c0abcba1-e57c-4d90-a8cb-61989da15e87-kube-api-access-wxttv\") pod \"keystone-operator-controller-manager-768b96df4c-jn8rq\" (UID: \"c0abcba1-e57c-4d90-a8cb-61989da15e87\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-jn8rq" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.549068 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndrrp\" (UniqueName: \"kubernetes.io/projected/13c558f8-2e66-49c3-b184-7fdbbf4ff6b1-kube-api-access-ndrrp\") pod \"ironic-operator-controller-manager-6f787dddc9-qxlrn\" (UID: \"13c558f8-2e66-49c3-b184-7fdbbf4ff6b1\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qxlrn" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.583181 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-p6vx2"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.595264 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p6vx2" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.597810 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-nkv6f" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.600036 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zmqsw" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.628098 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-694zz\" (UniqueName: \"kubernetes.io/projected/530d31a0-48a0-4d06-9b03-c9c205312bdc-kube-api-access-694zz\") pod \"manila-operator-controller-manager-55f864c847-gs46s\" (UID: \"530d31a0-48a0-4d06-9b03-c9c205312bdc\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-gs46s" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.630530 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsttx\" (UniqueName: \"kubernetes.io/projected/45fa2923-6b6f-44da-8693-6ee06c476a8f-kube-api-access-tsttx\") pod \"neutron-operator-controller-manager-767865f676-kzx4f\" (UID: \"45fa2923-6b6f-44da-8693-6ee06c476a8f\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-kzx4f" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.630697 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fxdm\" (UniqueName: \"kubernetes.io/projected/97aab498-21c1-476f-a64b-a526745fc64a-kube-api-access-4fxdm\") pod \"mariadb-operator-controller-manager-67ccfc9778-crjg2\" (UID: \"97aab498-21c1-476f-a64b-a526745fc64a\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-crjg2" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.640369 4973 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-p6vx2"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.671269 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-694zz\" (UniqueName: \"kubernetes.io/projected/530d31a0-48a0-4d06-9b03-c9c205312bdc-kube-api-access-694zz\") pod \"manila-operator-controller-manager-55f864c847-gs46s\" (UID: \"530d31a0-48a0-4d06-9b03-c9c205312bdc\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-gs46s" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.671940 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fxdm\" (UniqueName: \"kubernetes.io/projected/97aab498-21c1-476f-a64b-a526745fc64a-kube-api-access-4fxdm\") pod \"mariadb-operator-controller-manager-67ccfc9778-crjg2\" (UID: \"97aab498-21c1-476f-a64b-a526745fc64a\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-crjg2" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.703556 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-kzx4f"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.732374 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsttx\" (UniqueName: \"kubernetes.io/projected/45fa2923-6b6f-44da-8693-6ee06c476a8f-kube-api-access-tsttx\") pod \"neutron-operator-controller-manager-767865f676-kzx4f\" (UID: \"45fa2923-6b6f-44da-8693-6ee06c476a8f\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-kzx4f" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.732452 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjjm9\" (UniqueName: \"kubernetes.io/projected/d5a271f2-b17d-487d-a61b-00bd17841392-kube-api-access-sjjm9\") pod 
\"nova-operator-controller-manager-5d488d59fb-p6vx2\" (UID: \"d5a271f2-b17d-487d-a61b-00bd17841392\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p6vx2" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.739671 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-x4r2t"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.741325 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-x4r2t" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.746885 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-6bc5w" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.763772 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsttx\" (UniqueName: \"kubernetes.io/projected/45fa2923-6b6f-44da-8693-6ee06c476a8f-kube-api-access-tsttx\") pod \"neutron-operator-controller-manager-767865f676-kzx4f\" (UID: \"45fa2923-6b6f-44da-8693-6ee06c476a8f\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-kzx4f" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.769277 4973 generic.go:334] "Generic (PLEG): container finished" podID="4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d" containerID="413934feb0e3a7b1e8afa46eb5e13abbcfd0f3044499ba98c2be17557158c20d" exitCode=0 Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.769332 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-mjw5k" event={"ID":"4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d","Type":"ContainerDied","Data":"413934feb0e3a7b1e8afa46eb5e13abbcfd0f3044499ba98c2be17557158c20d"} Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.778998 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-x4r2t"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.800490 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qxlrn" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.804040 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.805848 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.810203 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.811615 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-xcxmb" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.835412 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-2zn7v"] Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.835705 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-jn8rq" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.891028 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2zn7v" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.897524 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-crjg2" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.898906 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gs46s" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.918120 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shhzf\" (UniqueName: \"kubernetes.io/projected/39227253-9885-4ba2-a216-c04066dc7c84-kube-api-access-shhzf\") pod \"octavia-operator-controller-manager-5b9f45d989-x4r2t\" (UID: \"39227253-9885-4ba2-a216-c04066dc7c84\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-x4r2t" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.918213 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjjm9\" (UniqueName: \"kubernetes.io/projected/d5a271f2-b17d-487d-a61b-00bd17841392-kube-api-access-sjjm9\") pod \"nova-operator-controller-manager-5d488d59fb-p6vx2\" (UID: \"d5a271f2-b17d-487d-a61b-00bd17841392\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p6vx2" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.929076 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-8z57b" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.957687 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjjm9\" (UniqueName: \"kubernetes.io/projected/d5a271f2-b17d-487d-a61b-00bd17841392-kube-api-access-sjjm9\") pod \"nova-operator-controller-manager-5d488d59fb-p6vx2\" (UID: \"d5a271f2-b17d-487d-a61b-00bd17841392\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p6vx2" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.950509 4973 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-kzx4f" Mar 20 13:42:03 crc kubenswrapper[4973]: I0320 13:42:03.981036 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p6vx2" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.038481 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shhzf\" (UniqueName: \"kubernetes.io/projected/39227253-9885-4ba2-a216-c04066dc7c84-kube-api-access-shhzf\") pod \"octavia-operator-controller-manager-5b9f45d989-x4r2t\" (UID: \"39227253-9885-4ba2-a216-c04066dc7c84\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-x4r2t" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.078571 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf4gq\" (UniqueName: \"kubernetes.io/projected/991cca2c-022f-4c90-a1ba-287191fc2d49-kube-api-access-qf4gq\") pod \"ovn-operator-controller-manager-884679f54-2zn7v\" (UID: \"991cca2c-022f-4c90-a1ba-287191fc2d49\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-2zn7v" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.078620 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flqtz\" (UniqueName: \"kubernetes.io/projected/3c89c7dd-500b-4bd5-a30e-273c2a485728-kube-api-access-flqtz\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fjztv\" (UID: \"3c89c7dd-500b-4bd5-a30e-273c2a485728\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.078752 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fjztv\" (UID: \"3c89c7dd-500b-4bd5-a30e-273c2a485728\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.078868 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5689d29-0b82-4482-9444-5da20e2da57a-cert\") pod \"infra-operator-controller-manager-7b9c774f96-27jg7\" (UID: \"c5689d29-0b82-4482-9444-5da20e2da57a\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7" Mar 20 13:42:04 crc kubenswrapper[4973]: E0320 13:42:04.079229 4973 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:04 crc kubenswrapper[4973]: E0320 13:42:04.079283 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5689d29-0b82-4482-9444-5da20e2da57a-cert podName:c5689d29-0b82-4482-9444-5da20e2da57a nodeName:}" failed. No retries permitted until 2026-03-20 13:42:05.079263818 +0000 UTC m=+1245.822933562 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c5689d29-0b82-4482-9444-5da20e2da57a-cert") pod "infra-operator-controller-manager-7b9c774f96-27jg7" (UID: "c5689d29-0b82-4482-9444-5da20e2da57a") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.113201 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shhzf\" (UniqueName: \"kubernetes.io/projected/39227253-9885-4ba2-a216-c04066dc7c84-kube-api-access-shhzf\") pod \"octavia-operator-controller-manager-5b9f45d989-x4r2t\" (UID: \"39227253-9885-4ba2-a216-c04066dc7c84\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-x4r2t" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.115831 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-x4r2t" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.205442 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-2zn7v"] Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.205493 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv"] Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.205519 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-9dv29"] Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.207752 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9dv29" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.223447 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-jkktc" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.254178 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf4gq\" (UniqueName: \"kubernetes.io/projected/991cca2c-022f-4c90-a1ba-287191fc2d49-kube-api-access-qf4gq\") pod \"ovn-operator-controller-manager-884679f54-2zn7v\" (UID: \"991cca2c-022f-4c90-a1ba-287191fc2d49\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-2zn7v" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.254241 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flqtz\" (UniqueName: \"kubernetes.io/projected/3c89c7dd-500b-4bd5-a30e-273c2a485728-kube-api-access-flqtz\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fjztv\" (UID: \"3c89c7dd-500b-4bd5-a30e-273c2a485728\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.254358 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fjztv\" (UID: \"3c89c7dd-500b-4bd5-a30e-273c2a485728\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.254427 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2bfv\" (UniqueName: \"kubernetes.io/projected/106eb66b-ca71-49b1-a80e-699f34ac9df9-kube-api-access-l2bfv\") pod 
\"placement-operator-controller-manager-5784578c99-9dv29\" (UID: \"106eb66b-ca71-49b1-a80e-699f34ac9df9\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-9dv29" Mar 20 13:42:04 crc kubenswrapper[4973]: E0320 13:42:04.255718 4973 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:04 crc kubenswrapper[4973]: E0320 13:42:04.255781 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert podName:3c89c7dd-500b-4bd5-a30e-273c2a485728 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:04.755764087 +0000 UTC m=+1245.499433831 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-fjztv" (UID: "3c89c7dd-500b-4bd5-a30e-273c2a485728") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.314478 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-9dv29"] Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.388766 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf4gq\" (UniqueName: \"kubernetes.io/projected/991cca2c-022f-4c90-a1ba-287191fc2d49-kube-api-access-qf4gq\") pod \"ovn-operator-controller-manager-884679f54-2zn7v\" (UID: \"991cca2c-022f-4c90-a1ba-287191fc2d49\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-2zn7v" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.392384 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flqtz\" (UniqueName: 
\"kubernetes.io/projected/3c89c7dd-500b-4bd5-a30e-273c2a485728-kube-api-access-flqtz\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fjztv\" (UID: \"3c89c7dd-500b-4bd5-a30e-273c2a485728\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.393501 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2bfv\" (UniqueName: \"kubernetes.io/projected/106eb66b-ca71-49b1-a80e-699f34ac9df9-kube-api-access-l2bfv\") pod \"placement-operator-controller-manager-5784578c99-9dv29\" (UID: \"106eb66b-ca71-49b1-a80e-699f34ac9df9\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-9dv29" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.421206 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2bfv\" (UniqueName: \"kubernetes.io/projected/106eb66b-ca71-49b1-a80e-699f34ac9df9-kube-api-access-l2bfv\") pod \"placement-operator-controller-manager-5784578c99-9dv29\" (UID: \"106eb66b-ca71-49b1-a80e-699f34ac9df9\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-9dv29" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.430471 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-6bzww"] Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.436575 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6bzww" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.440518 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-9vpll" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.441761 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-6bzww"] Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.450714 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-nh7dd"] Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.452265 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-nh7dd" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.454868 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-h6vzf" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.467719 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-nh7dd"] Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.489370 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9dv29" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.497167 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xlqnk"] Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.497709 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq2sh\" (UniqueName: \"kubernetes.io/projected/8b21705a-8662-4147-9a10-9a95982d961c-kube-api-access-vq2sh\") pod \"swift-operator-controller-manager-c674c5965-6bzww\" (UID: \"8b21705a-8662-4147-9a10-9a95982d961c\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-6bzww" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.499675 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xlqnk" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.505742 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-ks4mc" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.544771 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xlqnk"] Mar 20 13:42:04 crc kubenswrapper[4973]: W0320 13:42:04.556836 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddbb02721_66ce_44b6_bffe_59851197efa8.slice/crio-c6817c65d27cf270ae95a11da358aa50ddecc2e78ebc867997f2f5197fdc1e61 WatchSource:0}: Error finding container c6817c65d27cf270ae95a11da358aa50ddecc2e78ebc867997f2f5197fdc1e61: Status 404 returned error can't find the container with id c6817c65d27cf270ae95a11da358aa50ddecc2e78ebc867997f2f5197fdc1e61 Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.557243 4973 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2zn7v" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.600075 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qnmn\" (UniqueName: \"kubernetes.io/projected/ecf17bc8-3c8a-4791-a205-2bdc718ec15f-kube-api-access-6qnmn\") pod \"telemetry-operator-controller-manager-fbb6f4f4f-nh7dd\" (UID: \"ecf17bc8-3c8a-4791-a205-2bdc718ec15f\") " pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-nh7dd" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.600228 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq2sh\" (UniqueName: \"kubernetes.io/projected/8b21705a-8662-4147-9a10-9a95982d961c-kube-api-access-vq2sh\") pod \"swift-operator-controller-manager-c674c5965-6bzww\" (UID: \"8b21705a-8662-4147-9a10-9a95982d961c\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-6bzww" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.603886 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kzrpr"] Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.605428 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kzrpr" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.612489 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kzrpr"] Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.613511 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-tsxz2" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.623168 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f"] Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.624740 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.627732 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.627946 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.627957 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-whkmt" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.637242 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq2sh\" (UniqueName: \"kubernetes.io/projected/8b21705a-8662-4147-9a10-9a95982d961c-kube-api-access-vq2sh\") pod \"swift-operator-controller-manager-c674c5965-6bzww\" (UID: \"8b21705a-8662-4147-9a10-9a95982d961c\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-6bzww" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.646263 4973 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f"] Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.676276 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tgmwc"] Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.689330 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tgmwc" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.693444 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-76ccb" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.703877 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tgmwc"] Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.704933 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hr7c7\" (UniqueName: \"kubernetes.io/projected/88d5fd4b-c230-4b94-b988-0b79ec98d991-kube-api-access-hr7c7\") pod \"rabbitmq-cluster-operator-manager-668c99d594-tgmwc\" (UID: \"88d5fd4b-c230-4b94-b988-0b79ec98d991\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tgmwc" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.705010 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qnmn\" (UniqueName: \"kubernetes.io/projected/ecf17bc8-3c8a-4791-a205-2bdc718ec15f-kube-api-access-6qnmn\") pod \"telemetry-operator-controller-manager-fbb6f4f4f-nh7dd\" (UID: \"ecf17bc8-3c8a-4791-a205-2bdc718ec15f\") " pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-nh7dd" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.705059 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwqp4\" (UniqueName: \"kubernetes.io/projected/5a41f6b9-9f79-454a-af8a-c0ad746f1d42-kube-api-access-hwqp4\") pod \"watcher-operator-controller-manager-6c4d75f7f9-kzrpr\" (UID: \"5a41f6b9-9f79-454a-af8a-c0ad746f1d42\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kzrpr" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.705082 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c9zj\" (UniqueName: \"kubernetes.io/projected/aa34abe0-30d3-4d49-9f20-c15990a91a36-kube-api-access-9c9zj\") pod \"test-operator-controller-manager-5c5cb9c4d7-xlqnk\" (UID: \"aa34abe0-30d3-4d49-9f20-c15990a91a36\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xlqnk" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.741858 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qnmn\" (UniqueName: \"kubernetes.io/projected/ecf17bc8-3c8a-4791-a205-2bdc718ec15f-kube-api-access-6qnmn\") pod \"telemetry-operator-controller-manager-fbb6f4f4f-nh7dd\" (UID: \"ecf17bc8-3c8a-4791-a205-2bdc718ec15f\") " pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-nh7dd" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.755437 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-w7228"] Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.802446 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-pzl6t"] Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.804509 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-w7228" 
event={"ID":"9d106cd3-cadb-4cc7-b237-f05294c67dcd","Type":"ContainerStarted","Data":"74aea4594141cdbd9e286b7d3b5959893bfe63d5723abeef251ba9fedffac63d"} Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.806427 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwqp4\" (UniqueName: \"kubernetes.io/projected/5a41f6b9-9f79-454a-af8a-c0ad746f1d42-kube-api-access-hwqp4\") pod \"watcher-operator-controller-manager-6c4d75f7f9-kzrpr\" (UID: \"5a41f6b9-9f79-454a-af8a-c0ad746f1d42\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kzrpr" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.806463 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c9zj\" (UniqueName: \"kubernetes.io/projected/aa34abe0-30d3-4d49-9f20-c15990a91a36-kube-api-access-9c9zj\") pod \"test-operator-controller-manager-5c5cb9c4d7-xlqnk\" (UID: \"aa34abe0-30d3-4d49-9f20-c15990a91a36\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xlqnk" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.806492 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fjztv\" (UID: \"3c89c7dd-500b-4bd5-a30e-273c2a485728\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.806540 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ph8gb\" (UniqueName: \"kubernetes.io/projected/835537e8-dced-4516-a7b9-168d9bb6b687-kube-api-access-ph8gb\") pod \"openstack-operator-controller-manager-78865ff6b4-wlv2f\" (UID: \"835537e8-dced-4516-a7b9-168d9bb6b687\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" 
Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.806599 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hr7c7\" (UniqueName: \"kubernetes.io/projected/88d5fd4b-c230-4b94-b988-0b79ec98d991-kube-api-access-hr7c7\") pod \"rabbitmq-cluster-operator-manager-668c99d594-tgmwc\" (UID: \"88d5fd4b-c230-4b94-b988-0b79ec98d991\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tgmwc" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.806631 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-webhook-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-wlv2f\" (UID: \"835537e8-dced-4516-a7b9-168d9bb6b687\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.806682 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-metrics-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-wlv2f\" (UID: \"835537e8-dced-4516-a7b9-168d9bb6b687\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:04 crc kubenswrapper[4973]: E0320 13:42:04.807111 4973 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:04 crc kubenswrapper[4973]: E0320 13:42:04.807153 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert podName:3c89c7dd-500b-4bd5-a30e-273c2a485728 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:05.807138784 +0000 UTC m=+1246.550808528 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-fjztv" (UID: "3c89c7dd-500b-4bd5-a30e-273c2a485728") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.808147 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pzl6t" event={"ID":"dbb02721-66ce-44b6-bffe-59851197efa8","Type":"ContainerStarted","Data":"c6817c65d27cf270ae95a11da358aa50ddecc2e78ebc867997f2f5197fdc1e61"} Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.825161 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6bzww" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.829801 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwqp4\" (UniqueName: \"kubernetes.io/projected/5a41f6b9-9f79-454a-af8a-c0ad746f1d42-kube-api-access-hwqp4\") pod \"watcher-operator-controller-manager-6c4d75f7f9-kzrpr\" (UID: \"5a41f6b9-9f79-454a-af8a-c0ad746f1d42\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kzrpr" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.838302 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-nh7dd" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.842160 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hr7c7\" (UniqueName: \"kubernetes.io/projected/88d5fd4b-c230-4b94-b988-0b79ec98d991-kube-api-access-hr7c7\") pod \"rabbitmq-cluster-operator-manager-668c99d594-tgmwc\" (UID: \"88d5fd4b-c230-4b94-b988-0b79ec98d991\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tgmwc" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.847047 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c9zj\" (UniqueName: \"kubernetes.io/projected/aa34abe0-30d3-4d49-9f20-c15990a91a36-kube-api-access-9c9zj\") pod \"test-operator-controller-manager-5c5cb9c4d7-xlqnk\" (UID: \"aa34abe0-30d3-4d49-9f20-c15990a91a36\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xlqnk" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.861929 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xlqnk" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.907767 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-webhook-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-wlv2f\" (UID: \"835537e8-dced-4516-a7b9-168d9bb6b687\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.907849 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-metrics-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-wlv2f\" (UID: \"835537e8-dced-4516-a7b9-168d9bb6b687\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.907920 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ph8gb\" (UniqueName: \"kubernetes.io/projected/835537e8-dced-4516-a7b9-168d9bb6b687-kube-api-access-ph8gb\") pod \"openstack-operator-controller-manager-78865ff6b4-wlv2f\" (UID: \"835537e8-dced-4516-a7b9-168d9bb6b687\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:04 crc kubenswrapper[4973]: E0320 13:42:04.909021 4973 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:42:04 crc kubenswrapper[4973]: E0320 13:42:04.909051 4973 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:42:04 crc kubenswrapper[4973]: E0320 13:42:04.909431 4973 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-webhook-certs podName:835537e8-dced-4516-a7b9-168d9bb6b687 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:05.409413994 +0000 UTC m=+1246.153083738 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-webhook-certs") pod "openstack-operator-controller-manager-78865ff6b4-wlv2f" (UID: "835537e8-dced-4516-a7b9-168d9bb6b687") : secret "webhook-server-cert" not found Mar 20 13:42:04 crc kubenswrapper[4973]: E0320 13:42:04.909509 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-metrics-certs podName:835537e8-dced-4516-a7b9-168d9bb6b687 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:05.409486566 +0000 UTC m=+1246.153156390 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-metrics-certs") pod "openstack-operator-controller-manager-78865ff6b4-wlv2f" (UID: "835537e8-dced-4516-a7b9-168d9bb6b687") : secret "metrics-server-cert" not found Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.947804 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kzrpr" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.947937 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ph8gb\" (UniqueName: \"kubernetes.io/projected/835537e8-dced-4516-a7b9-168d9bb6b687-kube-api-access-ph8gb\") pod \"openstack-operator-controller-manager-78865ff6b4-wlv2f\" (UID: \"835537e8-dced-4516-a7b9-168d9bb6b687\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.947948 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-rm7q7"] Mar 20 13:42:04 crc kubenswrapper[4973]: W0320 13:42:04.960585 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7260cd47_ce83_44db_951d_757908bf5953.slice/crio-a2a05cd0954998d781fd1cd281841030fdf6e957046f136acce009fc72ad6e35 WatchSource:0}: Error finding container a2a05cd0954998d781fd1cd281841030fdf6e957046f136acce009fc72ad6e35: Status 404 returned error can't find the container with id a2a05cd0954998d781fd1cd281841030fdf6e957046f136acce009fc72ad6e35 Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.967745 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-z4dzx"] Mar 20 13:42:04 crc kubenswrapper[4973]: I0320 13:42:04.984001 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-2sflb"] Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.035094 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tgmwc" Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.116714 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5689d29-0b82-4482-9444-5da20e2da57a-cert\") pod \"infra-operator-controller-manager-7b9c774f96-27jg7\" (UID: \"c5689d29-0b82-4482-9444-5da20e2da57a\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7" Mar 20 13:42:05 crc kubenswrapper[4973]: E0320 13:42:05.116937 4973 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:05 crc kubenswrapper[4973]: E0320 13:42:05.117419 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5689d29-0b82-4482-9444-5da20e2da57a-cert podName:c5689d29-0b82-4482-9444-5da20e2da57a nodeName:}" failed. No retries permitted until 2026-03-20 13:42:07.117397892 +0000 UTC m=+1247.861067636 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c5689d29-0b82-4482-9444-5da20e2da57a-cert") pod "infra-operator-controller-manager-7b9c774f96-27jg7" (UID: "c5689d29-0b82-4482-9444-5da20e2da57a") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.225771 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-qxlrn"] Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.427363 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-metrics-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-wlv2f\" (UID: \"835537e8-dced-4516-a7b9-168d9bb6b687\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.427594 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-webhook-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-wlv2f\" (UID: \"835537e8-dced-4516-a7b9-168d9bb6b687\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:05 crc kubenswrapper[4973]: E0320 13:42:05.427594 4973 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:42:05 crc kubenswrapper[4973]: E0320 13:42:05.427768 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-metrics-certs podName:835537e8-dced-4516-a7b9-168d9bb6b687 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:06.427744911 +0000 UTC m=+1247.171414655 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-metrics-certs") pod "openstack-operator-controller-manager-78865ff6b4-wlv2f" (UID: "835537e8-dced-4516-a7b9-168d9bb6b687") : secret "metrics-server-cert" not found Mar 20 13:42:05 crc kubenswrapper[4973]: E0320 13:42:05.427648 4973 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:42:05 crc kubenswrapper[4973]: E0320 13:42:05.427867 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-webhook-certs podName:835537e8-dced-4516-a7b9-168d9bb6b687 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:06.427838924 +0000 UTC m=+1247.171508708 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-webhook-certs") pod "openstack-operator-controller-manager-78865ff6b4-wlv2f" (UID: "835537e8-dced-4516-a7b9-168d9bb6b687") : secret "webhook-server-cert" not found Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.765153 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-crjg2"] Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.781897 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-kzx4f"] Mar 20 13:42:05 crc kubenswrapper[4973]: W0320 13:42:05.786255 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod530d31a0_48a0_4d06_9b03_c9c205312bdc.slice/crio-84580fbfd58ac3ba46d4cb3cb2e9eb66317aa8865101fd1b793ff17ada329bf9 WatchSource:0}: Error finding container 84580fbfd58ac3ba46d4cb3cb2e9eb66317aa8865101fd1b793ff17ada329bf9: Status 404 returned error can't find the 
container with id 84580fbfd58ac3ba46d4cb3cb2e9eb66317aa8865101fd1b793ff17ada329bf9 Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.789431 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-gs46s"] Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.799702 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-zmqsw"] Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.816623 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-jn8rq"] Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.819334 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-z4dzx" event={"ID":"3233d229-1d2f-4c90-b76a-f27ca914f0ad","Type":"ContainerStarted","Data":"775403feb88036271bb6c9dd1343b7a09e5b4118e72d326375a21be72ebbbcb4"} Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.830209 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qxlrn" event={"ID":"13c558f8-2e66-49c3-b184-7fdbbf4ff6b1","Type":"ContainerStarted","Data":"0ec4375dfefa6abf71ba0f7361836530be8258fb59f140f26e4354f50a1c64af"} Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.831906 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gs46s" event={"ID":"530d31a0-48a0-4d06-9b03-c9c205312bdc","Type":"ContainerStarted","Data":"84580fbfd58ac3ba46d4cb3cb2e9eb66317aa8865101fd1b793ff17ada329bf9"} Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.834216 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert\") pod 
\"openstack-baremetal-operator-controller-manager-89d64c458-fjztv\" (UID: \"3c89c7dd-500b-4bd5-a30e-273c2a485728\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" Mar 20 13:42:05 crc kubenswrapper[4973]: E0320 13:42:05.834512 4973 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.834585 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-crjg2" event={"ID":"97aab498-21c1-476f-a64b-a526745fc64a","Type":"ContainerStarted","Data":"99bde8409d3e8f4c3b3117b1590f869440996f59503e571a7543dd4319f6eb61"} Mar 20 13:42:05 crc kubenswrapper[4973]: E0320 13:42:05.834720 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert podName:3c89c7dd-500b-4bd5-a30e-273c2a485728 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:07.834698114 +0000 UTC m=+1248.578367878 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-fjztv" (UID: "3c89c7dd-500b-4bd5-a30e-273c2a485728") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.835977 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-rm7q7" event={"ID":"7260cd47-ce83-44db-951d-757908bf5953","Type":"ContainerStarted","Data":"a2a05cd0954998d781fd1cd281841030fdf6e957046f136acce009fc72ad6e35"} Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.840814 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566902-mjw5k" event={"ID":"4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d","Type":"ContainerDied","Data":"2b74767c7d487c97fc96246ba2594ab7655d525354e2d30d5389abbbf3385067"} Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.840921 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b74767c7d487c97fc96246ba2594ab7655d525354e2d30d5389abbbf3385067" Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.864645 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zmqsw" event={"ID":"84e8fd4a-d562-4e92-adfc-479867cf9d3a","Type":"ContainerStarted","Data":"acf31eecee195ee328d1d88dce4e7c7a5bf8c93e10a7af6a7b0ff9839e1e099e"} Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.866764 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2sflb" event={"ID":"96f5f9d5-bb8c-497c-bfbb-8fd46342ce69","Type":"ContainerStarted","Data":"9509a235abfb28a20f88154de2e9841206bcc720563366b4bbb079f4619a0d5d"} Mar 20 13:42:05 crc kubenswrapper[4973]: I0320 13:42:05.892107 4973 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-mjw5k" Mar 20 13:42:06 crc kubenswrapper[4973]: I0320 13:42:06.045489 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bx97t\" (UniqueName: \"kubernetes.io/projected/4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d-kube-api-access-bx97t\") pod \"4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d\" (UID: \"4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d\") " Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.043153 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-webhook-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-wlv2f\" (UID: \"835537e8-dced-4516-a7b9-168d9bb6b687\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.043652 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-metrics-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-wlv2f\" (UID: \"835537e8-dced-4516-a7b9-168d9bb6b687\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:07 crc kubenswrapper[4973]: E0320 13:42:07.043817 4973 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:42:07 crc kubenswrapper[4973]: E0320 13:42:07.043873 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-metrics-certs podName:835537e8-dced-4516-a7b9-168d9bb6b687 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:09.043855425 +0000 UTC m=+1249.787525169 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-metrics-certs") pod "openstack-operator-controller-manager-78865ff6b4-wlv2f" (UID: "835537e8-dced-4516-a7b9-168d9bb6b687") : secret "metrics-server-cert" not found Mar 20 13:42:07 crc kubenswrapper[4973]: E0320 13:42:07.045171 4973 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:42:07 crc kubenswrapper[4973]: E0320 13:42:07.045213 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-webhook-certs podName:835537e8-dced-4516-a7b9-168d9bb6b687 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:09.045199709 +0000 UTC m=+1249.788869453 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-webhook-certs") pod "openstack-operator-controller-manager-78865ff6b4-wlv2f" (UID: "835537e8-dced-4516-a7b9-168d9bb6b687") : secret "webhook-server-cert" not found Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.138409 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d-kube-api-access-bx97t" (OuterVolumeSpecName: "kube-api-access-bx97t") pod "4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d" (UID: "4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d"). InnerVolumeSpecName "kube-api-access-bx97t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.156734 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5689d29-0b82-4482-9444-5da20e2da57a-cert\") pod \"infra-operator-controller-manager-7b9c774f96-27jg7\" (UID: \"c5689d29-0b82-4482-9444-5da20e2da57a\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7" Mar 20 13:42:07 crc kubenswrapper[4973]: E0320 13:42:07.158152 4973 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:07 crc kubenswrapper[4973]: E0320 13:42:07.158258 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5689d29-0b82-4482-9444-5da20e2da57a-cert podName:c5689d29-0b82-4482-9444-5da20e2da57a nodeName:}" failed. No retries permitted until 2026-03-20 13:42:11.158234648 +0000 UTC m=+1251.901904392 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c5689d29-0b82-4482-9444-5da20e2da57a-cert") pod "infra-operator-controller-manager-7b9c774f96-27jg7" (UID: "c5689d29-0b82-4482-9444-5da20e2da57a") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.158529 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bx97t\" (UniqueName: \"kubernetes.io/projected/4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d-kube-api-access-bx97t\") on node \"crc\" DevicePath \"\"" Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.163855 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-kzx4f" event={"ID":"45fa2923-6b6f-44da-8693-6ee06c476a8f","Type":"ContainerStarted","Data":"2caff50dcf7fe2a76bd0e338f0185a72f1c1b6d6ec5f35ae9f4aeef02f79f72c"} Mar 20 13:42:07 crc kubenswrapper[4973]: W0320 13:42:07.192039 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod991cca2c_022f_4c90_a1ba_287191fc2d49.slice/crio-eddb2bd56daa9c345b0d51970dffe5374b6c5943d52ab02d38a07907fc98aecb WatchSource:0}: Error finding container eddb2bd56daa9c345b0d51970dffe5374b6c5943d52ab02d38a07907fc98aecb: Status 404 returned error can't find the container with id eddb2bd56daa9c345b0d51970dffe5374b6c5943d52ab02d38a07907fc98aecb Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.206273 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566902-mjw5k" Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.206282 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-jn8rq" event={"ID":"c0abcba1-e57c-4d90-a8cb-61989da15e87","Type":"ContainerStarted","Data":"472bf9a083ab661fd85ef9ef53d2bb6677a85021a97554e51eae60a0f7adf891"} Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.271653 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-2zn7v"] Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.281308 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-zmpzm"] Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.288392 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-x4r2t"] Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.296538 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-p6vx2"] Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.302939 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566896-zmpzm"] Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.311820 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-9dv29"] Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.323098 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-nh7dd"] Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.333690 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xlqnk"] Mar 20 13:42:07 crc kubenswrapper[4973]: 
W0320 13:42:07.340009 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod88d5fd4b_c230_4b94_b988_0b79ec98d991.slice/crio-c77a91fc2eb21c2db409fad45fc13de1452f3889fcfd4d429c7a1bf0f7289162 WatchSource:0}: Error finding container c77a91fc2eb21c2db409fad45fc13de1452f3889fcfd4d429c7a1bf0f7289162: Status 404 returned error can't find the container with id c77a91fc2eb21c2db409fad45fc13de1452f3889fcfd4d429c7a1bf0f7289162 Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.345570 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tgmwc"] Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.357400 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-6bzww"] Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.369075 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kzrpr"] Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.882055 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fjztv\" (UID: \"3c89c7dd-500b-4bd5-a30e-273c2a485728\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" Mar 20 13:42:07 crc kubenswrapper[4973]: E0320 13:42:07.882254 4973 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:07 crc kubenswrapper[4973]: E0320 13:42:07.882651 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert 
podName:3c89c7dd-500b-4bd5-a30e-273c2a485728 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:11.882611615 +0000 UTC m=+1252.626281359 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-fjztv" (UID: "3c89c7dd-500b-4bd5-a30e-273c2a485728") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:07 crc kubenswrapper[4973]: I0320 13:42:07.970914 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="578dfffc-6f28-4ee3-9635-5dd5efe7be69" path="/var/lib/kubelet/pods/578dfffc-6f28-4ee3-9635-5dd5efe7be69/volumes" Mar 20 13:42:08 crc kubenswrapper[4973]: I0320 13:42:08.230961 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p6vx2" event={"ID":"d5a271f2-b17d-487d-a61b-00bd17841392","Type":"ContainerStarted","Data":"047019e124e3f617d0ad169732e533725b21593dffbc005cf1a5e7807ed29afa"} Mar 20 13:42:08 crc kubenswrapper[4973]: I0320 13:42:08.241113 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tgmwc" event={"ID":"88d5fd4b-c230-4b94-b988-0b79ec98d991","Type":"ContainerStarted","Data":"c77a91fc2eb21c2db409fad45fc13de1452f3889fcfd4d429c7a1bf0f7289162"} Mar 20 13:42:08 crc kubenswrapper[4973]: I0320 13:42:08.247462 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2zn7v" event={"ID":"991cca2c-022f-4c90-a1ba-287191fc2d49","Type":"ContainerStarted","Data":"eddb2bd56daa9c345b0d51970dffe5374b6c5943d52ab02d38a07907fc98aecb"} Mar 20 13:42:08 crc kubenswrapper[4973]: I0320 13:42:08.258163 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6bzww" 
event={"ID":"8b21705a-8662-4147-9a10-9a95982d961c","Type":"ContainerStarted","Data":"77fc243ff34387dd19ed2a48e16b8734cc4b109808521eff08d0689ebddf2a6f"} Mar 20 13:42:08 crc kubenswrapper[4973]: I0320 13:42:08.260294 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-nh7dd" event={"ID":"ecf17bc8-3c8a-4791-a205-2bdc718ec15f","Type":"ContainerStarted","Data":"339cc51cc97579c4ddd847e37df937d481f60a8ef7397fd889cc91efae2380d7"} Mar 20 13:42:08 crc kubenswrapper[4973]: I0320 13:42:08.262164 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xlqnk" event={"ID":"aa34abe0-30d3-4d49-9f20-c15990a91a36","Type":"ContainerStarted","Data":"df528ae4361d77b2c8a176ea995c8c1230ad2cd2f1aec98d8f168970b996eadc"} Mar 20 13:42:08 crc kubenswrapper[4973]: I0320 13:42:08.264659 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9dv29" event={"ID":"106eb66b-ca71-49b1-a80e-699f34ac9df9","Type":"ContainerStarted","Data":"4527a6fa0547a13598117ae81c9e1782a05554bc7ed2b6c4c21e42c8f9892ffc"} Mar 20 13:42:08 crc kubenswrapper[4973]: I0320 13:42:08.266372 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-x4r2t" event={"ID":"39227253-9885-4ba2-a216-c04066dc7c84","Type":"ContainerStarted","Data":"f8f2506b0787efb13b6e35057e4aa5fc041e87c05667c8cd37bf655ddc078947"} Mar 20 13:42:08 crc kubenswrapper[4973]: I0320 13:42:08.270190 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kzrpr" event={"ID":"5a41f6b9-9f79-454a-af8a-c0ad746f1d42","Type":"ContainerStarted","Data":"43fff2f927f61c22155474f3d438442bb6e87b8b7e33b1dcde65be05929640eb"} Mar 20 13:42:09 crc kubenswrapper[4973]: I0320 13:42:09.107317 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-metrics-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-wlv2f\" (UID: \"835537e8-dced-4516-a7b9-168d9bb6b687\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:09 crc kubenswrapper[4973]: E0320 13:42:09.107558 4973 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:42:09 crc kubenswrapper[4973]: I0320 13:42:09.107806 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-webhook-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-wlv2f\" (UID: \"835537e8-dced-4516-a7b9-168d9bb6b687\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:09 crc kubenswrapper[4973]: E0320 13:42:09.107851 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-metrics-certs podName:835537e8-dced-4516-a7b9-168d9bb6b687 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:13.107828863 +0000 UTC m=+1253.851498657 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-metrics-certs") pod "openstack-operator-controller-manager-78865ff6b4-wlv2f" (UID: "835537e8-dced-4516-a7b9-168d9bb6b687") : secret "metrics-server-cert" not found Mar 20 13:42:09 crc kubenswrapper[4973]: E0320 13:42:09.107948 4973 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:42:09 crc kubenswrapper[4973]: E0320 13:42:09.108006 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-webhook-certs podName:835537e8-dced-4516-a7b9-168d9bb6b687 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:13.107989217 +0000 UTC m=+1253.851658961 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-webhook-certs") pod "openstack-operator-controller-manager-78865ff6b4-wlv2f" (UID: "835537e8-dced-4516-a7b9-168d9bb6b687") : secret "webhook-server-cert" not found Mar 20 13:42:11 crc kubenswrapper[4973]: I0320 13:42:11.247729 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5689d29-0b82-4482-9444-5da20e2da57a-cert\") pod \"infra-operator-controller-manager-7b9c774f96-27jg7\" (UID: \"c5689d29-0b82-4482-9444-5da20e2da57a\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7" Mar 20 13:42:11 crc kubenswrapper[4973]: E0320 13:42:11.248087 4973 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:11 crc kubenswrapper[4973]: E0320 13:42:11.248165 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5689d29-0b82-4482-9444-5da20e2da57a-cert 
podName:c5689d29-0b82-4482-9444-5da20e2da57a nodeName:}" failed. No retries permitted until 2026-03-20 13:42:19.248146136 +0000 UTC m=+1259.991815900 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c5689d29-0b82-4482-9444-5da20e2da57a-cert") pod "infra-operator-controller-manager-7b9c774f96-27jg7" (UID: "c5689d29-0b82-4482-9444-5da20e2da57a") : secret "infra-operator-webhook-server-cert" not found Mar 20 13:42:11 crc kubenswrapper[4973]: I0320 13:42:11.970227 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fjztv\" (UID: \"3c89c7dd-500b-4bd5-a30e-273c2a485728\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" Mar 20 13:42:11 crc kubenswrapper[4973]: E0320 13:42:11.970730 4973 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:11 crc kubenswrapper[4973]: E0320 13:42:11.970839 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert podName:3c89c7dd-500b-4bd5-a30e-273c2a485728 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:19.970815127 +0000 UTC m=+1260.714484871 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-fjztv" (UID: "3c89c7dd-500b-4bd5-a30e-273c2a485728") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:13 crc kubenswrapper[4973]: I0320 13:42:13.585767 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-webhook-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-wlv2f\" (UID: \"835537e8-dced-4516-a7b9-168d9bb6b687\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:13 crc kubenswrapper[4973]: I0320 13:42:13.585868 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-metrics-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-wlv2f\" (UID: \"835537e8-dced-4516-a7b9-168d9bb6b687\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:13 crc kubenswrapper[4973]: E0320 13:42:13.585866 4973 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 13:42:13 crc kubenswrapper[4973]: E0320 13:42:13.585905 4973 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 13:42:13 crc kubenswrapper[4973]: E0320 13:42:13.586056 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-webhook-certs podName:835537e8-dced-4516-a7b9-168d9bb6b687 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:21.586034347 +0000 UTC m=+1262.329704091 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-webhook-certs") pod "openstack-operator-controller-manager-78865ff6b4-wlv2f" (UID: "835537e8-dced-4516-a7b9-168d9bb6b687") : secret "webhook-server-cert" not found Mar 20 13:42:13 crc kubenswrapper[4973]: E0320 13:42:13.586081 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-metrics-certs podName:835537e8-dced-4516-a7b9-168d9bb6b687 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:21.586070988 +0000 UTC m=+1262.329740732 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-metrics-certs") pod "openstack-operator-controller-manager-78865ff6b4-wlv2f" (UID: "835537e8-dced-4516-a7b9-168d9bb6b687") : secret "metrics-server-cert" not found Mar 20 13:42:18 crc kubenswrapper[4973]: E0320 13:42:18.207040 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d" Mar 20 13:42:18 crc kubenswrapper[4973]: E0320 13:42:18.208819 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kbq4j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-79df6bcc97-rm7q7_openstack-operators(7260cd47-ce83-44db-951d-757908bf5953): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:18 crc kubenswrapper[4973]: E0320 13:42:18.210307 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-rm7q7" podUID="7260cd47-ce83-44db-951d-757908bf5953" Mar 20 13:42:18 crc kubenswrapper[4973]: E0320 13:42:18.646667 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d\\\"\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-rm7q7" podUID="7260cd47-ce83-44db-951d-757908bf5953" Mar 20 13:42:18 crc kubenswrapper[4973]: E0320 13:42:18.662765 4973 log.go:32] "PullImage from image service failed" 
err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56" Mar 20 13:42:18 crc kubenswrapper[4973]: E0320 13:42:18.663201 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wxttv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-jn8rq_openstack-operators(c0abcba1-e57c-4d90-a8cb-61989da15e87): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:18 crc kubenswrapper[4973]: E0320 13:42:18.664551 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-jn8rq" podUID="c0abcba1-e57c-4d90-a8cb-61989da15e87" Mar 20 13:42:19 crc kubenswrapper[4973]: I0320 13:42:19.294325 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c5689d29-0b82-4482-9444-5da20e2da57a-cert\") pod \"infra-operator-controller-manager-7b9c774f96-27jg7\" (UID: \"c5689d29-0b82-4482-9444-5da20e2da57a\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7" Mar 20 13:42:19 crc kubenswrapper[4973]: I0320 13:42:19.301853 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/c5689d29-0b82-4482-9444-5da20e2da57a-cert\") pod \"infra-operator-controller-manager-7b9c774f96-27jg7\" (UID: \"c5689d29-0b82-4482-9444-5da20e2da57a\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7" Mar 20 13:42:19 crc kubenswrapper[4973]: I0320 13:42:19.509442 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7" Mar 20 13:42:19 crc kubenswrapper[4973]: E0320 13:42:19.650603 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-jn8rq" podUID="c0abcba1-e57c-4d90-a8cb-61989da15e87" Mar 20 13:42:20 crc kubenswrapper[4973]: I0320 13:42:20.006629 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fjztv\" (UID: \"3c89c7dd-500b-4bd5-a30e-273c2a485728\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" Mar 20 13:42:20 crc kubenswrapper[4973]: E0320 13:42:20.006829 4973 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:20 crc kubenswrapper[4973]: E0320 13:42:20.006908 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert podName:3c89c7dd-500b-4bd5-a30e-273c2a485728 nodeName:}" failed. No retries permitted until 2026-03-20 13:42:36.006884694 +0000 UTC m=+1276.750554438 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-fjztv" (UID: "3c89c7dd-500b-4bd5-a30e-273c2a485728") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 13:42:20 crc kubenswrapper[4973]: E0320 13:42:20.216112 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55" Mar 20 13:42:20 crc kubenswrapper[4973]: E0320 13:42:20.216304 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qf4gq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-2zn7v_openstack-operators(991cca2c-022f-4c90-a1ba-287191fc2d49): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:20 crc kubenswrapper[4973]: E0320 13:42:20.218056 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2zn7v" podUID="991cca2c-022f-4c90-a1ba-287191fc2d49" Mar 20 13:42:20 crc kubenswrapper[4973]: E0320 13:42:20.657266 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2zn7v" podUID="991cca2c-022f-4c90-a1ba-287191fc2d49" Mar 20 13:42:20 crc kubenswrapper[4973]: E0320 13:42:20.687897 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113" Mar 20 13:42:20 crc kubenswrapper[4973]: E0320 13:42:20.688124 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-82zxv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-8464cc45fb-zmqsw_openstack-operators(84e8fd4a-d562-4e92-adfc-479867cf9d3a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:20 crc kubenswrapper[4973]: E0320 13:42:20.689358 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zmqsw" podUID="84e8fd4a-d562-4e92-adfc-479867cf9d3a" Mar 20 13:42:21 crc kubenswrapper[4973]: E0320 13:42:21.265868 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a" Mar 20 13:42:21 crc kubenswrapper[4973]: E0320 13:42:21.266372 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-shhzf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5b9f45d989-x4r2t_openstack-operators(39227253-9885-4ba2-a216-c04066dc7c84): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:21 crc kubenswrapper[4973]: E0320 13:42:21.268272 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-x4r2t" podUID="39227253-9885-4ba2-a216-c04066dc7c84" Mar 20 13:42:21 crc kubenswrapper[4973]: I0320 13:42:21.638481 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-webhook-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-wlv2f\" (UID: \"835537e8-dced-4516-a7b9-168d9bb6b687\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:21 crc kubenswrapper[4973]: I0320 13:42:21.638575 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-metrics-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-wlv2f\" (UID: \"835537e8-dced-4516-a7b9-168d9bb6b687\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:21 crc kubenswrapper[4973]: I0320 13:42:21.645507 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-metrics-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-wlv2f\" (UID: \"835537e8-dced-4516-a7b9-168d9bb6b687\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:21 crc kubenswrapper[4973]: I0320 13:42:21.645903 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/835537e8-dced-4516-a7b9-168d9bb6b687-webhook-certs\") pod \"openstack-operator-controller-manager-78865ff6b4-wlv2f\" (UID: \"835537e8-dced-4516-a7b9-168d9bb6b687\") " pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:21 crc kubenswrapper[4973]: E0320 13:42:21.665402 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zmqsw" podUID="84e8fd4a-d562-4e92-adfc-479867cf9d3a" Mar 20 13:42:21 crc kubenswrapper[4973]: E0320 13:42:21.665785 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-x4r2t" podUID="39227253-9885-4ba2-a216-c04066dc7c84" Mar 20 13:42:21 crc kubenswrapper[4973]: I0320 13:42:21.807682 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-whkmt" Mar 20 13:42:21 crc kubenswrapper[4973]: I0320 13:42:21.815734 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:22 crc kubenswrapper[4973]: E0320 13:42:22.811128 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a" Mar 20 13:42:22 crc kubenswrapper[4973]: E0320 13:42:22.812289 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-tsttx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-767865f676-kzx4f_openstack-operators(45fa2923-6b6f-44da-8693-6ee06c476a8f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:22 crc kubenswrapper[4973]: E0320 13:42:22.813619 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/neutron-operator-controller-manager-767865f676-kzx4f" podUID="45fa2923-6b6f-44da-8693-6ee06c476a8f" Mar 20 13:42:23 crc kubenswrapper[4973]: E0320 13:42:23.438862 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8" Mar 20 13:42:23 crc kubenswrapper[4973]: E0320 13:42:23.439069 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ndrrp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-6f787dddc9-qxlrn_openstack-operators(13c558f8-2e66-49c3-b184-7fdbbf4ff6b1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:23 crc kubenswrapper[4973]: E0320 13:42:23.440355 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qxlrn" podUID="13c558f8-2e66-49c3-b184-7fdbbf4ff6b1" Mar 20 13:42:23 crc kubenswrapper[4973]: E0320 13:42:23.684438 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:9dd26bc51e7757d84736528d4988a1f980ad50ccb070aef6fc252e32c5c423a8\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qxlrn" podUID="13c558f8-2e66-49c3-b184-7fdbbf4ff6b1" Mar 20 13:42:23 crc kubenswrapper[4973]: E0320 13:42:23.684531 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-kzx4f" podUID="45fa2923-6b6f-44da-8693-6ee06c476a8f" Mar 20 13:42:23 crc kubenswrapper[4973]: E0320 13:42:23.972945 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42" Mar 20 13:42:23 crc kubenswrapper[4973]: E0320 13:42:23.973150 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9c9zj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-xlqnk_openstack-operators(aa34abe0-30d3-4d49-9f20-c15990a91a36): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:23 crc kubenswrapper[4973]: E0320 13:42:23.974379 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xlqnk" podUID="aa34abe0-30d3-4d49-9f20-c15990a91a36" Mar 20 13:42:24 crc kubenswrapper[4973]: E0320 13:42:24.449228 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622" Mar 20 13:42:24 crc kubenswrapper[4973]: E0320 13:42:24.449758 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-l2bfv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-9dv29_openstack-operators(106eb66b-ca71-49b1-a80e-699f34ac9df9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:24 crc kubenswrapper[4973]: E0320 13:42:24.451271 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9dv29" podUID="106eb66b-ca71-49b1-a80e-699f34ac9df9" Mar 20 13:42:24 crc kubenswrapper[4973]: E0320 13:42:24.692245 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9dv29" podUID="106eb66b-ca71-49b1-a80e-699f34ac9df9" Mar 20 13:42:24 crc kubenswrapper[4973]: E0320 13:42:24.692444 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xlqnk" podUID="aa34abe0-30d3-4d49-9f20-c15990a91a36" Mar 20 13:42:24 crc kubenswrapper[4973]: E0320 13:42:24.986251 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e" Mar 20 13:42:24 crc kubenswrapper[4973]: E0320 13:42:24.986470 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-vq2sh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-6bzww_openstack-operators(8b21705a-8662-4147-9a10-9a95982d961c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:24 crc kubenswrapper[4973]: E0320 13:42:24.987676 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/swift-operator-controller-manager-c674c5965-6bzww" podUID="8b21705a-8662-4147-9a10-9a95982d961c" Mar 20 13:42:25 crc kubenswrapper[4973]: E0320 13:42:25.698578 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6bzww" podUID="8b21705a-8662-4147-9a10-9a95982d961c" Mar 20 13:42:27 crc kubenswrapper[4973]: E0320 13:42:27.928884 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807" Mar 20 13:42:27 crc kubenswrapper[4973]: E0320 13:42:27.929061 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hwqp4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-kzrpr_openstack-operators(5a41f6b9-9f79-454a-af8a-c0ad746f1d42): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:27 crc kubenswrapper[4973]: E0320 13:42:27.930257 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kzrpr" podUID="5a41f6b9-9f79-454a-af8a-c0ad746f1d42" Mar 20 13:42:28 crc kubenswrapper[4973]: E0320 13:42:28.718841 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kzrpr" podUID="5a41f6b9-9f79-454a-af8a-c0ad746f1d42" Mar 20 13:42:29 crc kubenswrapper[4973]: E0320 13:42:29.803535 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 20 13:42:29 crc kubenswrapper[4973]: E0320 13:42:29.804021 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sjjm9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-p6vx2_openstack-operators(d5a271f2-b17d-487d-a61b-00bd17841392): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:29 crc kubenswrapper[4973]: E0320 13:42:29.805233 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p6vx2" podUID="d5a271f2-b17d-487d-a61b-00bd17841392" Mar 20 13:42:30 crc kubenswrapper[4973]: E0320 13:42:30.300379 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900" Mar 20 13:42:30 crc kubenswrapper[4973]: E0320 13:42:30.300565 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5x569,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-67dd5f86f5-z4dzx_openstack-operators(3233d229-1d2f-4c90-b76a-f27ca914f0ad): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:30 crc kubenswrapper[4973]: E0320 13:42:30.302020 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-z4dzx" podUID="3233d229-1d2f-4c90-b76a-f27ca914f0ad" Mar 20 13:42:30 crc kubenswrapper[4973]: E0320 13:42:30.738990 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p6vx2" podUID="d5a271f2-b17d-487d-a61b-00bd17841392" Mar 20 13:42:30 crc kubenswrapper[4973]: E0320 13:42:30.739096 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900\\\"\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-z4dzx" podUID="3233d229-1d2f-4c90-b76a-f27ca914f0ad" Mar 20 13:42:30 crc kubenswrapper[4973]: E0320 13:42:30.855545 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1" Mar 20 13:42:30 crc kubenswrapper[4973]: E0320 13:42:30.855740 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4fxdm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67ccfc9778-crjg2_openstack-operators(97aab498-21c1-476f-a64b-a526745fc64a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:30 crc kubenswrapper[4973]: E0320 13:42:30.857485 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-crjg2" podUID="97aab498-21c1-476f-a64b-a526745fc64a" Mar 20 13:42:30 crc kubenswrapper[4973]: E0320 13:42:30.910432 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.115:5001/openstack-k8s-operators/telemetry-operator:64eb99221d5a8d2494c3622abbc61f411be16a05" Mar 20 13:42:30 crc kubenswrapper[4973]: E0320 13:42:30.910528 4973 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="38.102.83.115:5001/openstack-k8s-operators/telemetry-operator:64eb99221d5a8d2494c3622abbc61f411be16a05" Mar 20 13:42:30 crc kubenswrapper[4973]: E0320 13:42:30.910687 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.115:5001/openstack-k8s-operators/telemetry-operator:64eb99221d5a8d2494c3622abbc61f411be16a05,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6qnmn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-fbb6f4f4f-nh7dd_openstack-operators(ecf17bc8-3c8a-4791-a205-2bdc718ec15f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:30 crc kubenswrapper[4973]: E0320 13:42:30.911871 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-nh7dd" podUID="ecf17bc8-3c8a-4791-a205-2bdc718ec15f" Mar 20 13:42:31 crc kubenswrapper[4973]: E0320 13:42:31.504731 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 20 13:42:31 crc kubenswrapper[4973]: E0320 13:42:31.505443 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-hr7c7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-tgmwc_openstack-operators(88d5fd4b-c230-4b94-b988-0b79ec98d991): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:42:31 crc kubenswrapper[4973]: E0320 13:42:31.511565 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tgmwc" podUID="88d5fd4b-c230-4b94-b988-0b79ec98d991" Mar 20 13:42:32 crc kubenswrapper[4973]: E0320 13:42:31.762535 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tgmwc" podUID="88d5fd4b-c230-4b94-b988-0b79ec98d991" Mar 20 13:42:32 crc 
kubenswrapper[4973]: E0320 13:42:31.762998 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-crjg2" podUID="97aab498-21c1-476f-a64b-a526745fc64a" Mar 20 13:42:32 crc kubenswrapper[4973]: E0320 13:42:31.763089 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.115:5001/openstack-k8s-operators/telemetry-operator:64eb99221d5a8d2494c3622abbc61f411be16a05\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-nh7dd" podUID="ecf17bc8-3c8a-4791-a205-2bdc718ec15f" Mar 20 13:42:32 crc kubenswrapper[4973]: I0320 13:42:32.769594 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-w7228" event={"ID":"9d106cd3-cadb-4cc7-b237-f05294c67dcd","Type":"ContainerStarted","Data":"d15650ff90940373c55cda7b4d5ef14360dd8762e23c99582b32c23d89bf6aed"} Mar 20 13:42:32 crc kubenswrapper[4973]: I0320 13:42:32.770512 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-w7228" Mar 20 13:42:32 crc kubenswrapper[4973]: I0320 13:42:32.772389 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pzl6t" event={"ID":"dbb02721-66ce-44b6-bffe-59851197efa8","Type":"ContainerStarted","Data":"0111d5d1cf00e3e46ec97b9f8664c131d4161b2e129484b3560bb44458842393"} Mar 20 13:42:32 crc kubenswrapper[4973]: I0320 13:42:32.773002 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pzl6t" Mar 20 13:42:32 crc kubenswrapper[4973]: I0320 13:42:32.776603 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gs46s" event={"ID":"530d31a0-48a0-4d06-9b03-c9c205312bdc","Type":"ContainerStarted","Data":"428d92b2cba30f88ba78dcbb0eda9c73a23388fa3b9673a8c548fad1ecddfd97"} Mar 20 13:42:32 crc kubenswrapper[4973]: I0320 13:42:32.777754 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gs46s" Mar 20 13:42:32 crc kubenswrapper[4973]: I0320 13:42:32.780504 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-rm7q7" event={"ID":"7260cd47-ce83-44db-951d-757908bf5953","Type":"ContainerStarted","Data":"3ddbe1f860241747ffe2bcb119db96fce26c5d9ed87277f74369caa994895aa5"} Mar 20 13:42:32 crc kubenswrapper[4973]: I0320 13:42:32.781165 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-rm7q7" Mar 20 13:42:32 crc kubenswrapper[4973]: I0320 13:42:32.785100 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2sflb" event={"ID":"96f5f9d5-bb8c-497c-bfbb-8fd46342ce69","Type":"ContainerStarted","Data":"5aec6d45056afe52d3b7ce9621c75b68b8b10e62423f978764b6b8bea0ec2ee4"} Mar 20 13:42:32 crc kubenswrapper[4973]: I0320 13:42:32.786107 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2sflb" Mar 20 13:42:32 crc kubenswrapper[4973]: I0320 13:42:32.810827 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-w7228" 
podStartSLOduration=4.401691387 podStartE2EDuration="30.810791697s" podCreationTimestamp="2026-03-20 13:42:02 +0000 UTC" firstStartedPulling="2026-03-20 13:42:04.491043515 +0000 UTC m=+1245.234713259" lastFinishedPulling="2026-03-20 13:42:30.900143825 +0000 UTC m=+1271.643813569" observedRunningTime="2026-03-20 13:42:32.797769848 +0000 UTC m=+1273.541439612" watchObservedRunningTime="2026-03-20 13:42:32.810791697 +0000 UTC m=+1273.554461441" Mar 20 13:42:32 crc kubenswrapper[4973]: I0320 13:42:32.833913 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gs46s" podStartSLOduration=4.132974722 podStartE2EDuration="29.833884757s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:05.792468566 +0000 UTC m=+1246.536138310" lastFinishedPulling="2026-03-20 13:42:31.493378601 +0000 UTC m=+1272.237048345" observedRunningTime="2026-03-20 13:42:32.824779771 +0000 UTC m=+1273.568449515" watchObservedRunningTime="2026-03-20 13:42:32.833884757 +0000 UTC m=+1273.577554501" Mar 20 13:42:32 crc kubenswrapper[4973]: I0320 13:42:32.879784 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-rm7q7" podStartSLOduration=3.18833163 podStartE2EDuration="29.87976317s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:04.963227954 +0000 UTC m=+1245.706897698" lastFinishedPulling="2026-03-20 13:42:31.654659484 +0000 UTC m=+1272.398329238" observedRunningTime="2026-03-20 13:42:32.869890663 +0000 UTC m=+1273.613560427" watchObservedRunningTime="2026-03-20 13:42:32.87976317 +0000 UTC m=+1273.623432914" Mar 20 13:42:32 crc kubenswrapper[4973]: I0320 13:42:32.880155 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pzl6t" 
podStartSLOduration=2.996251864 podStartE2EDuration="29.88014888s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:04.61120639 +0000 UTC m=+1245.354876134" lastFinishedPulling="2026-03-20 13:42:31.495103406 +0000 UTC m=+1272.238773150" observedRunningTime="2026-03-20 13:42:32.849582415 +0000 UTC m=+1273.593252159" watchObservedRunningTime="2026-03-20 13:42:32.88014888 +0000 UTC m=+1273.623818624" Mar 20 13:42:32 crc kubenswrapper[4973]: I0320 13:42:32.904928 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2sflb" podStartSLOduration=3.386027739 podStartE2EDuration="29.904902924s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:04.982002501 +0000 UTC m=+1245.725672245" lastFinishedPulling="2026-03-20 13:42:31.500877686 +0000 UTC m=+1272.244547430" observedRunningTime="2026-03-20 13:42:32.885046938 +0000 UTC m=+1273.628716692" watchObservedRunningTime="2026-03-20 13:42:32.904902924 +0000 UTC m=+1273.648572668" Mar 20 13:42:33 crc kubenswrapper[4973]: I0320 13:42:33.026107 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f"] Mar 20 13:42:33 crc kubenswrapper[4973]: I0320 13:42:33.046100 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7"] Mar 20 13:42:33 crc kubenswrapper[4973]: I0320 13:42:33.801326 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" event={"ID":"835537e8-dced-4516-a7b9-168d9bb6b687","Type":"ContainerStarted","Data":"a7ffc355f470e3995b3a71df1e6fbec46d566288201fb2caa5f013b1b2cb3a17"} Mar 20 13:42:33 crc kubenswrapper[4973]: I0320 13:42:33.802129 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" event={"ID":"835537e8-dced-4516-a7b9-168d9bb6b687","Type":"ContainerStarted","Data":"78c99d74fdc33f3de83b406aaace4b416eafadbcac98220269845cfbf1d74add"} Mar 20 13:42:33 crc kubenswrapper[4973]: I0320 13:42:33.802150 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:33 crc kubenswrapper[4973]: I0320 13:42:33.803799 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-jn8rq" event={"ID":"c0abcba1-e57c-4d90-a8cb-61989da15e87","Type":"ContainerStarted","Data":"f30aa87b4fd83d54a738630d207249662b16536e8808c72eb9cc47bf620062a0"} Mar 20 13:42:33 crc kubenswrapper[4973]: I0320 13:42:33.804577 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-jn8rq" Mar 20 13:42:33 crc kubenswrapper[4973]: I0320 13:42:33.807435 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7" event={"ID":"c5689d29-0b82-4482-9444-5da20e2da57a","Type":"ContainerStarted","Data":"72c0724e69b918427341b7a07635526b3a0708e3164bea112e4e827d5b3a40c1"} Mar 20 13:42:33 crc kubenswrapper[4973]: I0320 13:42:33.839004 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" podStartSLOduration=30.838980022 podStartE2EDuration="30.838980022s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:42:33.827986356 +0000 UTC m=+1274.571656100" watchObservedRunningTime="2026-03-20 13:42:33.838980022 +0000 UTC m=+1274.582649766" Mar 20 13:42:33 crc 
kubenswrapper[4973]: I0320 13:42:33.859874 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-jn8rq" podStartSLOduration=3.135030153 podStartE2EDuration="30.859855595s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:05.818251226 +0000 UTC m=+1246.561920970" lastFinishedPulling="2026-03-20 13:42:33.543076668 +0000 UTC m=+1274.286746412" observedRunningTime="2026-03-20 13:42:33.852785061 +0000 UTC m=+1274.596454825" watchObservedRunningTime="2026-03-20 13:42:33.859855595 +0000 UTC m=+1274.603525339" Mar 20 13:42:34 crc kubenswrapper[4973]: I0320 13:42:34.816801 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2zn7v" event={"ID":"991cca2c-022f-4c90-a1ba-287191fc2d49","Type":"ContainerStarted","Data":"1e83d0d3d344840e15b09faedf936387abd1bd53b32675e69311dacf29c9941f"} Mar 20 13:42:34 crc kubenswrapper[4973]: I0320 13:42:34.817854 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2zn7v" Mar 20 13:42:34 crc kubenswrapper[4973]: I0320 13:42:34.849825 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2zn7v" podStartSLOduration=5.473861209 podStartE2EDuration="31.849799596s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:07.265483538 +0000 UTC m=+1248.009153282" lastFinishedPulling="2026-03-20 13:42:33.641421925 +0000 UTC m=+1274.385091669" observedRunningTime="2026-03-20 13:42:34.843947334 +0000 UTC m=+1275.587617078" watchObservedRunningTime="2026-03-20 13:42:34.849799596 +0000 UTC m=+1275.593469340" Mar 20 13:42:36 crc kubenswrapper[4973]: I0320 13:42:36.017536 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fjztv\" (UID: \"3c89c7dd-500b-4bd5-a30e-273c2a485728\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" Mar 20 13:42:36 crc kubenswrapper[4973]: I0320 13:42:36.034214 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/3c89c7dd-500b-4bd5-a30e-273c2a485728-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fjztv\" (UID: \"3c89c7dd-500b-4bd5-a30e-273c2a485728\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" Mar 20 13:42:36 crc kubenswrapper[4973]: I0320 13:42:36.315611 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-xcxmb" Mar 20 13:42:36 crc kubenswrapper[4973]: I0320 13:42:36.324375 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" Mar 20 13:42:36 crc kubenswrapper[4973]: I0320 13:42:36.814079 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv"] Mar 20 13:42:37 crc kubenswrapper[4973]: I0320 13:42:37.125487 4973 scope.go:117] "RemoveContainer" containerID="ac4f59f9001d9e345861f200ad8dfcc9222083eb317def46c65718537696db7d" Mar 20 13:42:37 crc kubenswrapper[4973]: W0320 13:42:37.652608 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c89c7dd_500b_4bd5_a30e_273c2a485728.slice/crio-b52021a548e3506ddb3e6c73647dcd57fb4a37483d950e943749a90d1b8efb29 WatchSource:0}: Error finding container b52021a548e3506ddb3e6c73647dcd57fb4a37483d950e943749a90d1b8efb29: Status 404 returned error can't find the container with id b52021a548e3506ddb3e6c73647dcd57fb4a37483d950e943749a90d1b8efb29 Mar 20 13:42:37 crc kubenswrapper[4973]: I0320 13:42:37.859843 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" event={"ID":"3c89c7dd-500b-4bd5-a30e-273c2a485728","Type":"ContainerStarted","Data":"b52021a548e3506ddb3e6c73647dcd57fb4a37483d950e943749a90d1b8efb29"} Mar 20 13:42:39 crc kubenswrapper[4973]: I0320 13:42:39.876425 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-kzx4f" event={"ID":"45fa2923-6b6f-44da-8693-6ee06c476a8f","Type":"ContainerStarted","Data":"27cba3fce4e3f62c5795c36bc1e2f7dab7455f7bb4586ac220f0d6f90a43c06b"} Mar 20 13:42:39 crc kubenswrapper[4973]: I0320 13:42:39.878804 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qxlrn" 
event={"ID":"13c558f8-2e66-49c3-b184-7fdbbf4ff6b1","Type":"ContainerStarted","Data":"d2d121d7035c3885763efb5565bb22cfa32accad5ab35846d118705515c7f09d"} Mar 20 13:42:39 crc kubenswrapper[4973]: I0320 13:42:39.879020 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qxlrn" Mar 20 13:42:39 crc kubenswrapper[4973]: I0320 13:42:39.879509 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9dv29" event={"ID":"106eb66b-ca71-49b1-a80e-699f34ac9df9","Type":"ContainerStarted","Data":"8268f2f172a5b4b54617b5e5b3b452a9f51255c11941846244b57bedc529d4f3"} Mar 20 13:42:39 crc kubenswrapper[4973]: I0320 13:42:39.879751 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9dv29" Mar 20 13:42:39 crc kubenswrapper[4973]: I0320 13:42:39.881102 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-x4r2t" event={"ID":"39227253-9885-4ba2-a216-c04066dc7c84","Type":"ContainerStarted","Data":"792ecf353438fd833d1b1d4cb929edfe2e7ae7877fd6b48a788fc162ec9ead64"} Mar 20 13:42:39 crc kubenswrapper[4973]: I0320 13:42:39.881251 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-x4r2t" Mar 20 13:42:39 crc kubenswrapper[4973]: I0320 13:42:39.895523 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7" event={"ID":"c5689d29-0b82-4482-9444-5da20e2da57a","Type":"ContainerStarted","Data":"8ce4605521819a312fb937d0ef6fde4e566048abc948715a0980ce37a1f82174"} Mar 20 13:42:39 crc kubenswrapper[4973]: I0320 13:42:39.896316 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7" Mar 20 13:42:39 crc kubenswrapper[4973]: I0320 13:42:39.901942 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zmqsw" event={"ID":"84e8fd4a-d562-4e92-adfc-479867cf9d3a","Type":"ContainerStarted","Data":"47e061e7cb141ec16e8c858b8bb5972cee4732cf9c9dd196c7e45dc458872437"} Mar 20 13:42:39 crc kubenswrapper[4973]: I0320 13:42:39.902222 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zmqsw" Mar 20 13:42:39 crc kubenswrapper[4973]: I0320 13:42:39.903320 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qxlrn" podStartSLOduration=3.708850344 podStartE2EDuration="36.903309868s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:05.282544287 +0000 UTC m=+1246.026214031" lastFinishedPulling="2026-03-20 13:42:38.477003811 +0000 UTC m=+1279.220673555" observedRunningTime="2026-03-20 13:42:39.899755187 +0000 UTC m=+1280.643424931" watchObservedRunningTime="2026-03-20 13:42:39.903309868 +0000 UTC m=+1280.646979612" Mar 20 13:42:39 crc kubenswrapper[4973]: I0320 13:42:39.924289 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7" podStartSLOduration=31.525301098 podStartE2EDuration="36.924274234s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:33.077212264 +0000 UTC m=+1273.820881998" lastFinishedPulling="2026-03-20 13:42:38.47618539 +0000 UTC m=+1279.219855134" observedRunningTime="2026-03-20 13:42:39.921515912 +0000 UTC m=+1280.665185646" watchObservedRunningTime="2026-03-20 13:42:39.924274234 +0000 UTC m=+1280.667943978" Mar 20 13:42:39 crc kubenswrapper[4973]: 
I0320 13:42:39.942081 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-x4r2t" podStartSLOduration=5.768599441 podStartE2EDuration="36.942060656s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:07.304136472 +0000 UTC m=+1248.047806226" lastFinishedPulling="2026-03-20 13:42:38.477597697 +0000 UTC m=+1279.221267441" observedRunningTime="2026-03-20 13:42:39.935304771 +0000 UTC m=+1280.678974525" watchObservedRunningTime="2026-03-20 13:42:39.942060656 +0000 UTC m=+1280.685730400" Mar 20 13:42:39 crc kubenswrapper[4973]: I0320 13:42:39.956757 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9dv29" podStartSLOduration=5.783853898 podStartE2EDuration="36.956735938s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:07.304153852 +0000 UTC m=+1248.047823596" lastFinishedPulling="2026-03-20 13:42:38.477035892 +0000 UTC m=+1279.220705636" observedRunningTime="2026-03-20 13:42:39.952031966 +0000 UTC m=+1280.695701710" watchObservedRunningTime="2026-03-20 13:42:39.956735938 +0000 UTC m=+1280.700405682" Mar 20 13:42:39 crc kubenswrapper[4973]: I0320 13:42:39.974967 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-kzx4f" podStartSLOduration=4.18564328 podStartE2EDuration="36.974951471s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:05.863275206 +0000 UTC m=+1246.606944950" lastFinishedPulling="2026-03-20 13:42:38.652583397 +0000 UTC m=+1279.396253141" observedRunningTime="2026-03-20 13:42:39.96528541 +0000 UTC m=+1280.708955154" watchObservedRunningTime="2026-03-20 13:42:39.974951471 +0000 UTC m=+1280.718621215" Mar 20 13:42:40 crc kubenswrapper[4973]: I0320 
13:42:40.047518 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zmqsw" podStartSLOduration=4.37867077 podStartE2EDuration="37.047494038s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:05.806953172 +0000 UTC m=+1246.550622956" lastFinishedPulling="2026-03-20 13:42:38.47577648 +0000 UTC m=+1279.219446224" observedRunningTime="2026-03-20 13:42:39.987662432 +0000 UTC m=+1280.731332196" watchObservedRunningTime="2026-03-20 13:42:40.047494038 +0000 UTC m=+1280.791163782" Mar 20 13:42:41 crc kubenswrapper[4973]: I0320 13:42:41.821919 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" Mar 20 13:42:41 crc kubenswrapper[4973]: I0320 13:42:41.919306 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" event={"ID":"3c89c7dd-500b-4bd5-a30e-273c2a485728","Type":"ContainerStarted","Data":"0cbde50b23078806cbba2a7b7f25abe930451a91277ad55c22e173aac2c88dcd"} Mar 20 13:42:41 crc kubenswrapper[4973]: I0320 13:42:41.919649 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" Mar 20 13:42:41 crc kubenswrapper[4973]: I0320 13:42:41.921457 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xlqnk" event={"ID":"aa34abe0-30d3-4d49-9f20-c15990a91a36","Type":"ContainerStarted","Data":"3c1bffd6433b45be6a1d186a7b2961a5f81280343d5db821ffc7fb0b41e0980a"} Mar 20 13:42:41 crc kubenswrapper[4973]: I0320 13:42:41.921634 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xlqnk" Mar 20 13:42:41 crc 
kubenswrapper[4973]: I0320 13:42:41.924323 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6bzww" event={"ID":"8b21705a-8662-4147-9a10-9a95982d961c","Type":"ContainerStarted","Data":"a7900f19224085083761d2f9076a29e9c62571a564080e27cacdaa94f2f3f80f"} Mar 20 13:42:41 crc kubenswrapper[4973]: I0320 13:42:41.924661 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6bzww" Mar 20 13:42:41 crc kubenswrapper[4973]: I0320 13:42:41.949836 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" podStartSLOduration=35.786472009 podStartE2EDuration="38.949815193s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:37.658330944 +0000 UTC m=+1278.402000678" lastFinishedPulling="2026-03-20 13:42:40.821674118 +0000 UTC m=+1281.565343862" observedRunningTime="2026-03-20 13:42:41.942122472 +0000 UTC m=+1282.685792206" watchObservedRunningTime="2026-03-20 13:42:41.949815193 +0000 UTC m=+1282.693484927" Mar 20 13:42:41 crc kubenswrapper[4973]: I0320 13:42:41.973859 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6bzww" podStartSLOduration=5.450991113 podStartE2EDuration="38.973838137s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:07.360043286 +0000 UTC m=+1248.103713030" lastFinishedPulling="2026-03-20 13:42:40.88289031 +0000 UTC m=+1281.626560054" observedRunningTime="2026-03-20 13:42:41.966774374 +0000 UTC m=+1282.710444118" watchObservedRunningTime="2026-03-20 13:42:41.973838137 +0000 UTC m=+1282.717507881" Mar 20 13:42:41 crc kubenswrapper[4973]: I0320 13:42:41.987645 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xlqnk" podStartSLOduration=5.406721393 podStartE2EDuration="38.987626606s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:07.303567868 +0000 UTC m=+1248.047237612" lastFinishedPulling="2026-03-20 13:42:40.884473041 +0000 UTC m=+1281.628142825" observedRunningTime="2026-03-20 13:42:41.980951292 +0000 UTC m=+1282.724621036" watchObservedRunningTime="2026-03-20 13:42:41.987626606 +0000 UTC m=+1282.731296350" Mar 20 13:42:42 crc kubenswrapper[4973]: I0320 13:42:42.933015 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kzrpr" event={"ID":"5a41f6b9-9f79-454a-af8a-c0ad746f1d42","Type":"ContainerStarted","Data":"c5348ed0f1912672f1f5eb2ece1c261cf4c18140bfeaa16f9e583d41e2b71a1b"} Mar 20 13:42:42 crc kubenswrapper[4973]: I0320 13:42:42.933414 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kzrpr" Mar 20 13:42:42 crc kubenswrapper[4973]: I0320 13:42:42.935443 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p6vx2" event={"ID":"d5a271f2-b17d-487d-a61b-00bd17841392","Type":"ContainerStarted","Data":"2ce2d16fbcff28a3900a92820ec183b88d9e82601313e6218bdfccd52f8601be"} Mar 20 13:42:42 crc kubenswrapper[4973]: I0320 13:42:42.980282 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kzrpr" podStartSLOduration=4.806557958 podStartE2EDuration="39.980262757s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:07.314166424 +0000 UTC m=+1248.057836168" lastFinishedPulling="2026-03-20 13:42:42.487871223 +0000 UTC m=+1283.231540967" observedRunningTime="2026-03-20 13:42:42.947484484 +0000 UTC 
m=+1283.691154238" watchObservedRunningTime="2026-03-20 13:42:42.980262757 +0000 UTC m=+1283.723932501" Mar 20 13:42:42 crc kubenswrapper[4973]: I0320 13:42:42.990981 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p6vx2" podStartSLOduration=4.805872938 podStartE2EDuration="39.990939744s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:07.303787223 +0000 UTC m=+1248.047456967" lastFinishedPulling="2026-03-20 13:42:42.488854029 +0000 UTC m=+1283.232523773" observedRunningTime="2026-03-20 13:42:42.972682349 +0000 UTC m=+1283.716352093" watchObservedRunningTime="2026-03-20 13:42:42.990939744 +0000 UTC m=+1283.734609488" Mar 20 13:42:43 crc kubenswrapper[4973]: I0320 13:42:43.365670 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-w7228" Mar 20 13:42:43 crc kubenswrapper[4973]: I0320 13:42:43.405367 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pzl6t" Mar 20 13:42:43 crc kubenswrapper[4973]: I0320 13:42:43.460656 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2sflb" Mar 20 13:42:43 crc kubenswrapper[4973]: I0320 13:42:43.473289 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-rm7q7" Mar 20 13:42:43 crc kubenswrapper[4973]: I0320 13:42:43.604998 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zmqsw" Mar 20 13:42:43 crc kubenswrapper[4973]: I0320 13:42:43.808450 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qxlrn"
Mar 20 13:42:43 crc kubenswrapper[4973]: I0320 13:42:43.839197 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-jn8rq"
Mar 20 13:42:43 crc kubenswrapper[4973]: I0320 13:42:43.908702 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gs46s"
Mar 20 13:42:43 crc kubenswrapper[4973]: I0320 13:42:43.944822 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-crjg2" event={"ID":"97aab498-21c1-476f-a64b-a526745fc64a","Type":"ContainerStarted","Data":"f4c76055694873061bb11d9349a1be8025ea0b68446285ccce9560476eb4d87a"}
Mar 20 13:42:43 crc kubenswrapper[4973]: I0320 13:42:43.946136 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-crjg2"
Mar 20 13:42:43 crc kubenswrapper[4973]: I0320 13:42:43.948036 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tgmwc" event={"ID":"88d5fd4b-c230-4b94-b988-0b79ec98d991","Type":"ContainerStarted","Data":"8581f1c05acabc297c1ebaac4b82e6d3be4b4ded01793ba322a62f420671b08c"}
Mar 20 13:42:43 crc kubenswrapper[4973]: I0320 13:42:43.964152 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-kzx4f"
Mar 20 13:42:43 crc kubenswrapper[4973]: I0320 13:42:43.964212 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-kzx4f"
Mar 20 13:42:43 crc kubenswrapper[4973]: I0320 13:42:43.970758 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-crjg2" podStartSLOduration=3.278835072 podStartE2EDuration="40.970740311s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:05.78687565 +0000 UTC m=+1246.530545394" lastFinishedPulling="2026-03-20 13:42:43.478780889 +0000 UTC m=+1284.222450633" observedRunningTime="2026-03-20 13:42:43.967892567 +0000 UTC m=+1284.711562321" watchObservedRunningTime="2026-03-20 13:42:43.970740311 +0000 UTC m=+1284.714410055"
Mar 20 13:42:43 crc kubenswrapper[4973]: I0320 13:42:43.983683 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p6vx2"
Mar 20 13:42:43 crc kubenswrapper[4973]: I0320 13:42:43.990133 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-tgmwc" podStartSLOduration=4.854563035 podStartE2EDuration="40.990113855s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:07.356653908 +0000 UTC m=+1248.100323652" lastFinishedPulling="2026-03-20 13:42:43.492204728 +0000 UTC m=+1284.235874472" observedRunningTime="2026-03-20 13:42:43.982721543 +0000 UTC m=+1284.726391297" watchObservedRunningTime="2026-03-20 13:42:43.990113855 +0000 UTC m=+1284.733783599"
Mar 20 13:42:44 crc kubenswrapper[4973]: I0320 13:42:44.122857 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-x4r2t"
Mar 20 13:42:44 crc kubenswrapper[4973]: I0320 13:42:44.493819 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9dv29"
Mar 20 13:42:44 crc kubenswrapper[4973]: I0320 13:42:44.567037 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2zn7v"
Mar 20 13:42:45 crc kubenswrapper[4973]: I0320 13:42:45.966054 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-z4dzx" event={"ID":"3233d229-1d2f-4c90-b76a-f27ca914f0ad","Type":"ContainerStarted","Data":"801bf5e0082df44a297619058823eacffa4783ddc1a63a0f4835639181adb12b"}
Mar 20 13:42:45 crc kubenswrapper[4973]: I0320 13:42:45.966903 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-z4dzx"
Mar 20 13:42:45 crc kubenswrapper[4973]: I0320 13:42:45.967199 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-nh7dd" event={"ID":"ecf17bc8-3c8a-4791-a205-2bdc718ec15f","Type":"ContainerStarted","Data":"0831f025147760915050a99ca534c922e1e47e0b73f05b01e273cc0201a46989"}
Mar 20 13:42:45 crc kubenswrapper[4973]: I0320 13:42:45.967389 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-nh7dd"
Mar 20 13:42:45 crc kubenswrapper[4973]: I0320 13:42:45.983605 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-z4dzx" podStartSLOduration=2.913881863 podStartE2EDuration="42.98358983s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:04.98193469 +0000 UTC m=+1245.725604434" lastFinishedPulling="2026-03-20 13:42:45.051642657 +0000 UTC m=+1285.795312401" observedRunningTime="2026-03-20 13:42:45.981129176 +0000 UTC m=+1286.724798920" watchObservedRunningTime="2026-03-20 13:42:45.98358983 +0000 UTC m=+1286.727259574"
Mar 20 13:42:45 crc kubenswrapper[4973]: I0320 13:42:45.999310 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-nh7dd" podStartSLOduration=4.74976662 podStartE2EDuration="42.999294448s" podCreationTimestamp="2026-03-20 13:42:03 +0000 UTC" firstStartedPulling="2026-03-20 13:42:07.36056305 +0000 UTC m=+1248.104232804" lastFinishedPulling="2026-03-20 13:42:45.610090898 +0000 UTC m=+1286.353760632" observedRunningTime="2026-03-20 13:42:45.997217445 +0000 UTC m=+1286.740887189" watchObservedRunningTime="2026-03-20 13:42:45.999294448 +0000 UTC m=+1286.742964192"
Mar 20 13:42:46 crc kubenswrapper[4973]: I0320 13:42:46.330456 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv"
Mar 20 13:42:49 crc kubenswrapper[4973]: I0320 13:42:49.516682 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-27jg7"
Mar 20 13:42:53 crc kubenswrapper[4973]: I0320 13:42:53.485971 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-z4dzx"
Mar 20 13:42:53 crc kubenswrapper[4973]: I0320 13:42:53.904796 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-crjg2"
Mar 20 13:42:53 crc kubenswrapper[4973]: I0320 13:42:53.985837 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p6vx2"
Mar 20 13:42:54 crc kubenswrapper[4973]: I0320 13:42:54.827721 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-6bzww"
Mar 20 13:42:54 crc kubenswrapper[4973]: I0320 13:42:54.841427 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-nh7dd"
Mar 20 13:42:54 crc kubenswrapper[4973]: I0320 13:42:54.864644 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xlqnk"
Mar 20 13:42:54 crc kubenswrapper[4973]: I0320 13:42:54.950721 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kzrpr"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.652928 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5nwgs"]
Mar 20 13:43:10 crc kubenswrapper[4973]: E0320 13:43:10.654047 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d" containerName="oc"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.654064 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d" containerName="oc"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.654284 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d" containerName="oc"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.655502 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5nwgs"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.663839 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.663956 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.664126 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.664259 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-p7l2w"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.668165 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5nwgs"]
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.677733 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf9k9\" (UniqueName: \"kubernetes.io/projected/12680dc2-8341-4d82-bf13-e292d9898d90-kube-api-access-qf9k9\") pod \"dnsmasq-dns-675f4bcbfc-5nwgs\" (UID: \"12680dc2-8341-4d82-bf13-e292d9898d90\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5nwgs"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.677933 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12680dc2-8341-4d82-bf13-e292d9898d90-config\") pod \"dnsmasq-dns-675f4bcbfc-5nwgs\" (UID: \"12680dc2-8341-4d82-bf13-e292d9898d90\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5nwgs"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.724418 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tjbs5"]
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.726144 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tjbs5"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.732132 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.738711 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tjbs5"]
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.779620 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2200265-2c2e-4f89-ae21-e065a8ec1288-config\") pod \"dnsmasq-dns-78dd6ddcc-tjbs5\" (UID: \"f2200265-2c2e-4f89-ae21-e065a8ec1288\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tjbs5"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.780031 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12680dc2-8341-4d82-bf13-e292d9898d90-config\") pod \"dnsmasq-dns-675f4bcbfc-5nwgs\" (UID: \"12680dc2-8341-4d82-bf13-e292d9898d90\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5nwgs"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.780312 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2200265-2c2e-4f89-ae21-e065a8ec1288-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tjbs5\" (UID: \"f2200265-2c2e-4f89-ae21-e065a8ec1288\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tjbs5"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.780462 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qf9k9\" (UniqueName: \"kubernetes.io/projected/12680dc2-8341-4d82-bf13-e292d9898d90-kube-api-access-qf9k9\") pod \"dnsmasq-dns-675f4bcbfc-5nwgs\" (UID: \"12680dc2-8341-4d82-bf13-e292d9898d90\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5nwgs"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.780566 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s67rc\" (UniqueName: \"kubernetes.io/projected/f2200265-2c2e-4f89-ae21-e065a8ec1288-kube-api-access-s67rc\") pod \"dnsmasq-dns-78dd6ddcc-tjbs5\" (UID: \"f2200265-2c2e-4f89-ae21-e065a8ec1288\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tjbs5"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.781114 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12680dc2-8341-4d82-bf13-e292d9898d90-config\") pod \"dnsmasq-dns-675f4bcbfc-5nwgs\" (UID: \"12680dc2-8341-4d82-bf13-e292d9898d90\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5nwgs"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.801821 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf9k9\" (UniqueName: \"kubernetes.io/projected/12680dc2-8341-4d82-bf13-e292d9898d90-kube-api-access-qf9k9\") pod \"dnsmasq-dns-675f4bcbfc-5nwgs\" (UID: \"12680dc2-8341-4d82-bf13-e292d9898d90\") " pod="openstack/dnsmasq-dns-675f4bcbfc-5nwgs"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.897581 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2200265-2c2e-4f89-ae21-e065a8ec1288-config\") pod \"dnsmasq-dns-78dd6ddcc-tjbs5\" (UID: \"f2200265-2c2e-4f89-ae21-e065a8ec1288\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tjbs5"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.897870 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2200265-2c2e-4f89-ae21-e065a8ec1288-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tjbs5\" (UID: \"f2200265-2c2e-4f89-ae21-e065a8ec1288\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tjbs5"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.898502 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s67rc\" (UniqueName: \"kubernetes.io/projected/f2200265-2c2e-4f89-ae21-e065a8ec1288-kube-api-access-s67rc\") pod \"dnsmasq-dns-78dd6ddcc-tjbs5\" (UID: \"f2200265-2c2e-4f89-ae21-e065a8ec1288\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tjbs5"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.898574 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2200265-2c2e-4f89-ae21-e065a8ec1288-config\") pod \"dnsmasq-dns-78dd6ddcc-tjbs5\" (UID: \"f2200265-2c2e-4f89-ae21-e065a8ec1288\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tjbs5"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.898925 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2200265-2c2e-4f89-ae21-e065a8ec1288-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tjbs5\" (UID: \"f2200265-2c2e-4f89-ae21-e065a8ec1288\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tjbs5"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.918977 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s67rc\" (UniqueName: \"kubernetes.io/projected/f2200265-2c2e-4f89-ae21-e065a8ec1288-kube-api-access-s67rc\") pod \"dnsmasq-dns-78dd6ddcc-tjbs5\" (UID: \"f2200265-2c2e-4f89-ae21-e065a8ec1288\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tjbs5"
Mar 20 13:43:10 crc kubenswrapper[4973]: I0320 13:43:10.985614 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5nwgs"
Mar 20 13:43:11 crc kubenswrapper[4973]: I0320 13:43:11.060957 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tjbs5"
Mar 20 13:43:11 crc kubenswrapper[4973]: I0320 13:43:11.484006 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5nwgs"]
Mar 20 13:43:11 crc kubenswrapper[4973]: I0320 13:43:11.579860 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tjbs5"]
Mar 20 13:43:11 crc kubenswrapper[4973]: W0320 13:43:11.582476 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2200265_2c2e_4f89_ae21_e065a8ec1288.slice/crio-3e1ba7e3289395fafbf125318a2e32b24ebd859583ac4d25974ba19bebdec515 WatchSource:0}: Error finding container 3e1ba7e3289395fafbf125318a2e32b24ebd859583ac4d25974ba19bebdec515: Status 404 returned error can't find the container with id 3e1ba7e3289395fafbf125318a2e32b24ebd859583ac4d25974ba19bebdec515
Mar 20 13:43:12 crc kubenswrapper[4973]: I0320 13:43:12.225683 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-5nwgs" event={"ID":"12680dc2-8341-4d82-bf13-e292d9898d90","Type":"ContainerStarted","Data":"7eca94781d61cc8b8ac3516e18f4d2b9e87128c6af73b3b55f34b97f2bedc4cf"}
Mar 20 13:43:12 crc kubenswrapper[4973]: I0320 13:43:12.226886 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-tjbs5" event={"ID":"f2200265-2c2e-4f89-ae21-e065a8ec1288","Type":"ContainerStarted","Data":"3e1ba7e3289395fafbf125318a2e32b24ebd859583ac4d25974ba19bebdec515"}
Mar 20 13:43:13 crc kubenswrapper[4973]: I0320 13:43:13.568724 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5nwgs"]
Mar 20 13:43:13 crc kubenswrapper[4973]: I0320 13:43:13.638631 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-rxsvm"]
Mar 20 13:43:13 crc kubenswrapper[4973]: I0320 13:43:13.640379 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-rxsvm"
Mar 20 13:43:13 crc kubenswrapper[4973]: I0320 13:43:13.657062 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-rxsvm"]
Mar 20 13:43:13 crc kubenswrapper[4973]: I0320 13:43:13.673290 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ee928ab-23ef-4f3e-a025-7576302eb628-config\") pod \"dnsmasq-dns-5ccc8479f9-rxsvm\" (UID: \"6ee928ab-23ef-4f3e-a025-7576302eb628\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rxsvm"
Mar 20 13:43:13 crc kubenswrapper[4973]: I0320 13:43:13.673385 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zznvc\" (UniqueName: \"kubernetes.io/projected/6ee928ab-23ef-4f3e-a025-7576302eb628-kube-api-access-zznvc\") pod \"dnsmasq-dns-5ccc8479f9-rxsvm\" (UID: \"6ee928ab-23ef-4f3e-a025-7576302eb628\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rxsvm"
Mar 20 13:43:13 crc kubenswrapper[4973]: I0320 13:43:13.675473 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ee928ab-23ef-4f3e-a025-7576302eb628-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-rxsvm\" (UID: \"6ee928ab-23ef-4f3e-a025-7576302eb628\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rxsvm"
Mar 20 13:43:13 crc kubenswrapper[4973]: I0320 13:43:13.779893 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ee928ab-23ef-4f3e-a025-7576302eb628-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-rxsvm\" (UID: \"6ee928ab-23ef-4f3e-a025-7576302eb628\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rxsvm"
Mar 20 13:43:13 crc kubenswrapper[4973]: I0320 13:43:13.780046 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ee928ab-23ef-4f3e-a025-7576302eb628-config\") pod \"dnsmasq-dns-5ccc8479f9-rxsvm\" (UID: \"6ee928ab-23ef-4f3e-a025-7576302eb628\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rxsvm"
Mar 20 13:43:13 crc kubenswrapper[4973]: I0320 13:43:13.780070 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zznvc\" (UniqueName: \"kubernetes.io/projected/6ee928ab-23ef-4f3e-a025-7576302eb628-kube-api-access-zznvc\") pod \"dnsmasq-dns-5ccc8479f9-rxsvm\" (UID: \"6ee928ab-23ef-4f3e-a025-7576302eb628\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rxsvm"
Mar 20 13:43:13 crc kubenswrapper[4973]: I0320 13:43:13.781022 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ee928ab-23ef-4f3e-a025-7576302eb628-config\") pod \"dnsmasq-dns-5ccc8479f9-rxsvm\" (UID: \"6ee928ab-23ef-4f3e-a025-7576302eb628\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rxsvm"
Mar 20 13:43:13 crc kubenswrapper[4973]: I0320 13:43:13.782512 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ee928ab-23ef-4f3e-a025-7576302eb628-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-rxsvm\" (UID: \"6ee928ab-23ef-4f3e-a025-7576302eb628\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rxsvm"
Mar 20 13:43:13 crc kubenswrapper[4973]: I0320 13:43:13.830935 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zznvc\" (UniqueName: \"kubernetes.io/projected/6ee928ab-23ef-4f3e-a025-7576302eb628-kube-api-access-zznvc\") pod \"dnsmasq-dns-5ccc8479f9-rxsvm\" (UID: \"6ee928ab-23ef-4f3e-a025-7576302eb628\") " pod="openstack/dnsmasq-dns-5ccc8479f9-rxsvm"
Mar 20 13:43:13 crc kubenswrapper[4973]: I0320 13:43:13.917245 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tjbs5"]
Mar 20 13:43:13 crc kubenswrapper[4973]: I0320 13:43:13.962324 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-rxsvm"
Mar 20 13:43:13 crc kubenswrapper[4973]: I0320 13:43:13.992927 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hzr72"]
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.000820 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hzr72"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.053180 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hzr72"]
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.190469 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c513f134-e86b-47ac-9d13-5b92975be947-config\") pod \"dnsmasq-dns-57d769cc4f-hzr72\" (UID: \"c513f134-e86b-47ac-9d13-5b92975be947\") " pod="openstack/dnsmasq-dns-57d769cc4f-hzr72"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.190541 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c513f134-e86b-47ac-9d13-5b92975be947-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hzr72\" (UID: \"c513f134-e86b-47ac-9d13-5b92975be947\") " pod="openstack/dnsmasq-dns-57d769cc4f-hzr72"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.190590 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7m7q\" (UniqueName: \"kubernetes.io/projected/c513f134-e86b-47ac-9d13-5b92975be947-kube-api-access-w7m7q\") pod \"dnsmasq-dns-57d769cc4f-hzr72\" (UID: \"c513f134-e86b-47ac-9d13-5b92975be947\") " pod="openstack/dnsmasq-dns-57d769cc4f-hzr72"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.299275 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c513f134-e86b-47ac-9d13-5b92975be947-config\") pod \"dnsmasq-dns-57d769cc4f-hzr72\" (UID: \"c513f134-e86b-47ac-9d13-5b92975be947\") " pod="openstack/dnsmasq-dns-57d769cc4f-hzr72"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.299395 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c513f134-e86b-47ac-9d13-5b92975be947-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hzr72\" (UID: \"c513f134-e86b-47ac-9d13-5b92975be947\") " pod="openstack/dnsmasq-dns-57d769cc4f-hzr72"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.299458 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7m7q\" (UniqueName: \"kubernetes.io/projected/c513f134-e86b-47ac-9d13-5b92975be947-kube-api-access-w7m7q\") pod \"dnsmasq-dns-57d769cc4f-hzr72\" (UID: \"c513f134-e86b-47ac-9d13-5b92975be947\") " pod="openstack/dnsmasq-dns-57d769cc4f-hzr72"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.300862 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c513f134-e86b-47ac-9d13-5b92975be947-config\") pod \"dnsmasq-dns-57d769cc4f-hzr72\" (UID: \"c513f134-e86b-47ac-9d13-5b92975be947\") " pod="openstack/dnsmasq-dns-57d769cc4f-hzr72"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.301375 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c513f134-e86b-47ac-9d13-5b92975be947-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-hzr72\" (UID: \"c513f134-e86b-47ac-9d13-5b92975be947\") " pod="openstack/dnsmasq-dns-57d769cc4f-hzr72"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.325551 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7m7q\" (UniqueName: \"kubernetes.io/projected/c513f134-e86b-47ac-9d13-5b92975be947-kube-api-access-w7m7q\") pod \"dnsmasq-dns-57d769cc4f-hzr72\" (UID: \"c513f134-e86b-47ac-9d13-5b92975be947\") " pod="openstack/dnsmasq-dns-57d769cc4f-hzr72"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.413372 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hzr72"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.527241 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-rxsvm"]
Mar 20 13:43:14 crc kubenswrapper[4973]: W0320 13:43:14.531980 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6ee928ab_23ef_4f3e_a025_7576302eb628.slice/crio-a41f5b2c634978ced4693d157e576f8f7a29f03548bfc7254cf91b8a31c739c1 WatchSource:0}: Error finding container a41f5b2c634978ced4693d157e576f8f7a29f03548bfc7254cf91b8a31c739c1: Status 404 returned error can't find the container with id a41f5b2c634978ced4693d157e576f8f7a29f03548bfc7254cf91b8a31c739c1
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.755273 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.757498 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.761574 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.762169 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.771884 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.772093 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.772225 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-trmvg"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.772577 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.772748 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.774273 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.904124 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hzr72"]
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.917274 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xs8k\" (UniqueName: \"kubernetes.io/projected/20780ec2-d338-45a4-9259-16a651e46e55-kube-api-access-6xs8k\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.917370 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.917407 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20780ec2-d338-45a4-9259-16a651e46e55-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.917437 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.917458 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.917474 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/20780ec2-d338-45a4-9259-16a651e46e55-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.917497 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.917554 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/20780ec2-d338-45a4-9259-16a651e46e55-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.917585 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/20780ec2-d338-45a4-9259-16a651e46e55-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.917615 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/20780ec2-d338-45a4-9259-16a651e46e55-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:14 crc kubenswrapper[4973]: I0320 13:43:14.917639 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.022369 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.022452 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20780ec2-d338-45a4-9259-16a651e46e55-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.022503 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.022536 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.022567 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/20780ec2-d338-45a4-9259-16a651e46e55-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.022597 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.022665 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/20780ec2-d338-45a4-9259-16a651e46e55-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.022720 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/20780ec2-d338-45a4-9259-16a651e46e55-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.022761 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/20780ec2-d338-45a4-9259-16a651e46e55-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.022791 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.022865 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xs8k\" (UniqueName: \"kubernetes.io/projected/20780ec2-d338-45a4-9259-16a651e46e55-kube-api-access-6xs8k\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.024361 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.025952 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.026774 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20780ec2-d338-45a4-9259-16a651e46e55-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.026961 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/20780ec2-d338-45a4-9259-16a651e46e55-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.027962 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName:
\"kubernetes.io/configmap/20780ec2-d338-45a4-9259-16a651e46e55-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.041288 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.045102 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.045568 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/20780ec2-d338-45a4-9259-16a651e46e55-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.045876 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.045922 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9287085a89b925405c99e289731435a8079d99592059ce42321d27146daffc28/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.047114 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xs8k\" (UniqueName: \"kubernetes.io/projected/20780ec2-d338-45a4-9259-16a651e46e55-kube-api-access-6xs8k\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.058384 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/20780ec2-d338-45a4-9259-16a651e46e55-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.099557 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.101176 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.101563 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\") pod \"rabbitmq-cell1-server-0\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.104714 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.104752 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.105413 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.105863 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.106148 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-zx7n7" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.106551 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.106684 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.120802 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.134239 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 
13:43:15.137030 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.154920 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.156958 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.169395 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.182283 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.226300 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.226372 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96de22e2-f61c-4f75-8faa-9a0591aa0f38-server-conf\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.226399 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.226480 4973 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/797b38f5-d9a7-4f82-bd12-e40e021ef28e-pod-info\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.226511 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.226534 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/797b38f5-d9a7-4f82-bd12-e40e021ef28e-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.226557 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/797b38f5-d9a7-4f82-bd12-e40e021ef28e-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.226598 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96de22e2-f61c-4f75-8faa-9a0591aa0f38-config-data\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.226652 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.226673 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96de22e2-f61c-4f75-8faa-9a0591aa0f38-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.226704 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bc9ff\" (UniqueName: \"kubernetes.io/projected/96de22e2-f61c-4f75-8faa-9a0591aa0f38-kube-api-access-bc9ff\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.226733 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.226957 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/797b38f5-d9a7-4f82-bd12-e40e021ef28e-config-data\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.226981 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.227049 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96de22e2-f61c-4f75-8faa-9a0591aa0f38-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.227122 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.227175 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.227209 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96de22e2-f61c-4f75-8faa-9a0591aa0f38-pod-info\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.227282 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.227310 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/797b38f5-d9a7-4f82-bd12-e40e021ef28e-server-conf\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.227475 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klrdc\" (UniqueName: \"kubernetes.io/projected/797b38f5-d9a7-4f82-bd12-e40e021ef28e-kube-api-access-klrdc\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.228643 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.265549 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-rxsvm" event={"ID":"6ee928ab-23ef-4f3e-a025-7576302eb628","Type":"ContainerStarted","Data":"a41f5b2c634978ced4693d157e576f8f7a29f03548bfc7254cf91b8a31c739c1"} Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.269059 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hzr72" 
event={"ID":"c513f134-e86b-47ac-9d13-5b92975be947","Type":"ContainerStarted","Data":"1823cd06704869fc3018b78115ce24a8269d73322216c8f4fdf81425ae7130c9"} Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.332415 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.332491 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ed60638-5022-406b-b568-7fa0d6bf4ba8-server-conf\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.332515 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.332531 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96de22e2-f61c-4f75-8faa-9a0591aa0f38-pod-info\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.332558 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 
13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.332573 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/797b38f5-d9a7-4f82-bd12-e40e021ef28e-server-conf\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.332593 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.332619 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ed60638-5022-406b-b568-7fa0d6bf4ba8-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.332643 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.332677 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klrdc\" (UniqueName: \"kubernetes.io/projected/797b38f5-d9a7-4f82-bd12-e40e021ef28e-kube-api-access-klrdc\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.332714 4973 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.332752 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.332798 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.332916 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ed60638-5022-406b-b568-7fa0d6bf4ba8-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.332944 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ed60638-5022-406b-b568-7fa0d6bf4ba8-config-data\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.332968 4973 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.333005 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96de22e2-f61c-4f75-8faa-9a0591aa0f38-server-conf\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.333033 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.333073 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/797b38f5-d9a7-4f82-bd12-e40e021ef28e-pod-info\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.333089 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.333104 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/797b38f5-d9a7-4f82-bd12-e40e021ef28e-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.333137 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/797b38f5-d9a7-4f82-bd12-e40e021ef28e-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.333170 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96de22e2-f61c-4f75-8faa-9a0591aa0f38-config-data\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.333203 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ed60638-5022-406b-b568-7fa0d6bf4ba8-pod-info\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.333237 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.333498 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: 
\"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.333534 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96de22e2-f61c-4f75-8faa-9a0591aa0f38-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.333566 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f28mv\" (UniqueName: \"kubernetes.io/projected/4ed60638-5022-406b-b568-7fa0d6bf4ba8-kube-api-access-f28mv\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.333595 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bc9ff\" (UniqueName: \"kubernetes.io/projected/96de22e2-f61c-4f75-8faa-9a0591aa0f38-kube-api-access-bc9ff\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.333628 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.333660 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/797b38f5-d9a7-4f82-bd12-e40e021ef28e-config-data\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 
13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.333681 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.333706 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96de22e2-f61c-4f75-8faa-9a0591aa0f38-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.334479 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.336072 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.336308 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.337258 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" 
(UniqueName: \"kubernetes.io/empty-dir/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.338074 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96de22e2-f61c-4f75-8faa-9a0591aa0f38-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.338158 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96de22e2-f61c-4f75-8faa-9a0591aa0f38-config-data\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.338962 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/797b38f5-d9a7-4f82-bd12-e40e021ef28e-config-data\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.339275 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96de22e2-f61c-4f75-8faa-9a0591aa0f38-server-conf\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.339465 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 
20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.340022 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.340046 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/70e0bd6723351d45432d0557e984e104e23058730f41a4be8f1efa0e578a3f37/globalmount\"" pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.341065 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.341784 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96de22e2-f61c-4f75-8faa-9a0591aa0f38-pod-info\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.342248 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.343402 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" 
(UniqueName: \"kubernetes.io/secret/96de22e2-f61c-4f75-8faa-9a0591aa0f38-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.346226 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/797b38f5-d9a7-4f82-bd12-e40e021ef28e-pod-info\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.344917 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.346290 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/65cf3f7a1b4243e3581d40e35123a2951f3bcb7180b938dca1f1d05b08317567/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.346545 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.354802 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/797b38f5-d9a7-4f82-bd12-e40e021ef28e-plugins-conf\") pod \"rabbitmq-server-1\" (UID: 
\"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.355099 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/797b38f5-d9a7-4f82-bd12-e40e021ef28e-server-conf\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.356839 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klrdc\" (UniqueName: \"kubernetes.io/projected/797b38f5-d9a7-4f82-bd12-e40e021ef28e-kube-api-access-klrdc\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.363180 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/797b38f5-d9a7-4f82-bd12-e40e021ef28e-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.372511 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bc9ff\" (UniqueName: \"kubernetes.io/projected/96de22e2-f61c-4f75-8faa-9a0591aa0f38-kube-api-access-bc9ff\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.400109 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.404942 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\") pod \"rabbitmq-server-1\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.417823 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\") pod \"rabbitmq-server-0\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.436084 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ed60638-5022-406b-b568-7fa0d6bf4ba8-server-conf\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.436267 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.436297 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ed60638-5022-406b-b568-7fa0d6bf4ba8-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc 
kubenswrapper[4973]: I0320 13:43:15.436319 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.436374 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.436405 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.436428 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ed60638-5022-406b-b568-7fa0d6bf4ba8-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.436451 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ed60638-5022-406b-b568-7fa0d6bf4ba8-config-data\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.436524 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ed60638-5022-406b-b568-7fa0d6bf4ba8-pod-info\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.436547 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.436574 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f28mv\" (UniqueName: \"kubernetes.io/projected/4ed60638-5022-406b-b568-7fa0d6bf4ba8-kube-api-access-f28mv\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.437430 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.437941 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ed60638-5022-406b-b568-7fa0d6bf4ba8-config-data\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.439858 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: 
\"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.447183 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ed60638-5022-406b-b568-7fa0d6bf4ba8-server-conf\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.448363 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ed60638-5022-406b-b568-7fa0d6bf4ba8-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.448852 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ed60638-5022-406b-b568-7fa0d6bf4ba8-pod-info\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.452494 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.453662 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.454040 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f28mv\" (UniqueName: \"kubernetes.io/projected/4ed60638-5022-406b-b568-7fa0d6bf4ba8-kube-api-access-f28mv\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.454932 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.454973 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1164d215983f7dc2868a1be48f97f5ce9134f1d9c7b479725732de51a01c727b/globalmount\"" pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.455708 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.464938 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ed60638-5022-406b-b568-7fa0d6bf4ba8-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.474742 4973 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.507234 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\") pod \"rabbitmq-server-2\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") " pod="openstack/rabbitmq-server-2" Mar 20 13:43:15 crc kubenswrapper[4973]: I0320 13:43:15.788932 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.058694 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.062464 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.064558 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.071790 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.109088 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.109496 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.109983 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-bmgnw" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.113817 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 20 
13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.152049 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-98f062ee-370e-41a4-bcf7-aef2fc488906\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98f062ee-370e-41a4-bcf7-aef2fc488906\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.152135 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27fed14c-9051-4d46-80d5-badf224805a9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.152161 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2gm9\" (UniqueName: \"kubernetes.io/projected/27fed14c-9051-4d46-80d5-badf224805a9-kube-api-access-g2gm9\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.152191 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27fed14c-9051-4d46-80d5-badf224805a9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.152299 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27fed14c-9051-4d46-80d5-badf224805a9-config-data-default\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " 
pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.152349 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27fed14c-9051-4d46-80d5-badf224805a9-kolla-config\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.152367 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27fed14c-9051-4d46-80d5-badf224805a9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.152386 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27fed14c-9051-4d46-80d5-badf224805a9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.261180 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-98f062ee-370e-41a4-bcf7-aef2fc488906\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98f062ee-370e-41a4-bcf7-aef2fc488906\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.261256 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27fed14c-9051-4d46-80d5-badf224805a9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" 
Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.261281 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2gm9\" (UniqueName: \"kubernetes.io/projected/27fed14c-9051-4d46-80d5-badf224805a9-kube-api-access-g2gm9\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.261306 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27fed14c-9051-4d46-80d5-badf224805a9-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.261754 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27fed14c-9051-4d46-80d5-badf224805a9-config-data-default\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.261818 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27fed14c-9051-4d46-80d5-badf224805a9-kolla-config\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.261853 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27fed14c-9051-4d46-80d5-badf224805a9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.261881 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27fed14c-9051-4d46-80d5-badf224805a9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.262994 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/27fed14c-9051-4d46-80d5-badf224805a9-kolla-config\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.263724 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/27fed14c-9051-4d46-80d5-badf224805a9-config-data-default\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.264706 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/27fed14c-9051-4d46-80d5-badf224805a9-config-data-generated\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.264886 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27fed14c-9051-4d46-80d5-badf224805a9-operator-scripts\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.266752 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/27fed14c-9051-4d46-80d5-badf224805a9-galera-tls-certs\") 
pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.267246 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/27fed14c-9051-4d46-80d5-badf224805a9-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.267309 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.267416 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-98f062ee-370e-41a4-bcf7-aef2fc488906\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98f062ee-370e-41a4-bcf7-aef2fc488906\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0a6921f9e02f6ecc7aad6cb9408803c9c0a0672e3e795a070787805eaf1ec83f/globalmount\"" pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.287520 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2gm9\" (UniqueName: \"kubernetes.io/projected/27fed14c-9051-4d46-80d5-badf224805a9-kube-api-access-g2gm9\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " pod="openstack/openstack-galera-0" Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.308740 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-98f062ee-370e-41a4-bcf7-aef2fc488906\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-98f062ee-370e-41a4-bcf7-aef2fc488906\") pod \"openstack-galera-0\" (UID: \"27fed14c-9051-4d46-80d5-badf224805a9\") " 
pod="openstack/openstack-galera-0"
Mar 20 13:43:16 crc kubenswrapper[4973]: I0320 13:43:16.440447 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.519580 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.522399 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.539072 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-5f4cb"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.539278 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.539327 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.543028 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.549739 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.587826 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-13110d6b-61b8-4d5c-949a-a07f3be0fe7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13110d6b-61b8-4d5c-949a-a07f3be0fe7d\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.587881 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.587955 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpqr2\" (UniqueName: \"kubernetes.io/projected/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-kube-api-access-zpqr2\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.587976 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.588084 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.588149 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.588243 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.588319 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.692714 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpqr2\" (UniqueName: \"kubernetes.io/projected/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-kube-api-access-zpqr2\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.692769 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.692813 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.692836 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.692890 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.692924 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.692974 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-13110d6b-61b8-4d5c-949a-a07f3be0fe7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13110d6b-61b8-4d5c-949a-a07f3be0fe7d\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.693003 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.694880 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.695221 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.695941 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.696246 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.697389 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.697437 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-13110d6b-61b8-4d5c-949a-a07f3be0fe7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13110d6b-61b8-4d5c-949a-a07f3be0fe7d\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/cf1894916b3818de8aadb81649d3ae09e4cc8f6c7af8e386098d8d1207d47fa7/globalmount\"" pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.700295 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.703019 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.717133 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpqr2\" (UniqueName: \"kubernetes.io/projected/4bdbb8eb-c36d-43f0-a705-3b3e59128b7f-kube-api-access-zpqr2\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.758354 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-13110d6b-61b8-4d5c-949a-a07f3be0fe7d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-13110d6b-61b8-4d5c-949a-a07f3be0fe7d\") pod \"openstack-cell1-galera-0\" (UID: \"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f\") " pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.758578 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"]
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.759875 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.761615 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-qn54w"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.761757 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.763311 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.767210 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.794636 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/144466b4-e5d6-4cca-b67b-b76dc24f8dea-config-data\") pod \"memcached-0\" (UID: \"144466b4-e5d6-4cca-b67b-b76dc24f8dea\") " pod="openstack/memcached-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.794992 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/144466b4-e5d6-4cca-b67b-b76dc24f8dea-combined-ca-bundle\") pod \"memcached-0\" (UID: \"144466b4-e5d6-4cca-b67b-b76dc24f8dea\") " pod="openstack/memcached-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.795056 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/144466b4-e5d6-4cca-b67b-b76dc24f8dea-memcached-tls-certs\") pod \"memcached-0\" (UID: \"144466b4-e5d6-4cca-b67b-b76dc24f8dea\") " pod="openstack/memcached-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.795125 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvxd6\" (UniqueName: \"kubernetes.io/projected/144466b4-e5d6-4cca-b67b-b76dc24f8dea-kube-api-access-tvxd6\") pod \"memcached-0\" (UID: \"144466b4-e5d6-4cca-b67b-b76dc24f8dea\") " pod="openstack/memcached-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.795169 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/144466b4-e5d6-4cca-b67b-b76dc24f8dea-kolla-config\") pod \"memcached-0\" (UID: \"144466b4-e5d6-4cca-b67b-b76dc24f8dea\") " pod="openstack/memcached-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.862376 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.896742 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/144466b4-e5d6-4cca-b67b-b76dc24f8dea-config-data\") pod \"memcached-0\" (UID: \"144466b4-e5d6-4cca-b67b-b76dc24f8dea\") " pod="openstack/memcached-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.896812 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/144466b4-e5d6-4cca-b67b-b76dc24f8dea-combined-ca-bundle\") pod \"memcached-0\" (UID: \"144466b4-e5d6-4cca-b67b-b76dc24f8dea\") " pod="openstack/memcached-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.896921 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/144466b4-e5d6-4cca-b67b-b76dc24f8dea-memcached-tls-certs\") pod \"memcached-0\" (UID: \"144466b4-e5d6-4cca-b67b-b76dc24f8dea\") " pod="openstack/memcached-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.896995 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvxd6\" (UniqueName: \"kubernetes.io/projected/144466b4-e5d6-4cca-b67b-b76dc24f8dea-kube-api-access-tvxd6\") pod \"memcached-0\" (UID: \"144466b4-e5d6-4cca-b67b-b76dc24f8dea\") " pod="openstack/memcached-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.897043 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/144466b4-e5d6-4cca-b67b-b76dc24f8dea-kolla-config\") pod \"memcached-0\" (UID: \"144466b4-e5d6-4cca-b67b-b76dc24f8dea\") " pod="openstack/memcached-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.897757 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/144466b4-e5d6-4cca-b67b-b76dc24f8dea-kolla-config\") pod \"memcached-0\" (UID: \"144466b4-e5d6-4cca-b67b-b76dc24f8dea\") " pod="openstack/memcached-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.898235 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/144466b4-e5d6-4cca-b67b-b76dc24f8dea-config-data\") pod \"memcached-0\" (UID: \"144466b4-e5d6-4cca-b67b-b76dc24f8dea\") " pod="openstack/memcached-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.919287 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/144466b4-e5d6-4cca-b67b-b76dc24f8dea-memcached-tls-certs\") pod \"memcached-0\" (UID: \"144466b4-e5d6-4cca-b67b-b76dc24f8dea\") " pod="openstack/memcached-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.919402 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/144466b4-e5d6-4cca-b67b-b76dc24f8dea-combined-ca-bundle\") pod \"memcached-0\" (UID: \"144466b4-e5d6-4cca-b67b-b76dc24f8dea\") " pod="openstack/memcached-0"
Mar 20 13:43:17 crc kubenswrapper[4973]: I0320 13:43:17.926653 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvxd6\" (UniqueName: \"kubernetes.io/projected/144466b4-e5d6-4cca-b67b-b76dc24f8dea-kube-api-access-tvxd6\") pod \"memcached-0\" (UID: \"144466b4-e5d6-4cca-b67b-b76dc24f8dea\") " pod="openstack/memcached-0"
Mar 20 13:43:18 crc kubenswrapper[4973]: I0320 13:43:18.106092 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0"
Mar 20 13:43:20 crc kubenswrapper[4973]: I0320 13:43:20.542221 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 13:43:20 crc kubenswrapper[4973]: I0320 13:43:20.554059 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 20 13:43:20 crc kubenswrapper[4973]: I0320 13:43:20.560455 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-97ps4"
Mar 20 13:43:20 crc kubenswrapper[4973]: I0320 13:43:20.563807 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 13:43:20 crc kubenswrapper[4973]: I0320 13:43:20.577940 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzlzm\" (UniqueName: \"kubernetes.io/projected/4d08d0fa-8d0e-412d-b657-3af016e5c0d1-kube-api-access-xzlzm\") pod \"kube-state-metrics-0\" (UID: \"4d08d0fa-8d0e-412d-b657-3af016e5c0d1\") " pod="openstack/kube-state-metrics-0"
Mar 20 13:43:20 crc kubenswrapper[4973]: I0320 13:43:20.687724 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzlzm\" (UniqueName: \"kubernetes.io/projected/4d08d0fa-8d0e-412d-b657-3af016e5c0d1-kube-api-access-xzlzm\") pod \"kube-state-metrics-0\" (UID: \"4d08d0fa-8d0e-412d-b657-3af016e5c0d1\") " pod="openstack/kube-state-metrics-0"
Mar 20 13:43:20 crc kubenswrapper[4973]: I0320 13:43:20.755514 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzlzm\" (UniqueName: \"kubernetes.io/projected/4d08d0fa-8d0e-412d-b657-3af016e5c0d1-kube-api-access-xzlzm\") pod \"kube-state-metrics-0\" (UID: \"4d08d0fa-8d0e-412d-b657-3af016e5c0d1\") " pod="openstack/kube-state-metrics-0"
Mar 20 13:43:20 crc kubenswrapper[4973]: I0320 13:43:20.781216 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 20 13:43:20 crc kubenswrapper[4973]: I0320 13:43:20.894494 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.447580 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-ui-dashboards-7f87b9b85b-l6hmv"]
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.449326 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-l6hmv"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.453412 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.456775 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-ui-dashboards-sa-dockercfg-h86rx"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.483222 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7f87b9b85b-l6hmv"]
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.506077 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/863783c6-0106-43c7-b097-35c4f30db388-serving-cert\") pod \"observability-ui-dashboards-7f87b9b85b-l6hmv\" (UID: \"863783c6-0106-43c7-b097-35c4f30db388\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-l6hmv"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.506558 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2975\" (UniqueName: \"kubernetes.io/projected/863783c6-0106-43c7-b097-35c4f30db388-kube-api-access-l2975\") pod \"observability-ui-dashboards-7f87b9b85b-l6hmv\" (UID: \"863783c6-0106-43c7-b097-35c4f30db388\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-l6hmv"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.609301 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/863783c6-0106-43c7-b097-35c4f30db388-serving-cert\") pod \"observability-ui-dashboards-7f87b9b85b-l6hmv\" (UID: \"863783c6-0106-43c7-b097-35c4f30db388\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-l6hmv"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.609426 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2975\" (UniqueName: \"kubernetes.io/projected/863783c6-0106-43c7-b097-35c4f30db388-kube-api-access-l2975\") pod \"observability-ui-dashboards-7f87b9b85b-l6hmv\" (UID: \"863783c6-0106-43c7-b097-35c4f30db388\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-l6hmv"
Mar 20 13:43:21 crc kubenswrapper[4973]: E0320 13:43:21.609916 4973 secret.go:188] Couldn't get secret openshift-operators/observability-ui-dashboards: secret "observability-ui-dashboards" not found
Mar 20 13:43:21 crc kubenswrapper[4973]: E0320 13:43:21.609969 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/863783c6-0106-43c7-b097-35c4f30db388-serving-cert podName:863783c6-0106-43c7-b097-35c4f30db388 nodeName:}" failed. No retries permitted until 2026-03-20 13:43:22.10994872 +0000 UTC m=+1322.853618474 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/863783c6-0106-43c7-b097-35c4f30db388-serving-cert") pod "observability-ui-dashboards-7f87b9b85b-l6hmv" (UID: "863783c6-0106-43c7-b097-35c4f30db388") : secret "observability-ui-dashboards" not found
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.657546 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2975\" (UniqueName: \"kubernetes.io/projected/863783c6-0106-43c7-b097-35c4f30db388-kube-api-access-l2975\") pod \"observability-ui-dashboards-7f87b9b85b-l6hmv\" (UID: \"863783c6-0106-43c7-b097-35c4f30db388\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-l6hmv"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.802229 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7b5fcf6d5d-lqlbt"]
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.804050 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b5fcf6d5d-lqlbt"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.816243 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b5fcf6d5d-lqlbt"]
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.880854 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.888292 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.900713 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.900794 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.900910 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.900951 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.901060 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.901081 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-rhx8x"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.901134 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.901209 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.924185 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/642c51b4-2774-4114-808c-1fb722862437-config\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.924237 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/642c51b4-2774-4114-808c-1fb722862437-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.924262 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-trusted-ca-bundle\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.924283 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-38da6b63-5881-4301-941a-df15b37fb4c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38da6b63-5881-4301-941a-df15b37fb4c2\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.924376 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-oauth-serving-cert\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.924498 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-console-config\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.924526 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/642c51b4-2774-4114-808c-1fb722862437-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.924553 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/642c51b4-2774-4114-808c-1fb722862437-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.924705 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/642c51b4-2774-4114-808c-1fb722862437-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.924747 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/642c51b4-2774-4114-808c-1fb722862437-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.924782 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhdp9\" (UniqueName: \"kubernetes.io/projected/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-kube-api-access-xhdp9\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.924800 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-service-ca\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.924823 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-console-serving-cert\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.924853 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpb49\" (UniqueName: \"kubernetes.io/projected/642c51b4-2774-4114-808c-1fb722862437-kube-api-access-gpb49\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.924884 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-console-oauth-config\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.924908 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/642c51b4-2774-4114-808c-1fb722862437-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.924933 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/642c51b4-2774-4114-808c-1fb722862437-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:21 crc kubenswrapper[4973]: I0320 13:43:21.990518 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.028306 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/642c51b4-2774-4114-808c-1fb722862437-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.028394 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/642c51b4-2774-4114-808c-1fb722862437-config\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.028422 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/642c51b4-2774-4114-808c-1fb722862437-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.028446 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-trusted-ca-bundle\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt"
Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.028466 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-38da6b63-5881-4301-941a-df15b37fb4c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38da6b63-5881-4301-941a-df15b37fb4c2\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.028491 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-oauth-serving-cert\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt"
Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.028539 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-console-config\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt"
Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.028558 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/642c51b4-2774-4114-808c-1fb722862437-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.028580 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/642c51b4-2774-4114-808c-1fb722862437-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.028666 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/642c51b4-2774-4114-808c-1fb722862437-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.028695 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/642c51b4-2774-4114-808c-1fb722862437-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.028728 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhdp9\" (UniqueName: \"kubernetes.io/projected/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-kube-api-access-xhdp9\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt"
Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.028745 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-service-ca\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt"
Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.028773 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-console-serving-cert\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt"
Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.028814 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpb49\" (UniqueName: \"kubernetes.io/projected/642c51b4-2774-4114-808c-1fb722862437-kube-api-access-gpb49\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.028861 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-console-oauth-config\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt"
Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.028886 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/642c51b4-2774-4114-808c-1fb722862437-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.029280 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume
\"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/642c51b4-2774-4114-808c-1fb722862437-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.029816 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-oauth-serving-cert\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.030303 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-service-ca\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.030611 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-console-config\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.030730 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-trusted-ca-bundle\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.031049 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" 
(UniqueName: \"kubernetes.io/configmap/642c51b4-2774-4114-808c-1fb722862437-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.033598 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/642c51b4-2774-4114-808c-1fb722862437-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.041269 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/642c51b4-2774-4114-808c-1fb722862437-config\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.055856 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-console-serving-cert\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.056624 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/642c51b4-2774-4114-808c-1fb722862437-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.057369 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" 
(UniqueName: \"kubernetes.io/secret/642c51b4-2774-4114-808c-1fb722862437-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.057571 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/642c51b4-2774-4114-808c-1fb722862437-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.057833 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/642c51b4-2774-4114-808c-1fb722862437-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.070226 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-console-oauth-config\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.074091 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhdp9\" (UniqueName: \"kubernetes.io/projected/2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7-kube-api-access-xhdp9\") pod \"console-7b5fcf6d5d-lqlbt\" (UID: \"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7\") " pod="openshift-console/console-7b5fcf6d5d-lqlbt" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.074905 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpb49\" (UniqueName: 
\"kubernetes.io/projected/642c51b4-2774-4114-808c-1fb722862437-kube-api-access-gpb49\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.130415 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7b5fcf6d5d-lqlbt" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.131055 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/863783c6-0106-43c7-b097-35c4f30db388-serving-cert\") pod \"observability-ui-dashboards-7f87b9b85b-l6hmv\" (UID: \"863783c6-0106-43c7-b097-35c4f30db388\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-l6hmv" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.141249 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.141539 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-38da6b63-5881-4301-941a-df15b37fb4c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38da6b63-5881-4301-941a-df15b37fb4c2\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9fc2caf26902e4acb0c152b4b6832175eae9a7bf975678e3cfa2391449f57e79/globalmount\"" pod="openstack/prometheus-metric-storage-0" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.165007 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/863783c6-0106-43c7-b097-35c4f30db388-serving-cert\") pod \"observability-ui-dashboards-7f87b9b85b-l6hmv\" (UID: \"863783c6-0106-43c7-b097-35c4f30db388\") " pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-l6hmv" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.211667 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-38da6b63-5881-4301-941a-df15b37fb4c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38da6b63-5881-4301-941a-df15b37fb4c2\") pod \"prometheus-metric-storage-0\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " pod="openstack/prometheus-metric-storage-0" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.217478 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 20 13:43:22 crc kubenswrapper[4973]: I0320 13:43:22.390867 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-l6hmv" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.222843 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-x2nll"] Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.224673 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.227361 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.227848 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.234528 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x2nll"] Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.235001 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-lkwbj" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.249718 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-gvrbc"] Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.252142 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.253653 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f2c7b535-ad26-4bf4-848b-26890c0eb580-var-log-ovn\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.253686 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f2c7b535-ad26-4bf4-848b-26890c0eb580-var-run-ovn\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.253733 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2c7b535-ad26-4bf4-848b-26890c0eb580-scripts\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.253776 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2nv6\" (UniqueName: \"kubernetes.io/projected/f2c7b535-ad26-4bf4-848b-26890c0eb580-kube-api-access-z2nv6\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.253809 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f2c7b535-ad26-4bf4-848b-26890c0eb580-var-run\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" 
Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.253842 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c7b535-ad26-4bf4-848b-26890c0eb580-ovn-controller-tls-certs\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.253900 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c7b535-ad26-4bf4-848b-26890c0eb580-combined-ca-bundle\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.256212 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gvrbc"] Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.356323 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dpb9\" (UniqueName: \"kubernetes.io/projected/3809b124-46bb-42ba-a467-279857c61ef6-kube-api-access-7dpb9\") pod \"ovn-controller-ovs-gvrbc\" (UID: \"3809b124-46bb-42ba-a467-279857c61ef6\") " pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.357062 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3809b124-46bb-42ba-a467-279857c61ef6-var-log\") pod \"ovn-controller-ovs-gvrbc\" (UID: \"3809b124-46bb-42ba-a467-279857c61ef6\") " pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.357189 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: 
\"kubernetes.io/host-path/3809b124-46bb-42ba-a467-279857c61ef6-var-lib\") pod \"ovn-controller-ovs-gvrbc\" (UID: \"3809b124-46bb-42ba-a467-279857c61ef6\") " pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.357302 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f2c7b535-ad26-4bf4-848b-26890c0eb580-var-log-ovn\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.357508 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f2c7b535-ad26-4bf4-848b-26890c0eb580-var-run-ovn\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.357996 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3809b124-46bb-42ba-a467-279857c61ef6-scripts\") pod \"ovn-controller-ovs-gvrbc\" (UID: \"3809b124-46bb-42ba-a467-279857c61ef6\") " pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.358259 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2c7b535-ad26-4bf4-848b-26890c0eb580-scripts\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.358417 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f2c7b535-ad26-4bf4-848b-26890c0eb580-var-run-ovn\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " 
pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.358441 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f2c7b535-ad26-4bf4-848b-26890c0eb580-var-log-ovn\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.358667 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z2nv6\" (UniqueName: \"kubernetes.io/projected/f2c7b535-ad26-4bf4-848b-26890c0eb580-kube-api-access-z2nv6\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.358833 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f2c7b535-ad26-4bf4-848b-26890c0eb580-var-run\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.358943 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f2c7b535-ad26-4bf4-848b-26890c0eb580-var-run\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.358952 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3809b124-46bb-42ba-a467-279857c61ef6-etc-ovs\") pod \"ovn-controller-ovs-gvrbc\" (UID: \"3809b124-46bb-42ba-a467-279857c61ef6\") " pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.359157 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c7b535-ad26-4bf4-848b-26890c0eb580-ovn-controller-tls-certs\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.359277 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3809b124-46bb-42ba-a467-279857c61ef6-var-run\") pod \"ovn-controller-ovs-gvrbc\" (UID: \"3809b124-46bb-42ba-a467-279857c61ef6\") " pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.359417 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c7b535-ad26-4bf4-848b-26890c0eb580-combined-ca-bundle\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.360722 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f2c7b535-ad26-4bf4-848b-26890c0eb580-scripts\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.364899 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2c7b535-ad26-4bf4-848b-26890c0eb580-combined-ca-bundle\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.366437 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f2c7b535-ad26-4bf4-848b-26890c0eb580-ovn-controller-tls-certs\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.376600 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2nv6\" (UniqueName: \"kubernetes.io/projected/f2c7b535-ad26-4bf4-848b-26890c0eb580-kube-api-access-z2nv6\") pod \"ovn-controller-x2nll\" (UID: \"f2c7b535-ad26-4bf4-848b-26890c0eb580\") " pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.460994 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dpb9\" (UniqueName: \"kubernetes.io/projected/3809b124-46bb-42ba-a467-279857c61ef6-kube-api-access-7dpb9\") pod \"ovn-controller-ovs-gvrbc\" (UID: \"3809b124-46bb-42ba-a467-279857c61ef6\") " pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.462451 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3809b124-46bb-42ba-a467-279857c61ef6-var-log\") pod \"ovn-controller-ovs-gvrbc\" (UID: \"3809b124-46bb-42ba-a467-279857c61ef6\") " pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.462627 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3809b124-46bb-42ba-a467-279857c61ef6-var-lib\") pod \"ovn-controller-ovs-gvrbc\" (UID: \"3809b124-46bb-42ba-a467-279857c61ef6\") " pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.462675 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3809b124-46bb-42ba-a467-279857c61ef6-var-log\") pod \"ovn-controller-ovs-gvrbc\" (UID: 
\"3809b124-46bb-42ba-a467-279857c61ef6\") " pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.462830 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3809b124-46bb-42ba-a467-279857c61ef6-scripts\") pod \"ovn-controller-ovs-gvrbc\" (UID: \"3809b124-46bb-42ba-a467-279857c61ef6\") " pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.463190 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3809b124-46bb-42ba-a467-279857c61ef6-etc-ovs\") pod \"ovn-controller-ovs-gvrbc\" (UID: \"3809b124-46bb-42ba-a467-279857c61ef6\") " pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.463406 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3809b124-46bb-42ba-a467-279857c61ef6-var-run\") pod \"ovn-controller-ovs-gvrbc\" (UID: \"3809b124-46bb-42ba-a467-279857c61ef6\") " pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.462844 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/3809b124-46bb-42ba-a467-279857c61ef6-var-lib\") pod \"ovn-controller-ovs-gvrbc\" (UID: \"3809b124-46bb-42ba-a467-279857c61ef6\") " pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.463878 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/3809b124-46bb-42ba-a467-279857c61ef6-etc-ovs\") pod \"ovn-controller-ovs-gvrbc\" (UID: \"3809b124-46bb-42ba-a467-279857c61ef6\") " pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.463946 4973 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/3809b124-46bb-42ba-a467-279857c61ef6-var-run\") pod \"ovn-controller-ovs-gvrbc\" (UID: \"3809b124-46bb-42ba-a467-279857c61ef6\") " pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.467759 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/3809b124-46bb-42ba-a467-279857c61ef6-scripts\") pod \"ovn-controller-ovs-gvrbc\" (UID: \"3809b124-46bb-42ba-a467-279857c61ef6\") " pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.481146 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dpb9\" (UniqueName: \"kubernetes.io/projected/3809b124-46bb-42ba-a467-279857c61ef6-kube-api-access-7dpb9\") pod \"ovn-controller-ovs-gvrbc\" (UID: \"3809b124-46bb-42ba-a467-279857c61ef6\") " pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.594905 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x2nll" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.649888 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.872943 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.875614 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.877920 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.878525 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.878548 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.879236 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-wfps4" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.879958 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 20 13:43:23 crc kubenswrapper[4973]: I0320 13:43:23.888233 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.076649 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feb13208-e959-4cb3-8d6f-185bf075036c-config\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.076781 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb13208-e959-4cb3-8d6f-185bf075036c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.076935 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-dlj4x\" (UniqueName: \"kubernetes.io/projected/feb13208-e959-4cb3-8d6f-185bf075036c-kube-api-access-dlj4x\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.077035 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/feb13208-e959-4cb3-8d6f-185bf075036c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.077083 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb13208-e959-4cb3-8d6f-185bf075036c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.077370 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-1d758f90-9a0d-4025-b848-67682a0a4fb0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1d758f90-9a0d-4025-b848-67682a0a4fb0\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.077500 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/feb13208-e959-4cb3-8d6f-185bf075036c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.077634 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb13208-e959-4cb3-8d6f-185bf075036c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.181014 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/feb13208-e959-4cb3-8d6f-185bf075036c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.181397 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb13208-e959-4cb3-8d6f-185bf075036c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.181418 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feb13208-e959-4cb3-8d6f-185bf075036c-config\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.181490 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb13208-e959-4cb3-8d6f-185bf075036c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.181517 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlj4x\" (UniqueName: \"kubernetes.io/projected/feb13208-e959-4cb3-8d6f-185bf075036c-kube-api-access-dlj4x\") pod \"ovsdbserver-nb-0\" (UID: 
\"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.181544 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/feb13208-e959-4cb3-8d6f-185bf075036c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.181565 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb13208-e959-4cb3-8d6f-185bf075036c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.181631 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-1d758f90-9a0d-4025-b848-67682a0a4fb0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1d758f90-9a0d-4025-b848-67682a0a4fb0\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.181671 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/feb13208-e959-4cb3-8d6f-185bf075036c-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.183481 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/feb13208-e959-4cb3-8d6f-185bf075036c-config\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.184208 4973 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/feb13208-e959-4cb3-8d6f-185bf075036c-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.187099 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb13208-e959-4cb3-8d6f-185bf075036c-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.187124 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/feb13208-e959-4cb3-8d6f-185bf075036c-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.188001 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/feb13208-e959-4cb3-8d6f-185bf075036c-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.192669 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.192715 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-1d758f90-9a0d-4025-b848-67682a0a4fb0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1d758f90-9a0d-4025-b848-67682a0a4fb0\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/ca8931c96f19fa815c657acbb326ddf280469375de49847edd3fe5fea11d914c/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.204467 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlj4x\" (UniqueName: \"kubernetes.io/projected/feb13208-e959-4cb3-8d6f-185bf075036c-kube-api-access-dlj4x\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.239312 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-1d758f90-9a0d-4025-b848-67682a0a4fb0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-1d758f90-9a0d-4025-b848-67682a0a4fb0\") pod \"ovsdbserver-nb-0\" (UID: \"feb13208-e959-4cb3-8d6f-185bf075036c\") " pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:24 crc kubenswrapper[4973]: I0320 13:43:24.505075 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: W0320 13:43:27.303855 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20780ec2_d338_45a4_9259_16a651e46e55.slice/crio-a171d65528ccf508aec93ce498b6de689df033ad309b08561c3b732cb72f2011 WatchSource:0}: Error finding container a171d65528ccf508aec93ce498b6de689df033ad309b08561c3b732cb72f2011: Status 404 returned error can't find the container with id a171d65528ccf508aec93ce498b6de689df033ad309b08561c3b732cb72f2011 Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.466472 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"20780ec2-d338-45a4-9259-16a651e46e55","Type":"ContainerStarted","Data":"a171d65528ccf508aec93ce498b6de689df033ad309b08561c3b732cb72f2011"} Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.496629 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.498365 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.501075 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.501657 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.501794 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-58564" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.501902 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.508750 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.659107 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.659231 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-6f358d24-0897-4a10-aa64-fe05cba6ac26\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6f358d24-0897-4a10-aa64-fe05cba6ac26\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.659378 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.659423 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z29g5\" (UniqueName: \"kubernetes.io/projected/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-kube-api-access-z29g5\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.659552 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-config\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.659608 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.659647 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.659692 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-ovsdb-rundir\") 
pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.761934 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.762014 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-6f358d24-0897-4a10-aa64-fe05cba6ac26\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6f358d24-0897-4a10-aa64-fe05cba6ac26\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.762094 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.762126 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z29g5\" (UniqueName: \"kubernetes.io/projected/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-kube-api-access-z29g5\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.762163 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-config\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 
20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.762182 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.762203 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.762225 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.762886 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.763881 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.764364 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-config\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.765179 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.765216 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-6f358d24-0897-4a10-aa64-fe05cba6ac26\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6f358d24-0897-4a10-aa64-fe05cba6ac26\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/0893a590b8e58ab6e196a8d36cded887592facdd8920b7a3d23deade10555766/globalmount\"" pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.770274 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.770782 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.775117 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.782729 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z29g5\" (UniqueName: \"kubernetes.io/projected/2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a-kube-api-access-z29g5\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.806004 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-6f358d24-0897-4a10-aa64-fe05cba6ac26\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-6f358d24-0897-4a10-aa64-fe05cba6ac26\") pod \"ovsdbserver-sb-0\" (UID: \"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a\") " pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:27 crc kubenswrapper[4973]: I0320 13:43:27.815292 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 20 13:43:34 crc kubenswrapper[4973]: I0320 13:43:34.727690 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 20 13:43:34 crc kubenswrapper[4973]: I0320 13:43:34.871347 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 20 13:43:35 crc kubenswrapper[4973]: E0320 13:43:35.195144 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 13:43:35 crc kubenswrapper[4973]: E0320 13:43:35.195687 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces 
--listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qf9k9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-5nwgs_openstack(12680dc2-8341-4d82-bf13-e292d9898d90): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:43:35 crc kubenswrapper[4973]: E0320 13:43:35.197584 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-5nwgs" podUID="12680dc2-8341-4d82-bf13-e292d9898d90" Mar 20 13:43:35 crc kubenswrapper[4973]: W0320 13:43:35.234490 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod144466b4_e5d6_4cca_b67b_b76dc24f8dea.slice/crio-8f5badc1f405a49a32c8e7e568763cb6d0c2ef07cd60eb776c3c5df210262714 WatchSource:0}: Error finding container 8f5badc1f405a49a32c8e7e568763cb6d0c2ef07cd60eb776c3c5df210262714: Status 404 returned error can't find the container with id 8f5badc1f405a49a32c8e7e568763cb6d0c2ef07cd60eb776c3c5df210262714 Mar 20 13:43:35 crc kubenswrapper[4973]: W0320 13:43:35.245266 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bdbb8eb_c36d_43f0_a705_3b3e59128b7f.slice/crio-dab72d4d4a060e4fe38884ca894d7514bf4b637d9ed871abdb7274fe313dbd89 WatchSource:0}: Error finding container dab72d4d4a060e4fe38884ca894d7514bf4b637d9ed871abdb7274fe313dbd89: Status 404 returned error can't find the container with id dab72d4d4a060e4fe38884ca894d7514bf4b637d9ed871abdb7274fe313dbd89 Mar 20 13:43:35 crc kubenswrapper[4973]: E0320 13:43:35.263419 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 13:43:35 crc kubenswrapper[4973]: E0320 13:43:35.263573 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) 
--port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zznvc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-rxsvm_openstack(6ee928ab-23ef-4f3e-a025-7576302eb628): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 
13:43:35 crc kubenswrapper[4973]: E0320 13:43:35.264944 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-rxsvm" podUID="6ee928ab-23ef-4f3e-a025-7576302eb628" Mar 20 13:43:35 crc kubenswrapper[4973]: E0320 13:43:35.271303 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 13:43:35 crc kubenswrapper[4973]: E0320 13:43:35.271476 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s67rc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-tjbs5_openstack(f2200265-2c2e-4f89-ae21-e065a8ec1288): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:43:35 crc kubenswrapper[4973]: E0320 13:43:35.273612 4973 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-tjbs5" podUID="f2200265-2c2e-4f89-ae21-e065a8ec1288" Mar 20 13:43:35 crc kubenswrapper[4973]: I0320 13:43:35.535126 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"144466b4-e5d6-4cca-b67b-b76dc24f8dea","Type":"ContainerStarted","Data":"8f5badc1f405a49a32c8e7e568763cb6d0c2ef07cd60eb776c3c5df210262714"} Mar 20 13:43:35 crc kubenswrapper[4973]: I0320 13:43:35.536834 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f","Type":"ContainerStarted","Data":"dab72d4d4a060e4fe38884ca894d7514bf4b637d9ed871abdb7274fe313dbd89"} Mar 20 13:43:35 crc kubenswrapper[4973]: E0320 13:43:35.540624 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-rxsvm" podUID="6ee928ab-23ef-4f3e-a025-7576302eb628" Mar 20 13:43:35 crc kubenswrapper[4973]: I0320 13:43:35.601586 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.059415 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5nwgs" Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.069018 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tjbs5" Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.170225 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2200265-2c2e-4f89-ae21-e065a8ec1288-config\") pod \"f2200265-2c2e-4f89-ae21-e065a8ec1288\" (UID: \"f2200265-2c2e-4f89-ae21-e065a8ec1288\") " Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.170277 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf9k9\" (UniqueName: \"kubernetes.io/projected/12680dc2-8341-4d82-bf13-e292d9898d90-kube-api-access-qf9k9\") pod \"12680dc2-8341-4d82-bf13-e292d9898d90\" (UID: \"12680dc2-8341-4d82-bf13-e292d9898d90\") " Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.170370 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12680dc2-8341-4d82-bf13-e292d9898d90-config\") pod \"12680dc2-8341-4d82-bf13-e292d9898d90\" (UID: \"12680dc2-8341-4d82-bf13-e292d9898d90\") " Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.170481 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s67rc\" (UniqueName: \"kubernetes.io/projected/f2200265-2c2e-4f89-ae21-e065a8ec1288-kube-api-access-s67rc\") pod \"f2200265-2c2e-4f89-ae21-e065a8ec1288\" (UID: \"f2200265-2c2e-4f89-ae21-e065a8ec1288\") " Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.170518 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2200265-2c2e-4f89-ae21-e065a8ec1288-dns-svc\") pod \"f2200265-2c2e-4f89-ae21-e065a8ec1288\" (UID: \"f2200265-2c2e-4f89-ae21-e065a8ec1288\") " Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.173390 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/12680dc2-8341-4d82-bf13-e292d9898d90-config" (OuterVolumeSpecName: "config") pod "12680dc2-8341-4d82-bf13-e292d9898d90" (UID: "12680dc2-8341-4d82-bf13-e292d9898d90"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.174132 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2200265-2c2e-4f89-ae21-e065a8ec1288-config" (OuterVolumeSpecName: "config") pod "f2200265-2c2e-4f89-ae21-e065a8ec1288" (UID: "f2200265-2c2e-4f89-ae21-e065a8ec1288"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.178757 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2200265-2c2e-4f89-ae21-e065a8ec1288-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f2200265-2c2e-4f89-ae21-e065a8ec1288" (UID: "f2200265-2c2e-4f89-ae21-e065a8ec1288"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.187570 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2200265-2c2e-4f89-ae21-e065a8ec1288-kube-api-access-s67rc" (OuterVolumeSpecName: "kube-api-access-s67rc") pod "f2200265-2c2e-4f89-ae21-e065a8ec1288" (UID: "f2200265-2c2e-4f89-ae21-e065a8ec1288"). InnerVolumeSpecName "kube-api-access-s67rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.188567 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12680dc2-8341-4d82-bf13-e292d9898d90-kube-api-access-qf9k9" (OuterVolumeSpecName: "kube-api-access-qf9k9") pod "12680dc2-8341-4d82-bf13-e292d9898d90" (UID: "12680dc2-8341-4d82-bf13-e292d9898d90"). InnerVolumeSpecName "kube-api-access-qf9k9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.273207 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12680dc2-8341-4d82-bf13-e292d9898d90-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.273238 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s67rc\" (UniqueName: \"kubernetes.io/projected/f2200265-2c2e-4f89-ae21-e065a8ec1288-kube-api-access-s67rc\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.273249 4973 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f2200265-2c2e-4f89-ae21-e065a8ec1288-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.273259 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2200265-2c2e-4f89-ae21-e065a8ec1288-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.273268 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qf9k9\" (UniqueName: \"kubernetes.io/projected/12680dc2-8341-4d82-bf13-e292d9898d90-kube-api-access-qf9k9\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.572774 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"27fed14c-9051-4d46-80d5-badf224805a9","Type":"ContainerStarted","Data":"6e9c9c09d7bb13ec94767270e74576aba098ca695d9d45d689f86bb6a74a21a4"} Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.577806 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-tjbs5" event={"ID":"f2200265-2c2e-4f89-ae21-e065a8ec1288","Type":"ContainerDied","Data":"3e1ba7e3289395fafbf125318a2e32b24ebd859583ac4d25974ba19bebdec515"} Mar 
20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.577840 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tjbs5" Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.582000 4973 generic.go:334] "Generic (PLEG): container finished" podID="c513f134-e86b-47ac-9d13-5b92975be947" containerID="7767ee741ad44167df01fca209ddb897e2520004858f64bde7d50c3e6b7dd04a" exitCode=0 Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.582112 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hzr72" event={"ID":"c513f134-e86b-47ac-9d13-5b92975be947","Type":"ContainerDied","Data":"7767ee741ad44167df01fca209ddb897e2520004858f64bde7d50c3e6b7dd04a"} Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.592974 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-5nwgs" event={"ID":"12680dc2-8341-4d82-bf13-e292d9898d90","Type":"ContainerDied","Data":"7eca94781d61cc8b8ac3516e18f4d2b9e87128c6af73b3b55f34b97f2bedc4cf"} Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.593001 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-5nwgs" Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.652148 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tjbs5"] Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.659043 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tjbs5"] Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.706383 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5nwgs"] Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.719552 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-5nwgs"] Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.939825 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-ui-dashboards-7f87b9b85b-l6hmv"] Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.994080 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12680dc2-8341-4d82-bf13-e292d9898d90" path="/var/lib/kubelet/pods/12680dc2-8341-4d82-bf13-e292d9898d90/volumes" Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.994566 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2200265-2c2e-4f89-ae21-e065a8ec1288" path="/var/lib/kubelet/pods/f2200265-2c2e-4f89-ae21-e065a8ec1288/volumes" Mar 20 13:43:37 crc kubenswrapper[4973]: I0320 13:43:37.995013 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"] Mar 20 13:43:38 crc kubenswrapper[4973]: I0320 13:43:38.004944 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:43:38 crc kubenswrapper[4973]: I0320 13:43:38.041660 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 20 13:43:38 crc kubenswrapper[4973]: I0320 13:43:38.050806 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/rabbitmq-server-1"] Mar 20 13:43:38 crc kubenswrapper[4973]: W0320 13:43:38.296800 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod863783c6_0106_43c7_b097_35c4f30db388.slice/crio-ce3740ddcce949b1fd9ed6e31adce30b27b3aa49e7cbb864e65f0b869695f17c WatchSource:0}: Error finding container ce3740ddcce949b1fd9ed6e31adce30b27b3aa49e7cbb864e65f0b869695f17c: Status 404 returned error can't find the container with id ce3740ddcce949b1fd9ed6e31adce30b27b3aa49e7cbb864e65f0b869695f17c Mar 20 13:43:38 crc kubenswrapper[4973]: W0320 13:43:38.303129 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod797b38f5_d9a7_4f82_bd12_e40e021ef28e.slice/crio-c88911f10d54bcd424c533ed9559a1a1dd6ad0e4525526d2be28da8bb9fb5cb9 WatchSource:0}: Error finding container c88911f10d54bcd424c533ed9559a1a1dd6ad0e4525526d2be28da8bb9fb5cb9: Status 404 returned error can't find the container with id c88911f10d54bcd424c533ed9559a1a1dd6ad0e4525526d2be28da8bb9fb5cb9 Mar 20 13:43:38 crc kubenswrapper[4973]: W0320 13:43:38.313818 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ed60638_5022_406b_b568_7fa0d6bf4ba8.slice/crio-45a0ee277eb1a8e5d445d9edeb1429a0a44c51aaa2702d64b321a8fa5e60ea0d WatchSource:0}: Error finding container 45a0ee277eb1a8e5d445d9edeb1429a0a44c51aaa2702d64b321a8fa5e60ea0d: Status 404 returned error can't find the container with id 45a0ee277eb1a8e5d445d9edeb1429a0a44c51aaa2702d64b321a8fa5e60ea0d Mar 20 13:43:38 crc kubenswrapper[4973]: W0320 13:43:38.318997 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96de22e2_f61c_4f75_8faa_9a0591aa0f38.slice/crio-14149fd1a8a3c68cdbd420943b49e4787cc4084fc7a74204535c3a549168e610 WatchSource:0}: Error finding 
container 14149fd1a8a3c68cdbd420943b49e4787cc4084fc7a74204535c3a549168e610: Status 404 returned error can't find the container with id 14149fd1a8a3c68cdbd420943b49e4787cc4084fc7a74204535c3a549168e610 Mar 20 13:43:38 crc kubenswrapper[4973]: I0320 13:43:38.478935 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x2nll"] Mar 20 13:43:38 crc kubenswrapper[4973]: I0320 13:43:38.493645 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7b5fcf6d5d-lqlbt"] Mar 20 13:43:38 crc kubenswrapper[4973]: I0320 13:43:38.603644 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hzr72" event={"ID":"c513f134-e86b-47ac-9d13-5b92975be947","Type":"ContainerStarted","Data":"f808256dc23d67b39e8559ba701d28a699d976e4be355909a0e5235924b7b558"} Mar 20 13:43:38 crc kubenswrapper[4973]: I0320 13:43:38.603713 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-hzr72" Mar 20 13:43:38 crc kubenswrapper[4973]: I0320 13:43:38.607500 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"642c51b4-2774-4114-808c-1fb722862437","Type":"ContainerStarted","Data":"8b1e9828042ee52bb66810736a8d1325dd64830aeaedfb2a3ebe46bbc8f62a7f"} Mar 20 13:43:38 crc kubenswrapper[4973]: I0320 13:43:38.610029 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"797b38f5-d9a7-4f82-bd12-e40e021ef28e","Type":"ContainerStarted","Data":"c88911f10d54bcd424c533ed9559a1a1dd6ad0e4525526d2be28da8bb9fb5cb9"} Mar 20 13:43:38 crc kubenswrapper[4973]: I0320 13:43:38.611294 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"4ed60638-5022-406b-b568-7fa0d6bf4ba8","Type":"ContainerStarted","Data":"45a0ee277eb1a8e5d445d9edeb1429a0a44c51aaa2702d64b321a8fa5e60ea0d"} Mar 20 13:43:38 crc kubenswrapper[4973]: I0320 
13:43:38.612368 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-l6hmv" event={"ID":"863783c6-0106-43c7-b097-35c4f30db388","Type":"ContainerStarted","Data":"ce3740ddcce949b1fd9ed6e31adce30b27b3aa49e7cbb864e65f0b869695f17c"} Mar 20 13:43:38 crc kubenswrapper[4973]: I0320 13:43:38.628316 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"20780ec2-d338-45a4-9259-16a651e46e55","Type":"ContainerStarted","Data":"4fe78d802a26e68c3351b6b1daff5a6fc354aeb5b49e215fb4795ef080c317df"} Mar 20 13:43:38 crc kubenswrapper[4973]: I0320 13:43:38.633909 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"96de22e2-f61c-4f75-8faa-9a0591aa0f38","Type":"ContainerStarted","Data":"14149fd1a8a3c68cdbd420943b49e4787cc4084fc7a74204535c3a549168e610"} Mar 20 13:43:38 crc kubenswrapper[4973]: I0320 13:43:38.660865 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-hzr72" podStartSLOduration=3.568162742 podStartE2EDuration="25.660842464s" podCreationTimestamp="2026-03-20 13:43:13 +0000 UTC" firstStartedPulling="2026-03-20 13:43:14.929531673 +0000 UTC m=+1315.673201417" lastFinishedPulling="2026-03-20 13:43:37.022211395 +0000 UTC m=+1337.765881139" observedRunningTime="2026-03-20 13:43:38.63228122 +0000 UTC m=+1339.375950974" watchObservedRunningTime="2026-03-20 13:43:38.660842464 +0000 UTC m=+1339.404512208" Mar 20 13:43:38 crc kubenswrapper[4973]: I0320 13:43:38.712327 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:43:38 crc kubenswrapper[4973]: I0320 13:43:38.867535 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 20 13:43:39 crc kubenswrapper[4973]: I0320 13:43:39.382767 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 
20 13:43:39 crc kubenswrapper[4973]: W0320 13:43:39.503774 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfeb13208_e959_4cb3_8d6f_185bf075036c.slice/crio-836c052fc1c5917ffdbe10d0723f1f21a8016c7f5e78e6270b7e7f0f03fa9882 WatchSource:0}: Error finding container 836c052fc1c5917ffdbe10d0723f1f21a8016c7f5e78e6270b7e7f0f03fa9882: Status 404 returned error can't find the container with id 836c052fc1c5917ffdbe10d0723f1f21a8016c7f5e78e6270b7e7f0f03fa9882 Mar 20 13:43:39 crc kubenswrapper[4973]: I0320 13:43:39.599519 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-gvrbc"] Mar 20 13:43:39 crc kubenswrapper[4973]: W0320 13:43:39.614393 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3809b124_46bb_42ba_a467_279857c61ef6.slice/crio-9c8d990cde12a029b7561ee609ec69e9b6a7b56f8d1c01ac95fc080af9c61224 WatchSource:0}: Error finding container 9c8d990cde12a029b7561ee609ec69e9b6a7b56f8d1c01ac95fc080af9c61224: Status 404 returned error can't find the container with id 9c8d990cde12a029b7561ee609ec69e9b6a7b56f8d1c01ac95fc080af9c61224 Mar 20 13:43:39 crc kubenswrapper[4973]: I0320 13:43:39.653223 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gvrbc" event={"ID":"3809b124-46bb-42ba-a467-279857c61ef6","Type":"ContainerStarted","Data":"9c8d990cde12a029b7561ee609ec69e9b6a7b56f8d1c01ac95fc080af9c61224"} Mar 20 13:43:39 crc kubenswrapper[4973]: I0320 13:43:39.654739 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a","Type":"ContainerStarted","Data":"354a0ed34b93bf57bd75a1644a04ed56a8774d663e94550641a3973cdfa6a4f2"} Mar 20 13:43:39 crc kubenswrapper[4973]: I0320 13:43:39.656979 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-console/console-7b5fcf6d5d-lqlbt" event={"ID":"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7","Type":"ContainerStarted","Data":"fd365b3f9bae704cda5f9dc395f07aa28b879aa3eebf0d00fa5992b869d6ee78"} Mar 20 13:43:39 crc kubenswrapper[4973]: I0320 13:43:39.659705 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x2nll" event={"ID":"f2c7b535-ad26-4bf4-848b-26890c0eb580","Type":"ContainerStarted","Data":"41cb32f123b5daec2517db66d7f4fbe0c0914844cd1adf3a3fa88bb16f283f5d"} Mar 20 13:43:39 crc kubenswrapper[4973]: I0320 13:43:39.661108 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4d08d0fa-8d0e-412d-b657-3af016e5c0d1","Type":"ContainerStarted","Data":"5113571eba9c33b9c22a30a577be48fca98fb1861dbad0cb54ae52c90bb99230"} Mar 20 13:43:39 crc kubenswrapper[4973]: I0320 13:43:39.677461 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"feb13208-e959-4cb3-8d6f-185bf075036c","Type":"ContainerStarted","Data":"836c052fc1c5917ffdbe10d0723f1f21a8016c7f5e78e6270b7e7f0f03fa9882"} Mar 20 13:43:40 crc kubenswrapper[4973]: I0320 13:43:40.675155 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7b5fcf6d5d-lqlbt" event={"ID":"2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7","Type":"ContainerStarted","Data":"0eb48d346aa4bd3990c7fdbd48499ea5fd466b25a21c7568701dcca52ffcce6c"} Mar 20 13:43:40 crc kubenswrapper[4973]: I0320 13:43:40.698097 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7b5fcf6d5d-lqlbt" podStartSLOduration=19.698078297 podStartE2EDuration="19.698078297s" podCreationTimestamp="2026-03-20 13:43:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:40.692471481 +0000 UTC m=+1341.436141225" watchObservedRunningTime="2026-03-20 13:43:40.698078297 +0000 UTC 
m=+1341.441748041" Mar 20 13:43:41 crc kubenswrapper[4973]: I0320 13:43:41.685777 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"4ed60638-5022-406b-b568-7fa0d6bf4ba8","Type":"ContainerStarted","Data":"1197fb893b7f76b1e4555b8d0ff5bfaca6b9fd60a6146cd1c20b9f45d87f3162"} Mar 20 13:43:41 crc kubenswrapper[4973]: I0320 13:43:41.688854 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"96de22e2-f61c-4f75-8faa-9a0591aa0f38","Type":"ContainerStarted","Data":"2f909be8e4577c3906dd8fe036f0cec95e6cfd7584a8f17b0b28bb39aa6d6a30"} Mar 20 13:43:41 crc kubenswrapper[4973]: I0320 13:43:41.690871 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"797b38f5-d9a7-4f82-bd12-e40e021ef28e","Type":"ContainerStarted","Data":"43a6b2817e43c4fa48cce3c4bb7ae49ce43b81d1e67941c6e11676db49de2111"} Mar 20 13:43:42 crc kubenswrapper[4973]: I0320 13:43:42.131818 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7b5fcf6d5d-lqlbt" Mar 20 13:43:42 crc kubenswrapper[4973]: I0320 13:43:42.131862 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7b5fcf6d5d-lqlbt" Mar 20 13:43:42 crc kubenswrapper[4973]: I0320 13:43:42.138443 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7b5fcf6d5d-lqlbt" Mar 20 13:43:42 crc kubenswrapper[4973]: I0320 13:43:42.711774 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"144466b4-e5d6-4cca-b67b-b76dc24f8dea","Type":"ContainerStarted","Data":"e11b7717a37bfe45fa600c3e544c68a02ab42df68a903e1368018d62864b42f4"} Mar 20 13:43:42 crc kubenswrapper[4973]: I0320 13:43:42.715260 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7b5fcf6d5d-lqlbt" Mar 20 13:43:42 crc 
kubenswrapper[4973]: I0320 13:43:42.742630 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=21.418052952 podStartE2EDuration="25.74260669s" podCreationTimestamp="2026-03-20 13:43:17 +0000 UTC" firstStartedPulling="2026-03-20 13:43:35.239004238 +0000 UTC m=+1335.982673982" lastFinishedPulling="2026-03-20 13:43:39.563557986 +0000 UTC m=+1340.307227720" observedRunningTime="2026-03-20 13:43:42.728877333 +0000 UTC m=+1343.472547097" watchObservedRunningTime="2026-03-20 13:43:42.74260669 +0000 UTC m=+1343.486276434" Mar 20 13:43:42 crc kubenswrapper[4973]: I0320 13:43:42.792172 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7794c74589-2c6lq"] Mar 20 13:43:43 crc kubenswrapper[4973]: I0320 13:43:43.107296 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 20 13:43:43 crc kubenswrapper[4973]: I0320 13:43:43.321226 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:43:43 crc kubenswrapper[4973]: I0320 13:43:43.321281 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:43:45 crc kubenswrapper[4973]: I0320 13:43:45.184154 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-hzr72" Mar 20 13:43:45 crc kubenswrapper[4973]: I0320 13:43:45.394694 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5ccc8479f9-rxsvm"] Mar 20 13:43:46 crc kubenswrapper[4973]: I0320 13:43:46.547583 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-rxsvm" Mar 20 13:43:46 crc kubenswrapper[4973]: I0320 13:43:46.659919 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ee928ab-23ef-4f3e-a025-7576302eb628-dns-svc\") pod \"6ee928ab-23ef-4f3e-a025-7576302eb628\" (UID: \"6ee928ab-23ef-4f3e-a025-7576302eb628\") " Mar 20 13:43:46 crc kubenswrapper[4973]: I0320 13:43:46.660445 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ee928ab-23ef-4f3e-a025-7576302eb628-config\") pod \"6ee928ab-23ef-4f3e-a025-7576302eb628\" (UID: \"6ee928ab-23ef-4f3e-a025-7576302eb628\") " Mar 20 13:43:46 crc kubenswrapper[4973]: I0320 13:43:46.660544 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zznvc\" (UniqueName: \"kubernetes.io/projected/6ee928ab-23ef-4f3e-a025-7576302eb628-kube-api-access-zznvc\") pod \"6ee928ab-23ef-4f3e-a025-7576302eb628\" (UID: \"6ee928ab-23ef-4f3e-a025-7576302eb628\") " Mar 20 13:43:46 crc kubenswrapper[4973]: I0320 13:43:46.660659 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ee928ab-23ef-4f3e-a025-7576302eb628-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6ee928ab-23ef-4f3e-a025-7576302eb628" (UID: "6ee928ab-23ef-4f3e-a025-7576302eb628"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:46 crc kubenswrapper[4973]: I0320 13:43:46.660965 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ee928ab-23ef-4f3e-a025-7576302eb628-config" (OuterVolumeSpecName: "config") pod "6ee928ab-23ef-4f3e-a025-7576302eb628" (UID: "6ee928ab-23ef-4f3e-a025-7576302eb628"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:43:46 crc kubenswrapper[4973]: I0320 13:43:46.661359 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ee928ab-23ef-4f3e-a025-7576302eb628-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:46 crc kubenswrapper[4973]: I0320 13:43:46.661377 4973 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6ee928ab-23ef-4f3e-a025-7576302eb628-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:46 crc kubenswrapper[4973]: I0320 13:43:46.664376 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee928ab-23ef-4f3e-a025-7576302eb628-kube-api-access-zznvc" (OuterVolumeSpecName: "kube-api-access-zznvc") pod "6ee928ab-23ef-4f3e-a025-7576302eb628" (UID: "6ee928ab-23ef-4f3e-a025-7576302eb628"). InnerVolumeSpecName "kube-api-access-zznvc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:43:46 crc kubenswrapper[4973]: I0320 13:43:46.769509 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zznvc\" (UniqueName: \"kubernetes.io/projected/6ee928ab-23ef-4f3e-a025-7576302eb628-kube-api-access-zznvc\") on node \"crc\" DevicePath \"\"" Mar 20 13:43:47 crc kubenswrapper[4973]: I0320 13:43:47.312596 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-rxsvm" Mar 20 13:43:47 crc kubenswrapper[4973]: I0320 13:43:47.312562 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-rxsvm" event={"ID":"6ee928ab-23ef-4f3e-a025-7576302eb628","Type":"ContainerDied","Data":"a41f5b2c634978ced4693d157e576f8f7a29f03548bfc7254cf91b8a31c739c1"} Mar 20 13:43:47 crc kubenswrapper[4973]: I0320 13:43:47.380696 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-rxsvm"] Mar 20 13:43:47 crc kubenswrapper[4973]: I0320 13:43:47.394885 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-rxsvm"] Mar 20 13:43:47 crc kubenswrapper[4973]: I0320 13:43:47.970661 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ee928ab-23ef-4f3e-a025-7576302eb628" path="/var/lib/kubelet/pods/6ee928ab-23ef-4f3e-a025-7576302eb628/volumes" Mar 20 13:43:48 crc kubenswrapper[4973]: I0320 13:43:48.107494 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 20 13:43:49 crc kubenswrapper[4973]: I0320 13:43:49.333564 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gvrbc" event={"ID":"3809b124-46bb-42ba-a467-279857c61ef6","Type":"ContainerStarted","Data":"3913f10b3a368f1c38396f01b2936afb6894aa42cf78dd8843133ac5d14cace2"} Mar 20 13:43:50 crc kubenswrapper[4973]: I0320 13:43:50.764597 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-9jcwr"] Mar 20 13:43:50 crc kubenswrapper[4973]: I0320 13:43:50.767190 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" Mar 20 13:43:50 crc kubenswrapper[4973]: I0320 13:43:50.778412 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-9jcwr"] Mar 20 13:43:50 crc kubenswrapper[4973]: I0320 13:43:50.858494 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e324e333-c4ed-44f5-abc5-0ee4083027eb-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-9jcwr\" (UID: \"e324e333-c4ed-44f5-abc5-0ee4083027eb\") " pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" Mar 20 13:43:50 crc kubenswrapper[4973]: I0320 13:43:50.858902 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4sdw\" (UniqueName: \"kubernetes.io/projected/e324e333-c4ed-44f5-abc5-0ee4083027eb-kube-api-access-l4sdw\") pod \"dnsmasq-dns-7cb5889db5-9jcwr\" (UID: \"e324e333-c4ed-44f5-abc5-0ee4083027eb\") " pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" Mar 20 13:43:50 crc kubenswrapper[4973]: I0320 13:43:50.859160 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e324e333-c4ed-44f5-abc5-0ee4083027eb-config\") pod \"dnsmasq-dns-7cb5889db5-9jcwr\" (UID: \"e324e333-c4ed-44f5-abc5-0ee4083027eb\") " pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" Mar 20 13:43:50 crc kubenswrapper[4973]: I0320 13:43:50.960893 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e324e333-c4ed-44f5-abc5-0ee4083027eb-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-9jcwr\" (UID: \"e324e333-c4ed-44f5-abc5-0ee4083027eb\") " pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" Mar 20 13:43:50 crc kubenswrapper[4973]: I0320 13:43:50.961045 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4sdw\" (UniqueName: 
\"kubernetes.io/projected/e324e333-c4ed-44f5-abc5-0ee4083027eb-kube-api-access-l4sdw\") pod \"dnsmasq-dns-7cb5889db5-9jcwr\" (UID: \"e324e333-c4ed-44f5-abc5-0ee4083027eb\") " pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" Mar 20 13:43:50 crc kubenswrapper[4973]: I0320 13:43:50.961125 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e324e333-c4ed-44f5-abc5-0ee4083027eb-config\") pod \"dnsmasq-dns-7cb5889db5-9jcwr\" (UID: \"e324e333-c4ed-44f5-abc5-0ee4083027eb\") " pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" Mar 20 13:43:50 crc kubenswrapper[4973]: I0320 13:43:50.962092 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e324e333-c4ed-44f5-abc5-0ee4083027eb-config\") pod \"dnsmasq-dns-7cb5889db5-9jcwr\" (UID: \"e324e333-c4ed-44f5-abc5-0ee4083027eb\") " pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" Mar 20 13:43:50 crc kubenswrapper[4973]: I0320 13:43:50.962130 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e324e333-c4ed-44f5-abc5-0ee4083027eb-dns-svc\") pod \"dnsmasq-dns-7cb5889db5-9jcwr\" (UID: \"e324e333-c4ed-44f5-abc5-0ee4083027eb\") " pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" Mar 20 13:43:51 crc kubenswrapper[4973]: I0320 13:43:51.083923 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4sdw\" (UniqueName: \"kubernetes.io/projected/e324e333-c4ed-44f5-abc5-0ee4083027eb-kube-api-access-l4sdw\") pod \"dnsmasq-dns-7cb5889db5-9jcwr\" (UID: \"e324e333-c4ed-44f5-abc5-0ee4083027eb\") " pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" Mar 20 13:43:51 crc kubenswrapper[4973]: I0320 13:43:51.092011 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" Mar 20 13:43:51 crc kubenswrapper[4973]: I0320 13:43:51.873981 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 20 13:43:51 crc kubenswrapper[4973]: I0320 13:43:51.881232 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 20 13:43:51 crc kubenswrapper[4973]: I0320 13:43:51.883509 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 20 13:43:51 crc kubenswrapper[4973]: I0320 13:43:51.883532 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 20 13:43:51 crc kubenswrapper[4973]: I0320 13:43:51.883607 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-vqncv" Mar 20 13:43:51 crc kubenswrapper[4973]: I0320 13:43:51.884396 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 20 13:43:51 crc kubenswrapper[4973]: I0320 13:43:51.895471 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 20 13:43:51 crc kubenswrapper[4973]: I0320 13:43:51.981146 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:51 crc kubenswrapper[4973]: I0320 13:43:51.981201 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-cache\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:51 crc kubenswrapper[4973]: 
I0320 13:43:51.981276 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:51 crc kubenswrapper[4973]: I0320 13:43:51.981315 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-lock\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:51 crc kubenswrapper[4973]: I0320 13:43:51.981442 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76nh6\" (UniqueName: \"kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-kube-api-access-76nh6\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:51 crc kubenswrapper[4973]: I0320 13:43:51.981513 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-dade3888-3a59-4dbf-9967-035d1a8ce18d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dade3888-3a59-4dbf-9967-035d1a8ce18d\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:52 crc kubenswrapper[4973]: I0320 13:43:52.083538 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:52 crc kubenswrapper[4973]: I0320 13:43:52.083664 4973 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-lock\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:52 crc kubenswrapper[4973]: I0320 13:43:52.083734 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76nh6\" (UniqueName: \"kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-kube-api-access-76nh6\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:52 crc kubenswrapper[4973]: E0320 13:43:52.083780 4973 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:43:52 crc kubenswrapper[4973]: E0320 13:43:52.083821 4973 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:43:52 crc kubenswrapper[4973]: E0320 13:43:52.083881 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift podName:8184c0e7-f9ef-48a3-9461-5cc6c1188e6b nodeName:}" failed. No retries permitted until 2026-03-20 13:43:52.583859714 +0000 UTC m=+1353.327529548 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift") pod "swift-storage-0" (UID: "8184c0e7-f9ef-48a3-9461-5cc6c1188e6b") : configmap "swift-ring-files" not found Mar 20 13:43:52 crc kubenswrapper[4973]: I0320 13:43:52.083787 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-dade3888-3a59-4dbf-9967-035d1a8ce18d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dade3888-3a59-4dbf-9967-035d1a8ce18d\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:52 crc kubenswrapper[4973]: I0320 13:43:52.084213 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-lock\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:52 crc kubenswrapper[4973]: I0320 13:43:52.084535 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:52 crc kubenswrapper[4973]: I0320 13:43:52.084591 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-cache\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:52 crc kubenswrapper[4973]: I0320 13:43:52.085014 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-cache\") pod \"swift-storage-0\" (UID: 
\"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:52 crc kubenswrapper[4973]: I0320 13:43:52.090550 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:52 crc kubenswrapper[4973]: I0320 13:43:52.091394 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:43:52 crc kubenswrapper[4973]: I0320 13:43:52.091417 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-dade3888-3a59-4dbf-9967-035d1a8ce18d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dade3888-3a59-4dbf-9967-035d1a8ce18d\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/40fa5372b2204494199ba8c8d984d285463ad68a1654a6e924fec5a5d8b397a4/globalmount\"" pod="openstack/swift-storage-0" Mar 20 13:43:52 crc kubenswrapper[4973]: I0320 13:43:52.103505 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76nh6\" (UniqueName: \"kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-kube-api-access-76nh6\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:52 crc kubenswrapper[4973]: I0320 13:43:52.130759 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-dade3888-3a59-4dbf-9967-035d1a8ce18d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-dade3888-3a59-4dbf-9967-035d1a8ce18d\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:52 crc 
kubenswrapper[4973]: I0320 13:43:52.593545 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:52 crc kubenswrapper[4973]: E0320 13:43:52.593698 4973 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:43:52 crc kubenswrapper[4973]: E0320 13:43:52.593723 4973 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:43:52 crc kubenswrapper[4973]: E0320 13:43:52.593770 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift podName:8184c0e7-f9ef-48a3-9461-5cc6c1188e6b nodeName:}" failed. No retries permitted until 2026-03-20 13:43:53.593755963 +0000 UTC m=+1354.337425707 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift") pod "swift-storage-0" (UID: "8184c0e7-f9ef-48a3-9461-5cc6c1188e6b") : configmap "swift-ring-files" not found Mar 20 13:43:53 crc kubenswrapper[4973]: I0320 13:43:53.363605 4973 generic.go:334] "Generic (PLEG): container finished" podID="3809b124-46bb-42ba-a467-279857c61ef6" containerID="3913f10b3a368f1c38396f01b2936afb6894aa42cf78dd8843133ac5d14cace2" exitCode=0 Mar 20 13:43:53 crc kubenswrapper[4973]: I0320 13:43:53.363677 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gvrbc" event={"ID":"3809b124-46bb-42ba-a467-279857c61ef6","Type":"ContainerDied","Data":"3913f10b3a368f1c38396f01b2936afb6894aa42cf78dd8843133ac5d14cace2"} Mar 20 13:43:53 crc kubenswrapper[4973]: I0320 13:43:53.376573 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"642c51b4-2774-4114-808c-1fb722862437","Type":"ContainerStarted","Data":"85fdf994d82452b154747e1cea74fc781b71f5de0915d878d8caa20f3828abc5"} Mar 20 13:43:53 crc kubenswrapper[4973]: I0320 13:43:53.614988 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:53 crc kubenswrapper[4973]: E0320 13:43:53.615170 4973 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:43:53 crc kubenswrapper[4973]: E0320 13:43:53.615749 4973 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:43:53 crc kubenswrapper[4973]: E0320 13:43:53.615813 4973 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift podName:8184c0e7-f9ef-48a3-9461-5cc6c1188e6b nodeName:}" failed. No retries permitted until 2026-03-20 13:43:55.615789138 +0000 UTC m=+1356.359458882 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift") pod "swift-storage-0" (UID: "8184c0e7-f9ef-48a3-9461-5cc6c1188e6b") : configmap "swift-ring-files" not found Mar 20 13:43:54 crc kubenswrapper[4973]: I0320 13:43:54.392643 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"27fed14c-9051-4d46-80d5-badf224805a9","Type":"ContainerStarted","Data":"311fd32ac792a70179f6e4321b5251d3c6ee0bcb503e0545c3f5985b013f687d"} Mar 20 13:43:54 crc kubenswrapper[4973]: I0320 13:43:54.397134 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x2nll" event={"ID":"f2c7b535-ad26-4bf4-848b-26890c0eb580","Type":"ContainerStarted","Data":"355fe00955b567198b3b0892936c92ca630a83dab5f20f8e8012dd35de88e93c"} Mar 20 13:43:54 crc kubenswrapper[4973]: I0320 13:43:54.397253 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-x2nll" Mar 20 13:43:54 crc kubenswrapper[4973]: I0320 13:43:54.400089 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f","Type":"ContainerStarted","Data":"13c5fb3c41d0b367a6cd9312b5eeed0e0c4dd5e9a1cc39778d081d53499be6f1"} Mar 20 13:43:54 crc kubenswrapper[4973]: I0320 13:43:54.409590 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-l6hmv" event={"ID":"863783c6-0106-43c7-b097-35c4f30db388","Type":"ContainerStarted","Data":"5f460bdf447cc0587eeee9bd4b374987e06b84de248a4b7b4c00e983710e9a14"} Mar 20 13:43:54 crc kubenswrapper[4973]: I0320 
13:43:54.417904 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"feb13208-e959-4cb3-8d6f-185bf075036c","Type":"ContainerStarted","Data":"efa2825f0741b3b3ff66fff589dfee459778de3465c786fd13a113a0269de24c"} Mar 20 13:43:54 crc kubenswrapper[4973]: I0320 13:43:54.429066 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-9jcwr"] Mar 20 13:43:54 crc kubenswrapper[4973]: I0320 13:43:54.473746 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-ui-dashboards-7f87b9b85b-l6hmv" podStartSLOduration=24.661909308 podStartE2EDuration="33.473721876s" podCreationTimestamp="2026-03-20 13:43:21 +0000 UTC" firstStartedPulling="2026-03-20 13:43:38.313764629 +0000 UTC m=+1339.057434373" lastFinishedPulling="2026-03-20 13:43:47.125577197 +0000 UTC m=+1347.869246941" observedRunningTime="2026-03-20 13:43:54.441018126 +0000 UTC m=+1355.184687870" watchObservedRunningTime="2026-03-20 13:43:54.473721876 +0000 UTC m=+1355.217391620" Mar 20 13:43:54 crc kubenswrapper[4973]: I0320 13:43:54.487078 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-x2nll" podStartSLOduration=23.327772582 podStartE2EDuration="31.486986081s" podCreationTimestamp="2026-03-20 13:43:23 +0000 UTC" firstStartedPulling="2026-03-20 13:43:39.488527285 +0000 UTC m=+1340.232197029" lastFinishedPulling="2026-03-20 13:43:47.647740784 +0000 UTC m=+1348.391410528" observedRunningTime="2026-03-20 13:43:54.467291139 +0000 UTC m=+1355.210960903" watchObservedRunningTime="2026-03-20 13:43:54.486986081 +0000 UTC m=+1355.230655825" Mar 20 13:43:55 crc kubenswrapper[4973]: I0320 13:43:55.666813 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift\") pod \"swift-storage-0\" (UID: 
\"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:55 crc kubenswrapper[4973]: E0320 13:43:55.666999 4973 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:43:55 crc kubenswrapper[4973]: E0320 13:43:55.667591 4973 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:43:55 crc kubenswrapper[4973]: E0320 13:43:55.667648 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift podName:8184c0e7-f9ef-48a3-9461-5cc6c1188e6b nodeName:}" failed. No retries permitted until 2026-03-20 13:43:59.66762942 +0000 UTC m=+1360.411299164 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift") pod "swift-storage-0" (UID: "8184c0e7-f9ef-48a3-9461-5cc6c1188e6b") : configmap "swift-ring-files" not found Mar 20 13:43:55 crc kubenswrapper[4973]: I0320 13:43:55.858589 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-qjvlf"] Mar 20 13:43:55 crc kubenswrapper[4973]: I0320 13:43:55.859942 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:55 crc kubenswrapper[4973]: I0320 13:43:55.866325 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 13:43:55 crc kubenswrapper[4973]: I0320 13:43:55.866638 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 20 13:43:55 crc kubenswrapper[4973]: I0320 13:43:55.868061 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 20 13:43:55 crc kubenswrapper[4973]: I0320 13:43:55.934160 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qjvlf"] Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.002772 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p8qj\" (UniqueName: \"kubernetes.io/projected/dbc7e778-9029-42ce-9a8e-e76636aad6a5-kube-api-access-5p8qj\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.003121 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dbc7e778-9029-42ce-9a8e-e76636aad6a5-dispersionconf\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.003286 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dbc7e778-9029-42ce-9a8e-e76636aad6a5-etc-swift\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 
13:43:56.003403 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc7e778-9029-42ce-9a8e-e76636aad6a5-combined-ca-bundle\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.003565 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbc7e778-9029-42ce-9a8e-e76636aad6a5-scripts\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.003600 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dbc7e778-9029-42ce-9a8e-e76636aad6a5-ring-data-devices\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.003662 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dbc7e778-9029-42ce-9a8e-e76636aad6a5-swiftconf\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.105480 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dbc7e778-9029-42ce-9a8e-e76636aad6a5-swiftconf\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.105825 4973 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p8qj\" (UniqueName: \"kubernetes.io/projected/dbc7e778-9029-42ce-9a8e-e76636aad6a5-kube-api-access-5p8qj\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.105900 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dbc7e778-9029-42ce-9a8e-e76636aad6a5-dispersionconf\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.105942 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dbc7e778-9029-42ce-9a8e-e76636aad6a5-etc-swift\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.105981 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc7e778-9029-42ce-9a8e-e76636aad6a5-combined-ca-bundle\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.106019 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbc7e778-9029-42ce-9a8e-e76636aad6a5-scripts\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.106035 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dbc7e778-9029-42ce-9a8e-e76636aad6a5-ring-data-devices\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.106781 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dbc7e778-9029-42ce-9a8e-e76636aad6a5-ring-data-devices\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.107094 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dbc7e778-9029-42ce-9a8e-e76636aad6a5-etc-swift\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.107662 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbc7e778-9029-42ce-9a8e-e76636aad6a5-scripts\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.111962 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dbc7e778-9029-42ce-9a8e-e76636aad6a5-swiftconf\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.112225 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dbc7e778-9029-42ce-9a8e-e76636aad6a5-dispersionconf\") pod 
\"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.112821 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc7e778-9029-42ce-9a8e-e76636aad6a5-combined-ca-bundle\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.126015 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p8qj\" (UniqueName: \"kubernetes.io/projected/dbc7e778-9029-42ce-9a8e-e76636aad6a5-kube-api-access-5p8qj\") pod \"swift-ring-rebalance-qjvlf\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.262792 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.463460 4973 generic.go:334] "Generic (PLEG): container finished" podID="e324e333-c4ed-44f5-abc5-0ee4083027eb" containerID="af5781a711f799e3009428c39379e5691c00fa06d55896f42a46404b5df0af2d" exitCode=0 Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.463809 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" event={"ID":"e324e333-c4ed-44f5-abc5-0ee4083027eb","Type":"ContainerDied","Data":"af5781a711f799e3009428c39379e5691c00fa06d55896f42a46404b5df0af2d"} Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.463846 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" event={"ID":"e324e333-c4ed-44f5-abc5-0ee4083027eb","Type":"ContainerStarted","Data":"3d53830d21a3c218183322b1d7819cc6551106444a7f42303b2992ee1f35dcd9"} Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.472619 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4d08d0fa-8d0e-412d-b657-3af016e5c0d1","Type":"ContainerStarted","Data":"a0a18060aa59c7fdc5f65165d3080ab833f126993bdd47953b7effb4b67e5261"} Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.472676 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.476783 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gvrbc" event={"ID":"3809b124-46bb-42ba-a467-279857c61ef6","Type":"ContainerStarted","Data":"b54492a127a6ea363c105fbb4f89d8f78d5dd056ad109c8b57309524d8293540"} Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.477458 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.477615 4973 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.487885 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a","Type":"ContainerStarted","Data":"b6668f18e05e1fc32ef362103660106ae38fa0760c540169fbcdbb6fe3b7ca96"} Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.553594 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=20.386188356 podStartE2EDuration="36.553572037s" podCreationTimestamp="2026-03-20 13:43:20 +0000 UTC" firstStartedPulling="2026-03-20 13:43:39.534753257 +0000 UTC m=+1340.278423001" lastFinishedPulling="2026-03-20 13:43:55.702136938 +0000 UTC m=+1356.445806682" observedRunningTime="2026-03-20 13:43:56.513939326 +0000 UTC m=+1357.257609070" watchObservedRunningTime="2026-03-20 13:43:56.553572037 +0000 UTC m=+1357.297241781" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.559040 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-gvrbc" podStartSLOduration=25.559357538 podStartE2EDuration="33.559025219s" podCreationTimestamp="2026-03-20 13:43:23 +0000 UTC" firstStartedPulling="2026-03-20 13:43:39.618093864 +0000 UTC m=+1340.361763608" lastFinishedPulling="2026-03-20 13:43:47.617761545 +0000 UTC m=+1348.361431289" observedRunningTime="2026-03-20 13:43:56.534406409 +0000 UTC m=+1357.278076163" watchObservedRunningTime="2026-03-20 13:43:56.559025219 +0000 UTC m=+1357.302694963" Mar 20 13:43:56 crc kubenswrapper[4973]: I0320 13:43:56.739636 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-qjvlf"] Mar 20 13:43:56 crc kubenswrapper[4973]: W0320 13:43:56.761123 4973 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddbc7e778_9029_42ce_9a8e_e76636aad6a5.slice/crio-6d2d9cbba70b40d11f35eaeff96830c8b0b223472e33ba69d6245757221204ae WatchSource:0}: Error finding container 6d2d9cbba70b40d11f35eaeff96830c8b0b223472e33ba69d6245757221204ae: Status 404 returned error can't find the container with id 6d2d9cbba70b40d11f35eaeff96830c8b0b223472e33ba69d6245757221204ae Mar 20 13:43:57 crc kubenswrapper[4973]: I0320 13:43:57.500097 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-gvrbc" event={"ID":"3809b124-46bb-42ba-a467-279857c61ef6","Type":"ContainerStarted","Data":"fc5bf137d3dbb9344b78ef3e1b0700fd1c1e468f2c7e3486f42a737c493154f3"} Mar 20 13:43:57 crc kubenswrapper[4973]: I0320 13:43:57.505805 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" event={"ID":"e324e333-c4ed-44f5-abc5-0ee4083027eb","Type":"ContainerStarted","Data":"c9ea00565b7792f08df0951efad1b2003b631c4e010521a32cc7cc75fb9c366f"} Mar 20 13:43:57 crc kubenswrapper[4973]: I0320 13:43:57.506844 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" Mar 20 13:43:57 crc kubenswrapper[4973]: I0320 13:43:57.508463 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qjvlf" event={"ID":"dbc7e778-9029-42ce-9a8e-e76636aad6a5","Type":"ContainerStarted","Data":"6d2d9cbba70b40d11f35eaeff96830c8b0b223472e33ba69d6245757221204ae"} Mar 20 13:43:57 crc kubenswrapper[4973]: I0320 13:43:57.526135 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" podStartSLOduration=7.526115725 podStartE2EDuration="7.526115725s" podCreationTimestamp="2026-03-20 13:43:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:43:57.525083719 +0000 UTC 
m=+1358.268753473" watchObservedRunningTime="2026-03-20 13:43:57.526115725 +0000 UTC m=+1358.269785469" Mar 20 13:43:58 crc kubenswrapper[4973]: I0320 13:43:58.521487 4973 generic.go:334] "Generic (PLEG): container finished" podID="642c51b4-2774-4114-808c-1fb722862437" containerID="85fdf994d82452b154747e1cea74fc781b71f5de0915d878d8caa20f3828abc5" exitCode=0 Mar 20 13:43:58 crc kubenswrapper[4973]: I0320 13:43:58.521584 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"642c51b4-2774-4114-808c-1fb722862437","Type":"ContainerDied","Data":"85fdf994d82452b154747e1cea74fc781b71f5de0915d878d8caa20f3828abc5"} Mar 20 13:43:59 crc kubenswrapper[4973]: I0320 13:43:59.533684 4973 generic.go:334] "Generic (PLEG): container finished" podID="27fed14c-9051-4d46-80d5-badf224805a9" containerID="311fd32ac792a70179f6e4321b5251d3c6ee0bcb503e0545c3f5985b013f687d" exitCode=0 Mar 20 13:43:59 crc kubenswrapper[4973]: I0320 13:43:59.534108 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"27fed14c-9051-4d46-80d5-badf224805a9","Type":"ContainerDied","Data":"311fd32ac792a70179f6e4321b5251d3c6ee0bcb503e0545c3f5985b013f687d"} Mar 20 13:43:59 crc kubenswrapper[4973]: I0320 13:43:59.536694 4973 generic.go:334] "Generic (PLEG): container finished" podID="4bdbb8eb-c36d-43f0-a705-3b3e59128b7f" containerID="13c5fb3c41d0b367a6cd9312b5eeed0e0c4dd5e9a1cc39778d081d53499be6f1" exitCode=0 Mar 20 13:43:59 crc kubenswrapper[4973]: I0320 13:43:59.536777 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f","Type":"ContainerDied","Data":"13c5fb3c41d0b367a6cd9312b5eeed0e0c4dd5e9a1cc39778d081d53499be6f1"} Mar 20 13:43:59 crc kubenswrapper[4973]: I0320 13:43:59.706436 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:43:59 crc kubenswrapper[4973]: E0320 13:43:59.706605 4973 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:43:59 crc kubenswrapper[4973]: E0320 13:43:59.706629 4973 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:43:59 crc kubenswrapper[4973]: E0320 13:43:59.706677 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift podName:8184c0e7-f9ef-48a3-9461-5cc6c1188e6b nodeName:}" failed. No retries permitted until 2026-03-20 13:44:07.706661251 +0000 UTC m=+1368.450330995 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift") pod "swift-storage-0" (UID: "8184c0e7-f9ef-48a3-9461-5cc6c1188e6b") : configmap "swift-ring-files" not found Mar 20 13:44:00 crc kubenswrapper[4973]: I0320 13:44:00.134751 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566904-l8zrn"] Mar 20 13:44:00 crc kubenswrapper[4973]: I0320 13:44:00.136157 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-l8zrn" Mar 20 13:44:00 crc kubenswrapper[4973]: I0320 13:44:00.138941 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:44:00 crc kubenswrapper[4973]: I0320 13:44:00.139145 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 13:44:00 crc kubenswrapper[4973]: I0320 13:44:00.139272 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:44:00 crc kubenswrapper[4973]: I0320 13:44:00.147122 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-l8zrn"] Mar 20 13:44:00 crc kubenswrapper[4973]: I0320 13:44:00.216816 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5th59\" (UniqueName: \"kubernetes.io/projected/0592b321-65ba-4ab3-987b-8384ec9ee7e2-kube-api-access-5th59\") pod \"auto-csr-approver-29566904-l8zrn\" (UID: \"0592b321-65ba-4ab3-987b-8384ec9ee7e2\") " pod="openshift-infra/auto-csr-approver-29566904-l8zrn" Mar 20 13:44:00 crc kubenswrapper[4973]: I0320 13:44:00.319268 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5th59\" (UniqueName: \"kubernetes.io/projected/0592b321-65ba-4ab3-987b-8384ec9ee7e2-kube-api-access-5th59\") pod \"auto-csr-approver-29566904-l8zrn\" (UID: \"0592b321-65ba-4ab3-987b-8384ec9ee7e2\") " pod="openshift-infra/auto-csr-approver-29566904-l8zrn" Mar 20 13:44:00 crc kubenswrapper[4973]: I0320 13:44:00.346014 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5th59\" (UniqueName: \"kubernetes.io/projected/0592b321-65ba-4ab3-987b-8384ec9ee7e2-kube-api-access-5th59\") pod \"auto-csr-approver-29566904-l8zrn\" (UID: \"0592b321-65ba-4ab3-987b-8384ec9ee7e2\") " 
pod="openshift-infra/auto-csr-approver-29566904-l8zrn" Mar 20 13:44:00 crc kubenswrapper[4973]: I0320 13:44:00.473563 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-l8zrn" Mar 20 13:44:01 crc kubenswrapper[4973]: I0320 13:44:01.093522 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" Mar 20 13:44:01 crc kubenswrapper[4973]: I0320 13:44:01.146837 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hzr72"] Mar 20 13:44:01 crc kubenswrapper[4973]: I0320 13:44:01.147069 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-hzr72" podUID="c513f134-e86b-47ac-9d13-5b92975be947" containerName="dnsmasq-dns" containerID="cri-o://f808256dc23d67b39e8559ba701d28a699d976e4be355909a0e5235924b7b558" gracePeriod=10 Mar 20 13:44:01 crc kubenswrapper[4973]: I0320 13:44:01.557547 4973 generic.go:334] "Generic (PLEG): container finished" podID="c513f134-e86b-47ac-9d13-5b92975be947" containerID="f808256dc23d67b39e8559ba701d28a699d976e4be355909a0e5235924b7b558" exitCode=0 Mar 20 13:44:01 crc kubenswrapper[4973]: I0320 13:44:01.557592 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hzr72" event={"ID":"c513f134-e86b-47ac-9d13-5b92975be947","Type":"ContainerDied","Data":"f808256dc23d67b39e8559ba701d28a699d976e4be355909a0e5235924b7b558"} Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.412764 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hzr72" Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.481454 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c513f134-e86b-47ac-9d13-5b92975be947-dns-svc\") pod \"c513f134-e86b-47ac-9d13-5b92975be947\" (UID: \"c513f134-e86b-47ac-9d13-5b92975be947\") " Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.481523 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c513f134-e86b-47ac-9d13-5b92975be947-config\") pod \"c513f134-e86b-47ac-9d13-5b92975be947\" (UID: \"c513f134-e86b-47ac-9d13-5b92975be947\") " Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.481581 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7m7q\" (UniqueName: \"kubernetes.io/projected/c513f134-e86b-47ac-9d13-5b92975be947-kube-api-access-w7m7q\") pod \"c513f134-e86b-47ac-9d13-5b92975be947\" (UID: \"c513f134-e86b-47ac-9d13-5b92975be947\") " Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.488870 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c513f134-e86b-47ac-9d13-5b92975be947-kube-api-access-w7m7q" (OuterVolumeSpecName: "kube-api-access-w7m7q") pod "c513f134-e86b-47ac-9d13-5b92975be947" (UID: "c513f134-e86b-47ac-9d13-5b92975be947"). InnerVolumeSpecName "kube-api-access-w7m7q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.531571 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-l8zrn"] Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.535233 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c513f134-e86b-47ac-9d13-5b92975be947-config" (OuterVolumeSpecName: "config") pod "c513f134-e86b-47ac-9d13-5b92975be947" (UID: "c513f134-e86b-47ac-9d13-5b92975be947"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:02 crc kubenswrapper[4973]: W0320 13:44:02.554103 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0592b321_65ba_4ab3_987b_8384ec9ee7e2.slice/crio-2569c20df14fc924d1e6c5a27ca58a9beab916f77d25fd662dcfda07a3b90fa2 WatchSource:0}: Error finding container 2569c20df14fc924d1e6c5a27ca58a9beab916f77d25fd662dcfda07a3b90fa2: Status 404 returned error can't find the container with id 2569c20df14fc924d1e6c5a27ca58a9beab916f77d25fd662dcfda07a3b90fa2 Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.565622 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c513f134-e86b-47ac-9d13-5b92975be947-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c513f134-e86b-47ac-9d13-5b92975be947" (UID: "c513f134-e86b-47ac-9d13-5b92975be947"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.583727 4973 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c513f134-e86b-47ac-9d13-5b92975be947-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.583756 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c513f134-e86b-47ac-9d13-5b92975be947-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.583766 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7m7q\" (UniqueName: \"kubernetes.io/projected/c513f134-e86b-47ac-9d13-5b92975be947-kube-api-access-w7m7q\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.586219 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qjvlf" event={"ID":"dbc7e778-9029-42ce-9a8e-e76636aad6a5","Type":"ContainerStarted","Data":"0f4d55f6330cb549a3f06cc7c71a6b1608668efdd56a700cd9417204745cd673"} Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.599386 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"feb13208-e959-4cb3-8d6f-185bf075036c","Type":"ContainerStarted","Data":"8f832cb62148f26e7ea0c47f3be97c9137d5eb9a93fd107cf36b5fca01a104d7"} Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.609274 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-l8zrn" event={"ID":"0592b321-65ba-4ab3-987b-8384ec9ee7e2","Type":"ContainerStarted","Data":"2569c20df14fc924d1e6c5a27ca58a9beab916f77d25fd662dcfda07a3b90fa2"} Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.612441 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a","Type":"ContainerStarted","Data":"6b4c6cad03c15222033a69f8bb5e8982ef2edd302482ab9cb0a4f2c6a8fa5fa4"} Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.612746 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-qjvlf" podStartSLOduration=2.433738805 podStartE2EDuration="7.612725921s" podCreationTimestamp="2026-03-20 13:43:55 +0000 UTC" firstStartedPulling="2026-03-20 13:43:56.767365496 +0000 UTC m=+1357.511035240" lastFinishedPulling="2026-03-20 13:44:01.946352612 +0000 UTC m=+1362.690022356" observedRunningTime="2026-03-20 13:44:02.607718104 +0000 UTC m=+1363.351387848" watchObservedRunningTime="2026-03-20 13:44:02.612725921 +0000 UTC m=+1363.356395665" Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.619857 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"27fed14c-9051-4d46-80d5-badf224805a9","Type":"ContainerStarted","Data":"896517b7ebc700d1ed501fe4fd56b18472ac17a48bafa5e5f2cc8d7a6847cfc7"} Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.622918 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"4bdbb8eb-c36d-43f0-a705-3b3e59128b7f","Type":"ContainerStarted","Data":"adff20d445c38fa3b9813da689f05c15bb1d6f83a131fff94e9388ecad3b7923"} Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.632035 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-hzr72" event={"ID":"c513f134-e86b-47ac-9d13-5b92975be947","Type":"ContainerDied","Data":"1823cd06704869fc3018b78115ce24a8269d73322216c8f4fdf81425ae7130c9"} Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.632100 4973 scope.go:117] "RemoveContainer" containerID="f808256dc23d67b39e8559ba701d28a699d976e4be355909a0e5235924b7b558" Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.632298 4973 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-hzr72" Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.644102 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=18.281313974 podStartE2EDuration="40.644080497s" podCreationTimestamp="2026-03-20 13:43:22 +0000 UTC" firstStartedPulling="2026-03-20 13:43:39.534831999 +0000 UTC m=+1340.278501743" lastFinishedPulling="2026-03-20 13:44:01.897598522 +0000 UTC m=+1362.641268266" observedRunningTime="2026-03-20 13:44:02.629957402 +0000 UTC m=+1363.373627146" watchObservedRunningTime="2026-03-20 13:44:02.644080497 +0000 UTC m=+1363.387750251" Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.655245 4973 scope.go:117] "RemoveContainer" containerID="7767ee741ad44167df01fca209ddb897e2520004858f64bde7d50c3e6b7dd04a" Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.680196 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=37.677835026 podStartE2EDuration="47.680176431s" podCreationTimestamp="2026-03-20 13:43:15 +0000 UTC" firstStartedPulling="2026-03-20 13:43:36.820137361 +0000 UTC m=+1337.563807115" lastFinishedPulling="2026-03-20 13:43:46.822478776 +0000 UTC m=+1347.566148520" observedRunningTime="2026-03-20 13:44:02.650166153 +0000 UTC m=+1363.393835897" watchObservedRunningTime="2026-03-20 13:44:02.680176431 +0000 UTC m=+1363.423846175" Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.694324 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=35.504299961 podStartE2EDuration="46.694289908s" podCreationTimestamp="2026-03-20 13:43:16 +0000 UTC" firstStartedPulling="2026-03-20 13:43:35.260747853 +0000 UTC m=+1336.004417597" lastFinishedPulling="2026-03-20 13:43:46.4507378 +0000 UTC m=+1347.194407544" observedRunningTime="2026-03-20 13:44:02.670169038 
+0000 UTC m=+1363.413838782" watchObservedRunningTime="2026-03-20 13:44:02.694289908 +0000 UTC m=+1363.437959652" Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.726242 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=14.297110116 podStartE2EDuration="36.726221248s" podCreationTimestamp="2026-03-20 13:43:26 +0000 UTC" firstStartedPulling="2026-03-20 13:43:39.486633955 +0000 UTC m=+1340.230303689" lastFinishedPulling="2026-03-20 13:44:01.915745077 +0000 UTC m=+1362.659414821" observedRunningTime="2026-03-20 13:44:02.707498008 +0000 UTC m=+1363.451167752" watchObservedRunningTime="2026-03-20 13:44:02.726221248 +0000 UTC m=+1363.469890992" Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.735767 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hzr72"] Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.743456 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-hzr72"] Mar 20 13:44:02 crc kubenswrapper[4973]: I0320 13:44:02.815890 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 20 13:44:03 crc kubenswrapper[4973]: I0320 13:44:03.507654 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 20 13:44:03 crc kubenswrapper[4973]: I0320 13:44:03.550517 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 20 13:44:03 crc kubenswrapper[4973]: I0320 13:44:03.653072 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 20 13:44:03 crc kubenswrapper[4973]: I0320 13:44:03.690819 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 20 13:44:03 crc kubenswrapper[4973]: I0320 13:44:03.815964 4973 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 20 13:44:03 crc kubenswrapper[4973]: I0320 13:44:03.905770 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.048989 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c513f134-e86b-47ac-9d13-5b92975be947" path="/var/lib/kubelet/pods/c513f134-e86b-47ac-9d13-5b92975be947/volumes" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.057642 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-n2kr8"] Mar 20 13:44:04 crc kubenswrapper[4973]: E0320 13:44:04.058129 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c513f134-e86b-47ac-9d13-5b92975be947" containerName="init" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.058145 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="c513f134-e86b-47ac-9d13-5b92975be947" containerName="init" Mar 20 13:44:04 crc kubenswrapper[4973]: E0320 13:44:04.058163 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c513f134-e86b-47ac-9d13-5b92975be947" containerName="dnsmasq-dns" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.058172 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="c513f134-e86b-47ac-9d13-5b92975be947" containerName="dnsmasq-dns" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.058396 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="c513f134-e86b-47ac-9d13-5b92975be947" containerName="dnsmasq-dns" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.065386 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-n2kr8"] Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.065527 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-n2kr8" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.068023 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.074557 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-l28sq"] Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.080963 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.085589 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.112398 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-l28sq"] Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.131954 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/704789c1-d8a3-4773-8b30-754120b59be4-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-n2kr8\" (UID: \"704789c1-d8a3-4773-8b30-754120b59be4\") " pod="openstack/dnsmasq-dns-74f6f696b9-n2kr8" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.132040 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwpzm\" (UniqueName: \"kubernetes.io/projected/704789c1-d8a3-4773-8b30-754120b59be4-kube-api-access-hwpzm\") pod \"dnsmasq-dns-74f6f696b9-n2kr8\" (UID: \"704789c1-d8a3-4773-8b30-754120b59be4\") " pod="openstack/dnsmasq-dns-74f6f696b9-n2kr8" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.132814 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/704789c1-d8a3-4773-8b30-754120b59be4-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-n2kr8\" (UID: \"704789c1-d8a3-4773-8b30-754120b59be4\") " pod="openstack/dnsmasq-dns-74f6f696b9-n2kr8" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.132861 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/704789c1-d8a3-4773-8b30-754120b59be4-config\") pod \"dnsmasq-dns-74f6f696b9-n2kr8\" (UID: \"704789c1-d8a3-4773-8b30-754120b59be4\") " pod="openstack/dnsmasq-dns-74f6f696b9-n2kr8" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.235167 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2425ebf1-c2bc-4b94-b3aa-473fd4690168-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-l28sq\" (UID: \"2425ebf1-c2bc-4b94-b3aa-473fd4690168\") " pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.235320 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/704789c1-d8a3-4773-8b30-754120b59be4-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-n2kr8\" (UID: \"704789c1-d8a3-4773-8b30-754120b59be4\") " pod="openstack/dnsmasq-dns-74f6f696b9-n2kr8" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.235387 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwpzm\" (UniqueName: \"kubernetes.io/projected/704789c1-d8a3-4773-8b30-754120b59be4-kube-api-access-hwpzm\") pod \"dnsmasq-dns-74f6f696b9-n2kr8\" (UID: \"704789c1-d8a3-4773-8b30-754120b59be4\") " pod="openstack/dnsmasq-dns-74f6f696b9-n2kr8" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.235432 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" 
(UniqueName: \"kubernetes.io/host-path/2425ebf1-c2bc-4b94-b3aa-473fd4690168-ovn-rundir\") pod \"ovn-controller-metrics-l28sq\" (UID: \"2425ebf1-c2bc-4b94-b3aa-473fd4690168\") " pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.235469 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q68qx\" (UniqueName: \"kubernetes.io/projected/2425ebf1-c2bc-4b94-b3aa-473fd4690168-kube-api-access-q68qx\") pod \"ovn-controller-metrics-l28sq\" (UID: \"2425ebf1-c2bc-4b94-b3aa-473fd4690168\") " pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.235526 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2425ebf1-c2bc-4b94-b3aa-473fd4690168-config\") pod \"ovn-controller-metrics-l28sq\" (UID: \"2425ebf1-c2bc-4b94-b3aa-473fd4690168\") " pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.235564 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/704789c1-d8a3-4773-8b30-754120b59be4-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-n2kr8\" (UID: \"704789c1-d8a3-4773-8b30-754120b59be4\") " pod="openstack/dnsmasq-dns-74f6f696b9-n2kr8" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.235593 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/704789c1-d8a3-4773-8b30-754120b59be4-config\") pod \"dnsmasq-dns-74f6f696b9-n2kr8\" (UID: \"704789c1-d8a3-4773-8b30-754120b59be4\") " pod="openstack/dnsmasq-dns-74f6f696b9-n2kr8" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.235643 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/2425ebf1-c2bc-4b94-b3aa-473fd4690168-combined-ca-bundle\") pod \"ovn-controller-metrics-l28sq\" (UID: \"2425ebf1-c2bc-4b94-b3aa-473fd4690168\") " pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.235687 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2425ebf1-c2bc-4b94-b3aa-473fd4690168-ovs-rundir\") pod \"ovn-controller-metrics-l28sq\" (UID: \"2425ebf1-c2bc-4b94-b3aa-473fd4690168\") " pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.236817 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/704789c1-d8a3-4773-8b30-754120b59be4-ovsdbserver-nb\") pod \"dnsmasq-dns-74f6f696b9-n2kr8\" (UID: \"704789c1-d8a3-4773-8b30-754120b59be4\") " pod="openstack/dnsmasq-dns-74f6f696b9-n2kr8" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.237548 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/704789c1-d8a3-4773-8b30-754120b59be4-dns-svc\") pod \"dnsmasq-dns-74f6f696b9-n2kr8\" (UID: \"704789c1-d8a3-4773-8b30-754120b59be4\") " pod="openstack/dnsmasq-dns-74f6f696b9-n2kr8" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.237741 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/704789c1-d8a3-4773-8b30-754120b59be4-config\") pod \"dnsmasq-dns-74f6f696b9-n2kr8\" (UID: \"704789c1-d8a3-4773-8b30-754120b59be4\") " pod="openstack/dnsmasq-dns-74f6f696b9-n2kr8" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.259464 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwpzm\" (UniqueName: \"kubernetes.io/projected/704789c1-d8a3-4773-8b30-754120b59be4-kube-api-access-hwpzm\") pod 
\"dnsmasq-dns-74f6f696b9-n2kr8\" (UID: \"704789c1-d8a3-4773-8b30-754120b59be4\") " pod="openstack/dnsmasq-dns-74f6f696b9-n2kr8" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.339871 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2425ebf1-c2bc-4b94-b3aa-473fd4690168-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-l28sq\" (UID: \"2425ebf1-c2bc-4b94-b3aa-473fd4690168\") " pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.340371 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2425ebf1-c2bc-4b94-b3aa-473fd4690168-ovn-rundir\") pod \"ovn-controller-metrics-l28sq\" (UID: \"2425ebf1-c2bc-4b94-b3aa-473fd4690168\") " pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.340412 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q68qx\" (UniqueName: \"kubernetes.io/projected/2425ebf1-c2bc-4b94-b3aa-473fd4690168-kube-api-access-q68qx\") pod \"ovn-controller-metrics-l28sq\" (UID: \"2425ebf1-c2bc-4b94-b3aa-473fd4690168\") " pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.340475 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2425ebf1-c2bc-4b94-b3aa-473fd4690168-config\") pod \"ovn-controller-metrics-l28sq\" (UID: \"2425ebf1-c2bc-4b94-b3aa-473fd4690168\") " pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.340547 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2425ebf1-c2bc-4b94-b3aa-473fd4690168-combined-ca-bundle\") pod \"ovn-controller-metrics-l28sq\" (UID: 
\"2425ebf1-c2bc-4b94-b3aa-473fd4690168\") " pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.340598 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2425ebf1-c2bc-4b94-b3aa-473fd4690168-ovs-rundir\") pod \"ovn-controller-metrics-l28sq\" (UID: \"2425ebf1-c2bc-4b94-b3aa-473fd4690168\") " pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.340814 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/2425ebf1-c2bc-4b94-b3aa-473fd4690168-ovs-rundir\") pod \"ovn-controller-metrics-l28sq\" (UID: \"2425ebf1-c2bc-4b94-b3aa-473fd4690168\") " pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.341840 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2425ebf1-c2bc-4b94-b3aa-473fd4690168-config\") pod \"ovn-controller-metrics-l28sq\" (UID: \"2425ebf1-c2bc-4b94-b3aa-473fd4690168\") " pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.345428 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/2425ebf1-c2bc-4b94-b3aa-473fd4690168-ovn-rundir\") pod \"ovn-controller-metrics-l28sq\" (UID: \"2425ebf1-c2bc-4b94-b3aa-473fd4690168\") " pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.346923 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2425ebf1-c2bc-4b94-b3aa-473fd4690168-combined-ca-bundle\") pod \"ovn-controller-metrics-l28sq\" (UID: \"2425ebf1-c2bc-4b94-b3aa-473fd4690168\") " pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc 
kubenswrapper[4973]: I0320 13:44:04.347232 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/2425ebf1-c2bc-4b94-b3aa-473fd4690168-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-l28sq\" (UID: \"2425ebf1-c2bc-4b94-b3aa-473fd4690168\") " pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.359448 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q68qx\" (UniqueName: \"kubernetes.io/projected/2425ebf1-c2bc-4b94-b3aa-473fd4690168-kube-api-access-q68qx\") pod \"ovn-controller-metrics-l28sq\" (UID: \"2425ebf1-c2bc-4b94-b3aa-473fd4690168\") " pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.422435 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-n2kr8" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.435948 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-l28sq" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.475811 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-n2kr8"] Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.549546 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-698758b865-j6blp"] Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.551870 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.568588 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.583827 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-j6blp"] Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.654008 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-config\") pod \"dnsmasq-dns-698758b865-j6blp\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.654380 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dkzh\" (UniqueName: \"kubernetes.io/projected/74c96d00-cb67-4436-a620-31f29aa6e358-kube-api-access-6dkzh\") pod \"dnsmasq-dns-698758b865-j6blp\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.654465 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-j6blp\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.654591 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-j6blp\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " 
pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.654648 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-dns-svc\") pod \"dnsmasq-dns-698758b865-j6blp\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.685681 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-l8zrn" event={"ID":"0592b321-65ba-4ab3-987b-8384ec9ee7e2","Type":"ContainerStarted","Data":"aca291581b66fd8cf40546b1ff467366c606359b6459beed3da383d55cc23e26"} Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.722231 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566904-l8zrn" podStartSLOduration=3.689050029 podStartE2EDuration="4.722208399s" podCreationTimestamp="2026-03-20 13:44:00 +0000 UTC" firstStartedPulling="2026-03-20 13:44:02.559402795 +0000 UTC m=+1363.303072539" lastFinishedPulling="2026-03-20 13:44:03.592561155 +0000 UTC m=+1364.336230909" observedRunningTime="2026-03-20 13:44:04.718306172 +0000 UTC m=+1365.461975936" watchObservedRunningTime="2026-03-20 13:44:04.722208399 +0000 UTC m=+1365.465878133" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.756935 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-j6blp\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.756996 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-dns-svc\") pod \"dnsmasq-dns-698758b865-j6blp\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.757072 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-config\") pod \"dnsmasq-dns-698758b865-j6blp\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.757108 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6dkzh\" (UniqueName: \"kubernetes.io/projected/74c96d00-cb67-4436-a620-31f29aa6e358-kube-api-access-6dkzh\") pod \"dnsmasq-dns-698758b865-j6blp\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.757226 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-j6blp\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.761501 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-ovsdbserver-nb\") pod \"dnsmasq-dns-698758b865-j6blp\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.764712 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-ovsdbserver-sb\") pod \"dnsmasq-dns-698758b865-j6blp\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.767243 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-dns-svc\") pod \"dnsmasq-dns-698758b865-j6blp\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.767773 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-config\") pod \"dnsmasq-dns-698758b865-j6blp\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.814234 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dkzh\" (UniqueName: \"kubernetes.io/projected/74c96d00-cb67-4436-a620-31f29aa6e358-kube-api-access-6dkzh\") pod \"dnsmasq-dns-698758b865-j6blp\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:04 crc kubenswrapper[4973]: I0320 13:44:04.900048 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.019090 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.151097 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-n2kr8"] Mar 20 13:44:05 crc kubenswrapper[4973]: W0320 13:44:05.180992 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2425ebf1_c2bc_4b94_b3aa_473fd4690168.slice/crio-f211882466c08cf3696b25425ca6f7b243a93238cb6a8b12f59a1b9d9d97ff32 WatchSource:0}: Error finding container f211882466c08cf3696b25425ca6f7b243a93238cb6a8b12f59a1b9d9d97ff32: Status 404 returned error can't find the container with id f211882466c08cf3696b25425ca6f7b243a93238cb6a8b12f59a1b9d9d97ff32 Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.210377 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-l28sq"] Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.294825 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.297246 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.307986 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.308043 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-qc8gq" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.308295 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.308458 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.347018 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.382471 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5936fcd4-c4a1-423b-b026-9a86d9964154-scripts\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.382544 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5936fcd4-c4a1-423b-b026-9a86d9964154-config\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.382599 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5936fcd4-c4a1-423b-b026-9a86d9964154-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 
13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.382631 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5936fcd4-c4a1-423b-b026-9a86d9964154-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.382670 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5936fcd4-c4a1-423b-b026-9a86d9964154-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.382695 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5936fcd4-c4a1-423b-b026-9a86d9964154-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.382767 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds2qs\" (UniqueName: \"kubernetes.io/projected/5936fcd4-c4a1-423b-b026-9a86d9964154-kube-api-access-ds2qs\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.494298 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5936fcd4-c4a1-423b-b026-9a86d9964154-scripts\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.494378 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/5936fcd4-c4a1-423b-b026-9a86d9964154-config\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.494479 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5936fcd4-c4a1-423b-b026-9a86d9964154-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.494509 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5936fcd4-c4a1-423b-b026-9a86d9964154-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.494561 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5936fcd4-c4a1-423b-b026-9a86d9964154-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.494605 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5936fcd4-c4a1-423b-b026-9a86d9964154-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.494735 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ds2qs\" (UniqueName: \"kubernetes.io/projected/5936fcd4-c4a1-423b-b026-9a86d9964154-kube-api-access-ds2qs\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " 
pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.496131 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5936fcd4-c4a1-423b-b026-9a86d9964154-scripts\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.496860 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5936fcd4-c4a1-423b-b026-9a86d9964154-config\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.507540 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5936fcd4-c4a1-423b-b026-9a86d9964154-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.507797 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5936fcd4-c4a1-423b-b026-9a86d9964154-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.507839 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5936fcd4-c4a1-423b-b026-9a86d9964154-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.508706 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5936fcd4-c4a1-423b-b026-9a86d9964154-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.521220 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds2qs\" (UniqueName: \"kubernetes.io/projected/5936fcd4-c4a1-423b-b026-9a86d9964154-kube-api-access-ds2qs\") pod \"ovn-northd-0\" (UID: \"5936fcd4-c4a1-423b-b026-9a86d9964154\") " pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.683469 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-698758b865-j6blp"] Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.696281 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.704688 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-j6blp" event={"ID":"74c96d00-cb67-4436-a620-31f29aa6e358","Type":"ContainerStarted","Data":"f253fa348636fa0b871e6f71a7d260f293a88d1749aeb7e5b8aacd0e1912e7fe"} Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.709945 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-n2kr8" event={"ID":"704789c1-d8a3-4773-8b30-754120b59be4","Type":"ContainerStarted","Data":"23e1bbbd98a153444dd0287adaf98b15e5aca07bf21abb01078945115275555b"} Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.713152 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-l28sq" event={"ID":"2425ebf1-c2bc-4b94-b3aa-473fd4690168","Type":"ContainerStarted","Data":"f211882466c08cf3696b25425ca6f7b243a93238cb6a8b12f59a1b9d9d97ff32"} Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.720783 4973 generic.go:334] "Generic (PLEG): container finished" podID="0592b321-65ba-4ab3-987b-8384ec9ee7e2" 
containerID="aca291581b66fd8cf40546b1ff467366c606359b6459beed3da383d55cc23e26" exitCode=0 Mar 20 13:44:05 crc kubenswrapper[4973]: I0320 13:44:05.720845 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-l8zrn" event={"ID":"0592b321-65ba-4ab3-987b-8384ec9ee7e2","Type":"ContainerDied","Data":"aca291581b66fd8cf40546b1ff467366c606359b6459beed3da383d55cc23e26"} Mar 20 13:44:06 crc kubenswrapper[4973]: I0320 13:44:06.249582 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 20 13:44:06 crc kubenswrapper[4973]: W0320 13:44:06.255882 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5936fcd4_c4a1_423b_b026_9a86d9964154.slice/crio-251258181b2d9776e2d740699862b1c9470b2a65ed61073abb311a04c57925f6 WatchSource:0}: Error finding container 251258181b2d9776e2d740699862b1c9470b2a65ed61073abb311a04c57925f6: Status 404 returned error can't find the container with id 251258181b2d9776e2d740699862b1c9470b2a65ed61073abb311a04c57925f6 Mar 20 13:44:06 crc kubenswrapper[4973]: I0320 13:44:06.440787 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 13:44:06 crc kubenswrapper[4973]: I0320 13:44:06.441207 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 13:44:06 crc kubenswrapper[4973]: I0320 13:44:06.732858 4973 generic.go:334] "Generic (PLEG): container finished" podID="74c96d00-cb67-4436-a620-31f29aa6e358" containerID="3e9afe9d9b18aa750a8743da1a3b1c54904b1d5358c06debce83cdafa90239a5" exitCode=0 Mar 20 13:44:06 crc kubenswrapper[4973]: I0320 13:44:06.732917 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-j6blp" 
event={"ID":"74c96d00-cb67-4436-a620-31f29aa6e358","Type":"ContainerDied","Data":"3e9afe9d9b18aa750a8743da1a3b1c54904b1d5358c06debce83cdafa90239a5"} Mar 20 13:44:06 crc kubenswrapper[4973]: I0320 13:44:06.741684 4973 generic.go:334] "Generic (PLEG): container finished" podID="704789c1-d8a3-4773-8b30-754120b59be4" containerID="a85c29833dc8e67ab2da25eff91a21305893ac91ddb4372e7edbdc51bdfa397f" exitCode=0 Mar 20 13:44:06 crc kubenswrapper[4973]: I0320 13:44:06.741941 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-n2kr8" event={"ID":"704789c1-d8a3-4773-8b30-754120b59be4","Type":"ContainerDied","Data":"a85c29833dc8e67ab2da25eff91a21305893ac91ddb4372e7edbdc51bdfa397f"} Mar 20 13:44:06 crc kubenswrapper[4973]: I0320 13:44:06.743899 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-l28sq" event={"ID":"2425ebf1-c2bc-4b94-b3aa-473fd4690168","Type":"ContainerStarted","Data":"99d7428492b19354a5654c4a2b51d68d94bbd2655c0e79830a757843a791fa49"} Mar 20 13:44:06 crc kubenswrapper[4973]: I0320 13:44:06.747178 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5936fcd4-c4a1-423b-b026-9a86d9964154","Type":"ContainerStarted","Data":"251258181b2d9776e2d740699862b1c9470b2a65ed61073abb311a04c57925f6"} Mar 20 13:44:06 crc kubenswrapper[4973]: I0320 13:44:06.825393 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-l28sq" podStartSLOduration=2.825370254 podStartE2EDuration="2.825370254s" podCreationTimestamp="2026-03-20 13:44:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:06.822332141 +0000 UTC m=+1367.566001885" watchObservedRunningTime="2026-03-20 13:44:06.825370254 +0000 UTC m=+1367.569039998" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.427097 4973 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.436403 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-l8zrn" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.455037 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5th59\" (UniqueName: \"kubernetes.io/projected/0592b321-65ba-4ab3-987b-8384ec9ee7e2-kube-api-access-5th59\") pod \"0592b321-65ba-4ab3-987b-8384ec9ee7e2\" (UID: \"0592b321-65ba-4ab3-987b-8384ec9ee7e2\") " Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.480624 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0592b321-65ba-4ab3-987b-8384ec9ee7e2-kube-api-access-5th59" (OuterVolumeSpecName: "kube-api-access-5th59") pod "0592b321-65ba-4ab3-987b-8384ec9ee7e2" (UID: "0592b321-65ba-4ab3-987b-8384ec9ee7e2"). InnerVolumeSpecName "kube-api-access-5th59". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.558931 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5th59\" (UniqueName: \"kubernetes.io/projected/0592b321-65ba-4ab3-987b-8384ec9ee7e2-kube-api-access-5th59\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.653928 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-n2kr8" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.660028 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwpzm\" (UniqueName: \"kubernetes.io/projected/704789c1-d8a3-4773-8b30-754120b59be4-kube-api-access-hwpzm\") pod \"704789c1-d8a3-4773-8b30-754120b59be4\" (UID: \"704789c1-d8a3-4773-8b30-754120b59be4\") " Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.660070 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/704789c1-d8a3-4773-8b30-754120b59be4-config\") pod \"704789c1-d8a3-4773-8b30-754120b59be4\" (UID: \"704789c1-d8a3-4773-8b30-754120b59be4\") " Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.660135 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/704789c1-d8a3-4773-8b30-754120b59be4-dns-svc\") pod \"704789c1-d8a3-4773-8b30-754120b59be4\" (UID: \"704789c1-d8a3-4773-8b30-754120b59be4\") " Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.660178 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/704789c1-d8a3-4773-8b30-754120b59be4-ovsdbserver-nb\") pod \"704789c1-d8a3-4773-8b30-754120b59be4\" (UID: \"704789c1-d8a3-4773-8b30-754120b59be4\") " Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.677939 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/704789c1-d8a3-4773-8b30-754120b59be4-kube-api-access-hwpzm" (OuterVolumeSpecName: "kube-api-access-hwpzm") pod "704789c1-d8a3-4773-8b30-754120b59be4" (UID: "704789c1-d8a3-4773-8b30-754120b59be4"). InnerVolumeSpecName "kube-api-access-hwpzm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.686059 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/704789c1-d8a3-4773-8b30-754120b59be4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "704789c1-d8a3-4773-8b30-754120b59be4" (UID: "704789c1-d8a3-4773-8b30-754120b59be4"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.712073 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/704789c1-d8a3-4773-8b30-754120b59be4-config" (OuterVolumeSpecName: "config") pod "704789c1-d8a3-4773-8b30-754120b59be4" (UID: "704789c1-d8a3-4773-8b30-754120b59be4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.712759 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/704789c1-d8a3-4773-8b30-754120b59be4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "704789c1-d8a3-4773-8b30-754120b59be4" (UID: "704789c1-d8a3-4773-8b30-754120b59be4"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:07 crc kubenswrapper[4973]: E0320 13:44:07.764014 4973 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 20 13:44:07 crc kubenswrapper[4973]: E0320 13:44:07.765460 4973 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 20 13:44:07 crc kubenswrapper[4973]: E0320 13:44:07.765567 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift podName:8184c0e7-f9ef-48a3-9461-5cc6c1188e6b nodeName:}" failed. 
No retries permitted until 2026-03-20 13:44:23.765546825 +0000 UTC m=+1384.509216569 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift") pod "swift-storage-0" (UID: "8184c0e7-f9ef-48a3-9461-5cc6c1188e6b") : configmap "swift-ring-files" not found Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.766194 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74f6f696b9-n2kr8" event={"ID":"704789c1-d8a3-4773-8b30-754120b59be4","Type":"ContainerDied","Data":"23e1bbbd98a153444dd0287adaf98b15e5aca07bf21abb01078945115275555b"} Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.766313 4973 scope.go:117] "RemoveContainer" containerID="a85c29833dc8e67ab2da25eff91a21305893ac91ddb4372e7edbdc51bdfa397f" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.766486 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74f6f696b9-n2kr8" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.770669 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.771199 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwpzm\" (UniqueName: \"kubernetes.io/projected/704789c1-d8a3-4773-8b30-754120b59be4-kube-api-access-hwpzm\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.771218 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/704789c1-d8a3-4773-8b30-754120b59be4-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.771228 4973 
reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/704789c1-d8a3-4773-8b30-754120b59be4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.771237 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/704789c1-d8a3-4773-8b30-754120b59be4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.771495 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566904-l8zrn" event={"ID":"0592b321-65ba-4ab3-987b-8384ec9ee7e2","Type":"ContainerDied","Data":"2569c20df14fc924d1e6c5a27ca58a9beab916f77d25fd662dcfda07a3b90fa2"} Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.771526 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2569c20df14fc924d1e6c5a27ca58a9beab916f77d25fd662dcfda07a3b90fa2" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.771581 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566904-l8zrn" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.781929 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-j6blp" event={"ID":"74c96d00-cb67-4436-a620-31f29aa6e358","Type":"ContainerStarted","Data":"d29b755af738150402a347d490bd2b8a5cf113faa9c2a9102a321ef3bb04c4cb"} Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.783413 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.810419 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-tm499"] Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.837864 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566898-tm499"] Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.852034 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-698758b865-j6blp" podStartSLOduration=3.852016405 podStartE2EDuration="3.852016405s" podCreationTimestamp="2026-03-20 13:44:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:07.81039787 +0000 UTC m=+1368.554067614" watchObservedRunningTime="2026-03-20 13:44:07.852016405 +0000 UTC m=+1368.595686149" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.862673 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.862947 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.911779 4973 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-console/console-7794c74589-2c6lq" podUID="4a8f8e46-e4bd-440c-87dd-046f52b26e69" containerName="console" containerID="cri-o://0c11382906a33a4d1a09b8ce97b60d980b2fb98cdca388f8ba9c13611675de20" gracePeriod=15 Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.921188 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-n2kr8"] Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.926177 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 13:44:07 crc kubenswrapper[4973]: I0320 13:44:07.940962 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74f6f696b9-n2kr8"] Mar 20 13:44:08 crc kubenswrapper[4973]: I0320 13:44:08.004411 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7019dcc1-49da-40e5-ae40-80f09d83984d" path="/var/lib/kubelet/pods/7019dcc1-49da-40e5-ae40-80f09d83984d/volumes" Mar 20 13:44:08 crc kubenswrapper[4973]: I0320 13:44:08.005172 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="704789c1-d8a3-4773-8b30-754120b59be4" path="/var/lib/kubelet/pods/704789c1-d8a3-4773-8b30-754120b59be4/volumes" Mar 20 13:44:08 crc kubenswrapper[4973]: E0320 13:44:08.442554 4973 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.75:57696->38.102.83.75:38041: write tcp 38.102.83.75:57696->38.102.83.75:38041: write: broken pipe Mar 20 13:44:08 crc kubenswrapper[4973]: I0320 13:44:08.797755 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7794c74589-2c6lq_4a8f8e46-e4bd-440c-87dd-046f52b26e69/console/0.log" Mar 20 13:44:08 crc kubenswrapper[4973]: I0320 13:44:08.797795 4973 generic.go:334] "Generic (PLEG): container finished" podID="4a8f8e46-e4bd-440c-87dd-046f52b26e69" containerID="0c11382906a33a4d1a09b8ce97b60d980b2fb98cdca388f8ba9c13611675de20" exitCode=2 Mar 20 13:44:08 crc kubenswrapper[4973]: 
I0320 13:44:08.797924 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7794c74589-2c6lq" event={"ID":"4a8f8e46-e4bd-440c-87dd-046f52b26e69","Type":"ContainerDied","Data":"0c11382906a33a4d1a09b8ce97b60d980b2fb98cdca388f8ba9c13611675de20"} Mar 20 13:44:08 crc kubenswrapper[4973]: I0320 13:44:08.797959 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7794c74589-2c6lq" event={"ID":"4a8f8e46-e4bd-440c-87dd-046f52b26e69","Type":"ContainerDied","Data":"e28145cdfc9f3cc219150b38f20e11ad9d9b52fec6266a1bc807bb78a65e096c"} Mar 20 13:44:08 crc kubenswrapper[4973]: I0320 13:44:08.797975 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e28145cdfc9f3cc219150b38f20e11ad9d9b52fec6266a1bc807bb78a65e096c" Mar 20 13:44:08 crc kubenswrapper[4973]: I0320 13:44:08.863706 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7794c74589-2c6lq_4a8f8e46-e4bd-440c-87dd-046f52b26e69/console/0.log" Mar 20 13:44:08 crc kubenswrapper[4973]: I0320 13:44:08.863778 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.007795 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-trusted-ca-bundle\") pod \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.007943 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a8f8e46-e4bd-440c-87dd-046f52b26e69-console-serving-cert\") pod \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.007989 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a8f8e46-e4bd-440c-87dd-046f52b26e69-console-oauth-config\") pod \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.008025 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxt5v\" (UniqueName: \"kubernetes.io/projected/4a8f8e46-e4bd-440c-87dd-046f52b26e69-kube-api-access-sxt5v\") pod \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.008081 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-service-ca\") pod \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.008127 4973 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-console-config\") pod \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.008355 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-oauth-serving-cert\") pod \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\" (UID: \"4a8f8e46-e4bd-440c-87dd-046f52b26e69\") " Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.010422 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "4a8f8e46-e4bd-440c-87dd-046f52b26e69" (UID: "4a8f8e46-e4bd-440c-87dd-046f52b26e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.010741 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-service-ca" (OuterVolumeSpecName: "service-ca") pod "4a8f8e46-e4bd-440c-87dd-046f52b26e69" (UID: "4a8f8e46-e4bd-440c-87dd-046f52b26e69"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.010758 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "4a8f8e46-e4bd-440c-87dd-046f52b26e69" (UID: "4a8f8e46-e4bd-440c-87dd-046f52b26e69"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.010887 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-console-config" (OuterVolumeSpecName: "console-config") pod "4a8f8e46-e4bd-440c-87dd-046f52b26e69" (UID: "4a8f8e46-e4bd-440c-87dd-046f52b26e69"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.014769 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a8f8e46-e4bd-440c-87dd-046f52b26e69-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "4a8f8e46-e4bd-440c-87dd-046f52b26e69" (UID: "4a8f8e46-e4bd-440c-87dd-046f52b26e69"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.014824 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a8f8e46-e4bd-440c-87dd-046f52b26e69-kube-api-access-sxt5v" (OuterVolumeSpecName: "kube-api-access-sxt5v") pod "4a8f8e46-e4bd-440c-87dd-046f52b26e69" (UID: "4a8f8e46-e4bd-440c-87dd-046f52b26e69"). InnerVolumeSpecName "kube-api-access-sxt5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.015044 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a8f8e46-e4bd-440c-87dd-046f52b26e69-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "4a8f8e46-e4bd-440c-87dd-046f52b26e69" (UID: "4a8f8e46-e4bd-440c-87dd-046f52b26e69"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.111167 4973 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/4a8f8e46-e4bd-440c-87dd-046f52b26e69-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.111197 4973 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/4a8f8e46-e4bd-440c-87dd-046f52b26e69-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.111208 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxt5v\" (UniqueName: \"kubernetes.io/projected/4a8f8e46-e4bd-440c-87dd-046f52b26e69-kube-api-access-sxt5v\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.111221 4973 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.111229 4973 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.111239 4973 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.111248 4973 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4a8f8e46-e4bd-440c-87dd-046f52b26e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:09 crc 
kubenswrapper[4973]: I0320 13:44:09.291803 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-wvx2d"] Mar 20 13:44:09 crc kubenswrapper[4973]: E0320 13:44:09.292217 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0592b321-65ba-4ab3-987b-8384ec9ee7e2" containerName="oc" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.292229 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0592b321-65ba-4ab3-987b-8384ec9ee7e2" containerName="oc" Mar 20 13:44:09 crc kubenswrapper[4973]: E0320 13:44:09.292244 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="704789c1-d8a3-4773-8b30-754120b59be4" containerName="init" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.292251 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="704789c1-d8a3-4773-8b30-754120b59be4" containerName="init" Mar 20 13:44:09 crc kubenswrapper[4973]: E0320 13:44:09.292266 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a8f8e46-e4bd-440c-87dd-046f52b26e69" containerName="console" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.292274 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a8f8e46-e4bd-440c-87dd-046f52b26e69" containerName="console" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.292476 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="704789c1-d8a3-4773-8b30-754120b59be4" containerName="init" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.292502 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a8f8e46-e4bd-440c-87dd-046f52b26e69" containerName="console" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.292511 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="0592b321-65ba-4ab3-987b-8384ec9ee7e2" containerName="oc" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.303464 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-wvx2d" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.325664 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-ee19-account-create-update-jjnq9"] Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.327498 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ee19-account-create-update-jjnq9" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.330528 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.338196 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wvx2d"] Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.352177 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ee19-account-create-update-jjnq9"] Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.422584 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5444n\" (UniqueName: \"kubernetes.io/projected/1df509cd-ad47-4fa5-86bf-c2dc341259b7-kube-api-access-5444n\") pod \"keystone-db-create-wvx2d\" (UID: \"1df509cd-ad47-4fa5-86bf-c2dc341259b7\") " pod="openstack/keystone-db-create-wvx2d" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.422774 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1df509cd-ad47-4fa5-86bf-c2dc341259b7-operator-scripts\") pod \"keystone-db-create-wvx2d\" (UID: \"1df509cd-ad47-4fa5-86bf-c2dc341259b7\") " pod="openstack/keystone-db-create-wvx2d" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.494515 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-xbx75"] Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.496421 4973 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xbx75" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.499980 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xbx75"] Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.525598 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5444n\" (UniqueName: \"kubernetes.io/projected/1df509cd-ad47-4fa5-86bf-c2dc341259b7-kube-api-access-5444n\") pod \"keystone-db-create-wvx2d\" (UID: \"1df509cd-ad47-4fa5-86bf-c2dc341259b7\") " pod="openstack/keystone-db-create-wvx2d" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.525645 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdj2q\" (UniqueName: \"kubernetes.io/projected/a6efe097-248d-4d68-b2e0-172c2d005a17-kube-api-access-wdj2q\") pod \"keystone-ee19-account-create-update-jjnq9\" (UID: \"a6efe097-248d-4d68-b2e0-172c2d005a17\") " pod="openstack/keystone-ee19-account-create-update-jjnq9" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.525710 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6efe097-248d-4d68-b2e0-172c2d005a17-operator-scripts\") pod \"keystone-ee19-account-create-update-jjnq9\" (UID: \"a6efe097-248d-4d68-b2e0-172c2d005a17\") " pod="openstack/keystone-ee19-account-create-update-jjnq9" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.525798 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1df509cd-ad47-4fa5-86bf-c2dc341259b7-operator-scripts\") pod \"keystone-db-create-wvx2d\" (UID: \"1df509cd-ad47-4fa5-86bf-c2dc341259b7\") " pod="openstack/keystone-db-create-wvx2d" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.526448 4973 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1df509cd-ad47-4fa5-86bf-c2dc341259b7-operator-scripts\") pod \"keystone-db-create-wvx2d\" (UID: \"1df509cd-ad47-4fa5-86bf-c2dc341259b7\") " pod="openstack/keystone-db-create-wvx2d" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.546259 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5444n\" (UniqueName: \"kubernetes.io/projected/1df509cd-ad47-4fa5-86bf-c2dc341259b7-kube-api-access-5444n\") pod \"keystone-db-create-wvx2d\" (UID: \"1df509cd-ad47-4fa5-86bf-c2dc341259b7\") " pod="openstack/keystone-db-create-wvx2d" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.593445 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-68e0-account-create-update-8dkxv"] Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.594795 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68e0-account-create-update-8dkxv" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.598944 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.623841 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-68e0-account-create-update-8dkxv"] Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.629289 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpbtv\" (UniqueName: \"kubernetes.io/projected/33bc7d77-fed7-4177-83a1-6eff4b55ee6d-kube-api-access-wpbtv\") pod \"placement-db-create-xbx75\" (UID: \"33bc7d77-fed7-4177-83a1-6eff4b55ee6d\") " pod="openstack/placement-db-create-xbx75" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.629390 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdj2q\" 
(UniqueName: \"kubernetes.io/projected/a6efe097-248d-4d68-b2e0-172c2d005a17-kube-api-access-wdj2q\") pod \"keystone-ee19-account-create-update-jjnq9\" (UID: \"a6efe097-248d-4d68-b2e0-172c2d005a17\") " pod="openstack/keystone-ee19-account-create-update-jjnq9" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.629572 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6efe097-248d-4d68-b2e0-172c2d005a17-operator-scripts\") pod \"keystone-ee19-account-create-update-jjnq9\" (UID: \"a6efe097-248d-4d68-b2e0-172c2d005a17\") " pod="openstack/keystone-ee19-account-create-update-jjnq9" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.629702 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33bc7d77-fed7-4177-83a1-6eff4b55ee6d-operator-scripts\") pod \"placement-db-create-xbx75\" (UID: \"33bc7d77-fed7-4177-83a1-6eff4b55ee6d\") " pod="openstack/placement-db-create-xbx75" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.630713 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6efe097-248d-4d68-b2e0-172c2d005a17-operator-scripts\") pod \"keystone-ee19-account-create-update-jjnq9\" (UID: \"a6efe097-248d-4d68-b2e0-172c2d005a17\") " pod="openstack/keystone-ee19-account-create-update-jjnq9" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.646187 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdj2q\" (UniqueName: \"kubernetes.io/projected/a6efe097-248d-4d68-b2e0-172c2d005a17-kube-api-access-wdj2q\") pod \"keystone-ee19-account-create-update-jjnq9\" (UID: \"a6efe097-248d-4d68-b2e0-172c2d005a17\") " pod="openstack/keystone-ee19-account-create-update-jjnq9" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.659692 4973 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wvx2d" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.668111 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-ee19-account-create-update-jjnq9" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.733602 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n6bc\" (UniqueName: \"kubernetes.io/projected/e554d8c1-24d9-4dff-8495-2fb41208cdad-kube-api-access-9n6bc\") pod \"placement-68e0-account-create-update-8dkxv\" (UID: \"e554d8c1-24d9-4dff-8495-2fb41208cdad\") " pod="openstack/placement-68e0-account-create-update-8dkxv" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.733699 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e554d8c1-24d9-4dff-8495-2fb41208cdad-operator-scripts\") pod \"placement-68e0-account-create-update-8dkxv\" (UID: \"e554d8c1-24d9-4dff-8495-2fb41208cdad\") " pod="openstack/placement-68e0-account-create-update-8dkxv" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.733738 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpbtv\" (UniqueName: \"kubernetes.io/projected/33bc7d77-fed7-4177-83a1-6eff4b55ee6d-kube-api-access-wpbtv\") pod \"placement-db-create-xbx75\" (UID: \"33bc7d77-fed7-4177-83a1-6eff4b55ee6d\") " pod="openstack/placement-db-create-xbx75" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.733958 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33bc7d77-fed7-4177-83a1-6eff4b55ee6d-operator-scripts\") pod \"placement-db-create-xbx75\" (UID: \"33bc7d77-fed7-4177-83a1-6eff4b55ee6d\") " pod="openstack/placement-db-create-xbx75" Mar 20 13:44:09 crc 
kubenswrapper[4973]: I0320 13:44:09.738282 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33bc7d77-fed7-4177-83a1-6eff4b55ee6d-operator-scripts\") pod \"placement-db-create-xbx75\" (UID: \"33bc7d77-fed7-4177-83a1-6eff4b55ee6d\") " pod="openstack/placement-db-create-xbx75" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.757466 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpbtv\" (UniqueName: \"kubernetes.io/projected/33bc7d77-fed7-4177-83a1-6eff4b55ee6d-kube-api-access-wpbtv\") pod \"placement-db-create-xbx75\" (UID: \"33bc7d77-fed7-4177-83a1-6eff4b55ee6d\") " pod="openstack/placement-db-create-xbx75" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.818115 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-xbx75" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.820460 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7794c74589-2c6lq" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.822254 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5936fcd4-c4a1-423b-b026-9a86d9964154","Type":"ContainerStarted","Data":"37e45dbaa39085c0033d830bc50cc2e6d8a969111fac8e0362f4fc83ef0d8b69"} Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.835488 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n6bc\" (UniqueName: \"kubernetes.io/projected/e554d8c1-24d9-4dff-8495-2fb41208cdad-kube-api-access-9n6bc\") pod \"placement-68e0-account-create-update-8dkxv\" (UID: \"e554d8c1-24d9-4dff-8495-2fb41208cdad\") " pod="openstack/placement-68e0-account-create-update-8dkxv" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.835564 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e554d8c1-24d9-4dff-8495-2fb41208cdad-operator-scripts\") pod \"placement-68e0-account-create-update-8dkxv\" (UID: \"e554d8c1-24d9-4dff-8495-2fb41208cdad\") " pod="openstack/placement-68e0-account-create-update-8dkxv" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.836584 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e554d8c1-24d9-4dff-8495-2fb41208cdad-operator-scripts\") pod \"placement-68e0-account-create-update-8dkxv\" (UID: \"e554d8c1-24d9-4dff-8495-2fb41208cdad\") " pod="openstack/placement-68e0-account-create-update-8dkxv" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.859485 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n6bc\" (UniqueName: \"kubernetes.io/projected/e554d8c1-24d9-4dff-8495-2fb41208cdad-kube-api-access-9n6bc\") pod \"placement-68e0-account-create-update-8dkxv\" (UID: \"e554d8c1-24d9-4dff-8495-2fb41208cdad\") 
" pod="openstack/placement-68e0-account-create-update-8dkxv" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.871042 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7794c74589-2c6lq"] Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.882459 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7794c74589-2c6lq"] Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.914958 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68e0-account-create-update-8dkxv" Mar 20 13:44:09 crc kubenswrapper[4973]: I0320 13:44:09.980469 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a8f8e46-e4bd-440c-87dd-046f52b26e69" path="/var/lib/kubelet/pods/4a8f8e46-e4bd-440c-87dd-046f52b26e69/volumes" Mar 20 13:44:10 crc kubenswrapper[4973]: I0320 13:44:10.443997 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 20 13:44:10 crc kubenswrapper[4973]: I0320 13:44:10.685051 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 20 13:44:10 crc kubenswrapper[4973]: I0320 13:44:10.697928 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-bkrl2"] Mar 20 13:44:10 crc kubenswrapper[4973]: I0320 13:44:10.699573 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-bkrl2" Mar 20 13:44:10 crc kubenswrapper[4973]: I0320 13:44:10.722600 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-bkrl2"] Mar 20 13:44:10 crc kubenswrapper[4973]: E0320 13:44:10.730036 4973 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20780ec2_d338_45a4_9259_16a651e46e55.slice/crio-4fe78d802a26e68c3351b6b1daff5a6fc354aeb5b49e215fb4795ef080c317df.scope\": RecentStats: unable to find data in memory cache]" Mar 20 13:44:10 crc kubenswrapper[4973]: I0320 13:44:10.849382 4973 generic.go:334] "Generic (PLEG): container finished" podID="20780ec2-d338-45a4-9259-16a651e46e55" containerID="4fe78d802a26e68c3351b6b1daff5a6fc354aeb5b49e215fb4795ef080c317df" exitCode=0 Mar 20 13:44:10 crc kubenswrapper[4973]: I0320 13:44:10.849503 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"20780ec2-d338-45a4-9259-16a651e46e55","Type":"ContainerDied","Data":"4fe78d802a26e68c3351b6b1daff5a6fc354aeb5b49e215fb4795ef080c317df"} Mar 20 13:44:10 crc kubenswrapper[4973]: I0320 13:44:10.878241 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-bkrl2\" (UID: \"9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8\") " pod="openstack/mysqld-exporter-openstack-db-create-bkrl2" Mar 20 13:44:10 crc kubenswrapper[4973]: I0320 13:44:10.878783 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rqph\" (UniqueName: \"kubernetes.io/projected/9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8-kube-api-access-5rqph\") pod 
\"mysqld-exporter-openstack-db-create-bkrl2\" (UID: \"9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8\") " pod="openstack/mysqld-exporter-openstack-db-create-bkrl2" Mar 20 13:44:10 crc kubenswrapper[4973]: I0320 13:44:10.906144 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0ba4-account-create-update-lcrs9"] Mar 20 13:44:10 crc kubenswrapper[4973]: I0320 13:44:10.908189 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0ba4-account-create-update-lcrs9" Mar 20 13:44:10 crc kubenswrapper[4973]: I0320 13:44:10.910712 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-db-secret" Mar 20 13:44:10 crc kubenswrapper[4973]: I0320 13:44:10.912863 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 13:44:10 crc kubenswrapper[4973]: I0320 13:44:10.919390 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0ba4-account-create-update-lcrs9"] Mar 20 13:44:10 crc kubenswrapper[4973]: I0320 13:44:10.981521 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-bkrl2\" (UID: \"9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8\") " pod="openstack/mysqld-exporter-openstack-db-create-bkrl2" Mar 20 13:44:10 crc kubenswrapper[4973]: I0320 13:44:10.982500 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8-operator-scripts\") pod \"mysqld-exporter-openstack-db-create-bkrl2\" (UID: \"9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8\") " pod="openstack/mysqld-exporter-openstack-db-create-bkrl2" Mar 20 13:44:10 crc kubenswrapper[4973]: I0320 13:44:10.983652 4973 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rqph\" (UniqueName: \"kubernetes.io/projected/9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8-kube-api-access-5rqph\") pod \"mysqld-exporter-openstack-db-create-bkrl2\" (UID: \"9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8\") " pod="openstack/mysqld-exporter-openstack-db-create-bkrl2" Mar 20 13:44:11 crc kubenswrapper[4973]: I0320 13:44:11.008495 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rqph\" (UniqueName: \"kubernetes.io/projected/9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8-kube-api-access-5rqph\") pod \"mysqld-exporter-openstack-db-create-bkrl2\" (UID: \"9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8\") " pod="openstack/mysqld-exporter-openstack-db-create-bkrl2" Mar 20 13:44:11 crc kubenswrapper[4973]: I0320 13:44:11.026569 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-bkrl2" Mar 20 13:44:11 crc kubenswrapper[4973]: I0320 13:44:11.088367 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpxzc\" (UniqueName: \"kubernetes.io/projected/707ca787-d762-44eb-932e-0130c091ae6a-kube-api-access-tpxzc\") pod \"mysqld-exporter-0ba4-account-create-update-lcrs9\" (UID: \"707ca787-d762-44eb-932e-0130c091ae6a\") " pod="openstack/mysqld-exporter-0ba4-account-create-update-lcrs9" Mar 20 13:44:11 crc kubenswrapper[4973]: I0320 13:44:11.089156 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/707ca787-d762-44eb-932e-0130c091ae6a-operator-scripts\") pod \"mysqld-exporter-0ba4-account-create-update-lcrs9\" (UID: \"707ca787-d762-44eb-932e-0130c091ae6a\") " pod="openstack/mysqld-exporter-0ba4-account-create-update-lcrs9" Mar 20 13:44:11 crc kubenswrapper[4973]: I0320 13:44:11.288227 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/707ca787-d762-44eb-932e-0130c091ae6a-operator-scripts\") pod \"mysqld-exporter-0ba4-account-create-update-lcrs9\" (UID: \"707ca787-d762-44eb-932e-0130c091ae6a\") " pod="openstack/mysqld-exporter-0ba4-account-create-update-lcrs9" Mar 20 13:44:11 crc kubenswrapper[4973]: I0320 13:44:11.288690 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpxzc\" (UniqueName: \"kubernetes.io/projected/707ca787-d762-44eb-932e-0130c091ae6a-kube-api-access-tpxzc\") pod \"mysqld-exporter-0ba4-account-create-update-lcrs9\" (UID: \"707ca787-d762-44eb-932e-0130c091ae6a\") " pod="openstack/mysqld-exporter-0ba4-account-create-update-lcrs9" Mar 20 13:44:11 crc kubenswrapper[4973]: I0320 13:44:11.290395 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/707ca787-d762-44eb-932e-0130c091ae6a-operator-scripts\") pod \"mysqld-exporter-0ba4-account-create-update-lcrs9\" (UID: \"707ca787-d762-44eb-932e-0130c091ae6a\") " pod="openstack/mysqld-exporter-0ba4-account-create-update-lcrs9" Mar 20 13:44:11 crc kubenswrapper[4973]: I0320 13:44:11.309680 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpxzc\" (UniqueName: \"kubernetes.io/projected/707ca787-d762-44eb-932e-0130c091ae6a-kube-api-access-tpxzc\") pod \"mysqld-exporter-0ba4-account-create-update-lcrs9\" (UID: \"707ca787-d762-44eb-932e-0130c091ae6a\") " pod="openstack/mysqld-exporter-0ba4-account-create-update-lcrs9" Mar 20 13:44:11 crc kubenswrapper[4973]: I0320 13:44:11.529820 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0ba4-account-create-update-lcrs9" Mar 20 13:44:12 crc kubenswrapper[4973]: I0320 13:44:12.869033 4973 generic.go:334] "Generic (PLEG): container finished" podID="96de22e2-f61c-4f75-8faa-9a0591aa0f38" containerID="2f909be8e4577c3906dd8fe036f0cec95e6cfd7584a8f17b0b28bb39aa6d6a30" exitCode=0 Mar 20 13:44:12 crc kubenswrapper[4973]: I0320 13:44:12.869158 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"96de22e2-f61c-4f75-8faa-9a0591aa0f38","Type":"ContainerDied","Data":"2f909be8e4577c3906dd8fe036f0cec95e6cfd7584a8f17b0b28bb39aa6d6a30"} Mar 20 13:44:13 crc kubenswrapper[4973]: I0320 13:44:13.320390 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:44:13 crc kubenswrapper[4973]: I0320 13:44:13.320455 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:44:13 crc kubenswrapper[4973]: I0320 13:44:13.884576 4973 generic.go:334] "Generic (PLEG): container finished" podID="797b38f5-d9a7-4f82-bd12-e40e021ef28e" containerID="43a6b2817e43c4fa48cce3c4bb7ae49ce43b81d1e67941c6e11676db49de2111" exitCode=0 Mar 20 13:44:13 crc kubenswrapper[4973]: I0320 13:44:13.884636 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"797b38f5-d9a7-4f82-bd12-e40e021ef28e","Type":"ContainerDied","Data":"43a6b2817e43c4fa48cce3c4bb7ae49ce43b81d1e67941c6e11676db49de2111"} Mar 20 13:44:13 crc kubenswrapper[4973]: I0320 
13:44:13.892531 4973 generic.go:334] "Generic (PLEG): container finished" podID="4ed60638-5022-406b-b568-7fa0d6bf4ba8" containerID="1197fb893b7f76b1e4555b8d0ff5bfaca6b9fd60a6146cd1c20b9f45d87f3162" exitCode=0 Mar 20 13:44:13 crc kubenswrapper[4973]: I0320 13:44:13.892609 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"4ed60638-5022-406b-b568-7fa0d6bf4ba8","Type":"ContainerDied","Data":"1197fb893b7f76b1e4555b8d0ff5bfaca6b9fd60a6146cd1c20b9f45d87f3162"} Mar 20 13:44:13 crc kubenswrapper[4973]: I0320 13:44:13.895559 4973 generic.go:334] "Generic (PLEG): container finished" podID="dbc7e778-9029-42ce-9a8e-e76636aad6a5" containerID="0f4d55f6330cb549a3f06cc7c71a6b1608668efdd56a700cd9417204745cd673" exitCode=0 Mar 20 13:44:13 crc kubenswrapper[4973]: I0320 13:44:13.895609 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qjvlf" event={"ID":"dbc7e778-9029-42ce-9a8e-e76636aad6a5","Type":"ContainerDied","Data":"0f4d55f6330cb549a3f06cc7c71a6b1608668efdd56a700cd9417204745cd673"} Mar 20 13:44:14 crc kubenswrapper[4973]: I0320 13:44:14.420618 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0ba4-account-create-update-lcrs9"] Mar 20 13:44:14 crc kubenswrapper[4973]: W0320 13:44:14.453665 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod707ca787_d762_44eb_932e_0130c091ae6a.slice/crio-9c2abd9d87008d2f3638cb6e10ef79297d88df6b9856625ef42ed2cde28d3573 WatchSource:0}: Error finding container 9c2abd9d87008d2f3638cb6e10ef79297d88df6b9856625ef42ed2cde28d3573: Status 404 returned error can't find the container with id 9c2abd9d87008d2f3638cb6e10ef79297d88df6b9856625ef42ed2cde28d3573 Mar 20 13:44:14 crc kubenswrapper[4973]: I0320 13:44:14.829703 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-xbx75"] Mar 20 13:44:14 crc 
kubenswrapper[4973]: I0320 13:44:14.902815 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:14 crc kubenswrapper[4973]: I0320 13:44:14.914665 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0ba4-account-create-update-lcrs9" event={"ID":"707ca787-d762-44eb-932e-0130c091ae6a","Type":"ContainerStarted","Data":"9c2abd9d87008d2f3638cb6e10ef79297d88df6b9856625ef42ed2cde28d3573"} Mar 20 13:44:14 crc kubenswrapper[4973]: I0320 13:44:14.940284 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5936fcd4-c4a1-423b-b026-9a86d9964154","Type":"ContainerStarted","Data":"a76e0ee510fddedd042dcd3ac1d73c51c202bfe2bc47ee4e9fdf40e4aaa88463"} Mar 20 13:44:14 crc kubenswrapper[4973]: I0320 13:44:14.941086 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 20 13:44:14 crc kubenswrapper[4973]: I0320 13:44:14.946300 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"797b38f5-d9a7-4f82-bd12-e40e021ef28e","Type":"ContainerStarted","Data":"d6ddcd3aa7658bceaec6103da5780aeff16abb7dbb7606fe4e686cb5378fa9bb"} Mar 20 13:44:14 crc kubenswrapper[4973]: I0320 13:44:14.947830 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Mar 20 13:44:14 crc kubenswrapper[4973]: I0320 13:44:14.949425 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xbx75" event={"ID":"33bc7d77-fed7-4177-83a1-6eff4b55ee6d","Type":"ContainerStarted","Data":"4f125250e16156ccafaf676e8c0fcbae2d4345fcf2cd847572e5ece6030bed48"} Mar 20 13:44:14 crc kubenswrapper[4973]: I0320 13:44:14.952317 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" 
event={"ID":"4ed60638-5022-406b-b568-7fa0d6bf4ba8","Type":"ContainerStarted","Data":"037f5251eaf4cb4f67de62e170130e7e06ee060a34d17561441e79a60d62fd3c"} Mar 20 13:44:14 crc kubenswrapper[4973]: I0320 13:44:14.953670 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Mar 20 13:44:14 crc kubenswrapper[4973]: I0320 13:44:14.963814 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"20780ec2-d338-45a4-9259-16a651e46e55","Type":"ContainerStarted","Data":"92e9958d24196db83cc045e1f87b9f21ff9358fb34e199690b6dd3b40a1daaaa"} Mar 20 13:44:14 crc kubenswrapper[4973]: I0320 13:44:14.964110 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:44:14 crc kubenswrapper[4973]: I0320 13:44:14.967717 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"96de22e2-f61c-4f75-8faa-9a0591aa0f38","Type":"ContainerStarted","Data":"3a7f3d91e76a50e04bcf5c49224d40c07b200971965316fe7aecb18bfbe0b643"} Mar 20 13:44:14 crc kubenswrapper[4973]: I0320 13:44:14.969563 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.005428 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-9jcwr"] Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.005748 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" podUID="e324e333-c4ed-44f5-abc5-0ee4083027eb" containerName="dnsmasq-dns" containerID="cri-o://c9ea00565b7792f08df0951efad1b2003b631c4e010521a32cc7cc75fb9c366f" gracePeriod=10 Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.027313 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=7.636501225 
podStartE2EDuration="10.027292972s" podCreationTimestamp="2026-03-20 13:44:05 +0000 UTC" firstStartedPulling="2026-03-20 13:44:06.258737507 +0000 UTC m=+1367.002407251" lastFinishedPulling="2026-03-20 13:44:08.649529254 +0000 UTC m=+1369.393198998" observedRunningTime="2026-03-20 13:44:15.013369292 +0000 UTC m=+1375.757039056" watchObservedRunningTime="2026-03-20 13:44:15.027292972 +0000 UTC m=+1375.770962716" Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.067970 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=61.067943282 podStartE2EDuration="1m1.067943282s" podCreationTimestamp="2026-03-20 13:43:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:15.055452421 +0000 UTC m=+1375.799122155" watchObservedRunningTime="2026-03-20 13:44:15.067943282 +0000 UTC m=+1375.811613036" Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.091892 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=61.091875475 podStartE2EDuration="1m1.091875475s" podCreationTimestamp="2026-03-20 13:43:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:15.084567186 +0000 UTC m=+1375.828236930" watchObservedRunningTime="2026-03-20 13:44:15.091875475 +0000 UTC m=+1375.835545219" Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.194732 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=61.194691031 podStartE2EDuration="1m1.194691031s" podCreationTimestamp="2026-03-20 13:43:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:15.185312556 +0000 UTC 
m=+1375.928982310" watchObservedRunningTime="2026-03-20 13:44:15.194691031 +0000 UTC m=+1375.938360775" Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.212034 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=52.495168054 podStartE2EDuration="1m2.211958703s" podCreationTimestamp="2026-03-20 13:43:13 +0000 UTC" firstStartedPulling="2026-03-20 13:43:27.316087843 +0000 UTC m=+1328.059757577" lastFinishedPulling="2026-03-20 13:43:37.032878482 +0000 UTC m=+1337.776548226" observedRunningTime="2026-03-20 13:44:15.140565254 +0000 UTC m=+1375.884235008" watchObservedRunningTime="2026-03-20 13:44:15.211958703 +0000 UTC m=+1375.955628457" Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.241168 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-9dcsn"] Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.242544 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9dcsn" Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.246773 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.261087 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9dcsn"] Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.330071 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db40f23c-3580-4532-8855-fb0d6e786747-operator-scripts\") pod \"root-account-create-update-9dcsn\" (UID: \"db40f23c-3580-4532-8855-fb0d6e786747\") " pod="openstack/root-account-create-update-9dcsn" Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.330285 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l8m7g\" (UniqueName: \"kubernetes.io/projected/db40f23c-3580-4532-8855-fb0d6e786747-kube-api-access-l8m7g\") pod \"root-account-create-update-9dcsn\" (UID: \"db40f23c-3580-4532-8855-fb0d6e786747\") " pod="openstack/root-account-create-update-9dcsn" Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.416000 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-ee19-account-create-update-jjnq9"] Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.430183 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-68e0-account-create-update-8dkxv"] Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.432839 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l8m7g\" (UniqueName: \"kubernetes.io/projected/db40f23c-3580-4532-8855-fb0d6e786747-kube-api-access-l8m7g\") pod \"root-account-create-update-9dcsn\" (UID: \"db40f23c-3580-4532-8855-fb0d6e786747\") " 
pod="openstack/root-account-create-update-9dcsn" Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.438097 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db40f23c-3580-4532-8855-fb0d6e786747-operator-scripts\") pod \"root-account-create-update-9dcsn\" (UID: \"db40f23c-3580-4532-8855-fb0d6e786747\") " pod="openstack/root-account-create-update-9dcsn" Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.438790 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db40f23c-3580-4532-8855-fb0d6e786747-operator-scripts\") pod \"root-account-create-update-9dcsn\" (UID: \"db40f23c-3580-4532-8855-fb0d6e786747\") " pod="openstack/root-account-create-update-9dcsn" Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.462082 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l8m7g\" (UniqueName: \"kubernetes.io/projected/db40f23c-3580-4532-8855-fb0d6e786747-kube-api-access-l8m7g\") pod \"root-account-create-update-9dcsn\" (UID: \"db40f23c-3580-4532-8855-fb0d6e786747\") " pod="openstack/root-account-create-update-9dcsn" Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.472795 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-wvx2d"] Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.568332 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-9dcsn" Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.595330 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-bkrl2"] Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.985774 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ee19-account-create-update-jjnq9" event={"ID":"a6efe097-248d-4d68-b2e0-172c2d005a17","Type":"ContainerStarted","Data":"7b12a6157627efa83996ef02ace60c25a9a0e0cabd3a6894d5725e76994f7494"} Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.985820 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ee19-account-create-update-jjnq9" event={"ID":"a6efe097-248d-4d68-b2e0-172c2d005a17","Type":"ContainerStarted","Data":"80e0208124460a11d6de2b432bfbe73e1a3861b077a951f39a7073d949cc9b8e"} Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.989760 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68e0-account-create-update-8dkxv" event={"ID":"e554d8c1-24d9-4dff-8495-2fb41208cdad","Type":"ContainerStarted","Data":"b166582adfa64767b2638eb7d18411fbcc0b9494239d9daa23730365c03f52ca"} Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.989798 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68e0-account-create-update-8dkxv" event={"ID":"e554d8c1-24d9-4dff-8495-2fb41208cdad","Type":"ContainerStarted","Data":"4c737851dd187ed813e99921d75d0df66c4ed0b1f038e7cdc4a46565be91c793"} Mar 20 13:44:15 crc kubenswrapper[4973]: I0320 13:44:15.995798 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0ba4-account-create-update-lcrs9" event={"ID":"707ca787-d762-44eb-932e-0130c091ae6a","Type":"ContainerStarted","Data":"f997ceb54f445e211dc7001f033ec32f59125665ce183ac0f2b537251562b27e"} Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.006236 4973 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/keystone-db-create-wvx2d" event={"ID":"1df509cd-ad47-4fa5-86bf-c2dc341259b7","Type":"ContainerStarted","Data":"0d0f4c2ce0083e20437758948379ee9d68025cdd2ef9f47004cd9391cf43fca9"} Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.006293 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wvx2d" event={"ID":"1df509cd-ad47-4fa5-86bf-c2dc341259b7","Type":"ContainerStarted","Data":"acd8f4aca1d4b1111cda3c5e195e3e517307fc461a086f9952c0b24698f93de8"} Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.024677 4973 generic.go:334] "Generic (PLEG): container finished" podID="e324e333-c4ed-44f5-abc5-0ee4083027eb" containerID="c9ea00565b7792f08df0951efad1b2003b631c4e010521a32cc7cc75fb9c366f" exitCode=0 Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.024749 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" event={"ID":"e324e333-c4ed-44f5-abc5-0ee4083027eb","Type":"ContainerDied","Data":"c9ea00565b7792f08df0951efad1b2003b631c4e010521a32cc7cc75fb9c366f"} Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.029608 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.031457 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-ee19-account-create-update-jjnq9" podStartSLOduration=7.031385419 podStartE2EDuration="7.031385419s" podCreationTimestamp="2026-03-20 13:44:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:16.017115639 +0000 UTC m=+1376.760785383" watchObservedRunningTime="2026-03-20 13:44:16.031385419 +0000 UTC m=+1376.775055163" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.040940 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-qjvlf" event={"ID":"dbc7e778-9029-42ce-9a8e-e76636aad6a5","Type":"ContainerDied","Data":"6d2d9cbba70b40d11f35eaeff96830c8b0b223472e33ba69d6245757221204ae"} Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.040986 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d2d9cbba70b40d11f35eaeff96830c8b0b223472e33ba69d6245757221204ae" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.051375 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-bkrl2" event={"ID":"9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8","Type":"ContainerStarted","Data":"8b79f35a8376e89cdf00dce65fd6d6452a41aa72ce0ebd3d6a301b449f1882a0"} Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.057113 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"642c51b4-2774-4114-808c-1fb722862437","Type":"ContainerStarted","Data":"2e35fac61505298631322828fcdedf86c60d2615539c2e83fe79fc0f40f3ac65"} Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.063069 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xbx75" 
event={"ID":"33bc7d77-fed7-4177-83a1-6eff4b55ee6d","Type":"ContainerStarted","Data":"1986de820988ad7690038fa641c33b650c37a027712bbbf0656e4aaecddf6afc"} Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.077971 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-wvx2d" podStartSLOduration=7.077953159 podStartE2EDuration="7.077953159s" podCreationTimestamp="2026-03-20 13:44:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:16.051723884 +0000 UTC m=+1376.795393628" watchObservedRunningTime="2026-03-20 13:44:16.077953159 +0000 UTC m=+1376.821622903" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.126984 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0ba4-account-create-update-lcrs9" podStartSLOduration=6.126963887 podStartE2EDuration="6.126963887s" podCreationTimestamp="2026-03-20 13:44:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:16.071598257 +0000 UTC m=+1376.815267991" watchObservedRunningTime="2026-03-20 13:44:16.126963887 +0000 UTC m=+1376.870633621" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.177408 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-xbx75" podStartSLOduration=7.177378793 podStartE2EDuration="7.177378793s" podCreationTimestamp="2026-03-20 13:44:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:16.14172384 +0000 UTC m=+1376.885393584" watchObservedRunningTime="2026-03-20 13:44:16.177378793 +0000 UTC m=+1376.921048537" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.274923 4973 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-5p8qj\" (UniqueName: \"kubernetes.io/projected/dbc7e778-9029-42ce-9a8e-e76636aad6a5-kube-api-access-5p8qj\") pod \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.275089 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dbc7e778-9029-42ce-9a8e-e76636aad6a5-dispersionconf\") pod \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.275117 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dbc7e778-9029-42ce-9a8e-e76636aad6a5-ring-data-devices\") pod \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.275184 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dbc7e778-9029-42ce-9a8e-e76636aad6a5-swiftconf\") pod \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.275247 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dbc7e778-9029-42ce-9a8e-e76636aad6a5-etc-swift\") pod \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.275395 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc7e778-9029-42ce-9a8e-e76636aad6a5-combined-ca-bundle\") pod \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\" (UID: 
\"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.275454 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbc7e778-9029-42ce-9a8e-e76636aad6a5-scripts\") pod \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\" (UID: \"dbc7e778-9029-42ce-9a8e-e76636aad6a5\") " Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.276608 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dbc7e778-9029-42ce-9a8e-e76636aad6a5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "dbc7e778-9029-42ce-9a8e-e76636aad6a5" (UID: "dbc7e778-9029-42ce-9a8e-e76636aad6a5"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.277277 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc7e778-9029-42ce-9a8e-e76636aad6a5-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "dbc7e778-9029-42ce-9a8e-e76636aad6a5" (UID: "dbc7e778-9029-42ce-9a8e-e76636aad6a5"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.278310 4973 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/dbc7e778-9029-42ce-9a8e-e76636aad6a5-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.278325 4973 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/dbc7e778-9029-42ce-9a8e-e76636aad6a5-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.306790 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dbc7e778-9029-42ce-9a8e-e76636aad6a5-kube-api-access-5p8qj" (OuterVolumeSpecName: "kube-api-access-5p8qj") pod "dbc7e778-9029-42ce-9a8e-e76636aad6a5" (UID: "dbc7e778-9029-42ce-9a8e-e76636aad6a5"). InnerVolumeSpecName "kube-api-access-5p8qj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.331627 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc7e778-9029-42ce-9a8e-e76636aad6a5-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "dbc7e778-9029-42ce-9a8e-e76636aad6a5" (UID: "dbc7e778-9029-42ce-9a8e-e76636aad6a5"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.379907 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5p8qj\" (UniqueName: \"kubernetes.io/projected/dbc7e778-9029-42ce-9a8e-e76636aad6a5-kube-api-access-5p8qj\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.379968 4973 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/dbc7e778-9029-42ce-9a8e-e76636aad6a5-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.396661 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc7e778-9029-42ce-9a8e-e76636aad6a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dbc7e778-9029-42ce-9a8e-e76636aad6a5" (UID: "dbc7e778-9029-42ce-9a8e-e76636aad6a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.397788 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dbc7e778-9029-42ce-9a8e-e76636aad6a5-scripts" (OuterVolumeSpecName: "scripts") pod "dbc7e778-9029-42ce-9a8e-e76636aad6a5" (UID: "dbc7e778-9029-42ce-9a8e-e76636aad6a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.402379 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dbc7e778-9029-42ce-9a8e-e76636aad6a5-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "dbc7e778-9029-42ce-9a8e-e76636aad6a5" (UID: "dbc7e778-9029-42ce-9a8e-e76636aad6a5"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.483814 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dbc7e778-9029-42ce-9a8e-e76636aad6a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.483855 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/dbc7e778-9029-42ce-9a8e-e76636aad6a5-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.483863 4973 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/dbc7e778-9029-42ce-9a8e-e76636aad6a5-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.626773 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.694877 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e324e333-c4ed-44f5-abc5-0ee4083027eb-dns-svc\") pod \"e324e333-c4ed-44f5-abc5-0ee4083027eb\" (UID: \"e324e333-c4ed-44f5-abc5-0ee4083027eb\") " Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.694952 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4sdw\" (UniqueName: \"kubernetes.io/projected/e324e333-c4ed-44f5-abc5-0ee4083027eb-kube-api-access-l4sdw\") pod \"e324e333-c4ed-44f5-abc5-0ee4083027eb\" (UID: \"e324e333-c4ed-44f5-abc5-0ee4083027eb\") " Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.695082 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e324e333-c4ed-44f5-abc5-0ee4083027eb-config\") pod 
\"e324e333-c4ed-44f5-abc5-0ee4083027eb\" (UID: \"e324e333-c4ed-44f5-abc5-0ee4083027eb\") " Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.703383 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e324e333-c4ed-44f5-abc5-0ee4083027eb-kube-api-access-l4sdw" (OuterVolumeSpecName: "kube-api-access-l4sdw") pod "e324e333-c4ed-44f5-abc5-0ee4083027eb" (UID: "e324e333-c4ed-44f5-abc5-0ee4083027eb"). InnerVolumeSpecName "kube-api-access-l4sdw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.749203 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-9dcsn"] Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.776269 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e324e333-c4ed-44f5-abc5-0ee4083027eb-config" (OuterVolumeSpecName: "config") pod "e324e333-c4ed-44f5-abc5-0ee4083027eb" (UID: "e324e333-c4ed-44f5-abc5-0ee4083027eb"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.801080 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4sdw\" (UniqueName: \"kubernetes.io/projected/e324e333-c4ed-44f5-abc5-0ee4083027eb-kube-api-access-l4sdw\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.801115 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e324e333-c4ed-44f5-abc5-0ee4083027eb-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.817155 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e324e333-c4ed-44f5-abc5-0ee4083027eb-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e324e333-c4ed-44f5-abc5-0ee4083027eb" (UID: "e324e333-c4ed-44f5-abc5-0ee4083027eb"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:16 crc kubenswrapper[4973]: I0320 13:44:16.903293 4973 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e324e333-c4ed-44f5-abc5-0ee4083027eb-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.074837 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9dcsn" event={"ID":"db40f23c-3580-4532-8855-fb0d6e786747","Type":"ContainerStarted","Data":"40596f8f693b6dfd9035c7282f1cb4266148850bd96fb19c0cd43b9c6477025c"} Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.076957 4973 generic.go:334] "Generic (PLEG): container finished" podID="33bc7d77-fed7-4177-83a1-6eff4b55ee6d" containerID="1986de820988ad7690038fa641c33b650c37a027712bbbf0656e4aaecddf6afc" exitCode=0 Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.077029 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xbx75" 
event={"ID":"33bc7d77-fed7-4177-83a1-6eff4b55ee6d","Type":"ContainerDied","Data":"1986de820988ad7690038fa641c33b650c37a027712bbbf0656e4aaecddf6afc"} Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.079110 4973 generic.go:334] "Generic (PLEG): container finished" podID="1df509cd-ad47-4fa5-86bf-c2dc341259b7" containerID="0d0f4c2ce0083e20437758948379ee9d68025cdd2ef9f47004cd9391cf43fca9" exitCode=0 Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.079186 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wvx2d" event={"ID":"1df509cd-ad47-4fa5-86bf-c2dc341259b7","Type":"ContainerDied","Data":"0d0f4c2ce0083e20437758948379ee9d68025cdd2ef9f47004cd9391cf43fca9"} Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.081030 4973 generic.go:334] "Generic (PLEG): container finished" podID="a6efe097-248d-4d68-b2e0-172c2d005a17" containerID="7b12a6157627efa83996ef02ace60c25a9a0e0cabd3a6894d5725e76994f7494" exitCode=0 Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.081092 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ee19-account-create-update-jjnq9" event={"ID":"a6efe097-248d-4d68-b2e0-172c2d005a17","Type":"ContainerDied","Data":"7b12a6157627efa83996ef02ace60c25a9a0e0cabd3a6894d5725e76994f7494"} Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.082977 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" event={"ID":"e324e333-c4ed-44f5-abc5-0ee4083027eb","Type":"ContainerDied","Data":"3d53830d21a3c218183322b1d7819cc6551106444a7f42303b2992ee1f35dcd9"} Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.083016 4973 scope.go:117] "RemoveContainer" containerID="c9ea00565b7792f08df0951efad1b2003b631c4e010521a32cc7cc75fb9c366f" Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.083146 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.090096 4973 generic.go:334] "Generic (PLEG): container finished" podID="9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8" containerID="8c42b9802a8cc2c4fc62f3ddda1fc8da9a9e7b8a0b4dcf1e86c5ec99f37e570e" exitCode=0 Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.090177 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-bkrl2" event={"ID":"9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8","Type":"ContainerDied","Data":"8c42b9802a8cc2c4fc62f3ddda1fc8da9a9e7b8a0b4dcf1e86c5ec99f37e570e"} Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.093333 4973 generic.go:334] "Generic (PLEG): container finished" podID="e554d8c1-24d9-4dff-8495-2fb41208cdad" containerID="b166582adfa64767b2638eb7d18411fbcc0b9494239d9daa23730365c03f52ca" exitCode=0 Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.093402 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68e0-account-create-update-8dkxv" event={"ID":"e554d8c1-24d9-4dff-8495-2fb41208cdad","Type":"ContainerDied","Data":"b166582adfa64767b2638eb7d18411fbcc0b9494239d9daa23730365c03f52ca"} Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.096320 4973 generic.go:334] "Generic (PLEG): container finished" podID="707ca787-d762-44eb-932e-0130c091ae6a" containerID="f997ceb54f445e211dc7001f033ec32f59125665ce183ac0f2b537251562b27e" exitCode=0 Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.096404 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0ba4-account-create-update-lcrs9" event={"ID":"707ca787-d762-44eb-932e-0130c091ae6a","Type":"ContainerDied","Data":"f997ceb54f445e211dc7001f033ec32f59125665ce183ac0f2b537251562b27e"} Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.096442 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-qjvlf" Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.113894 4973 scope.go:117] "RemoveContainer" containerID="af5781a711f799e3009428c39379e5691c00fa06d55896f42a46404b5df0af2d" Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.246330 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-9jcwr"] Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.255936 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7cb5889db5-9jcwr"] Mar 20 13:44:17 crc kubenswrapper[4973]: I0320 13:44:17.963784 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e324e333-c4ed-44f5-abc5-0ee4083027eb" path="/var/lib/kubelet/pods/e324e333-c4ed-44f5-abc5-0ee4083027eb/volumes" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.111809 4973 generic.go:334] "Generic (PLEG): container finished" podID="db40f23c-3580-4532-8855-fb0d6e786747" containerID="efb96e669bfd1f34ec8068edc1c1938749d9e8206ecf2a33f71de536aec83da2" exitCode=0 Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.112511 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9dcsn" event={"ID":"db40f23c-3580-4532-8855-fb0d6e786747","Type":"ContainerDied","Data":"efb96e669bfd1f34ec8068edc1c1938749d9e8206ecf2a33f71de536aec83da2"} Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.302556 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-v59x5"] Mar 20 13:44:18 crc kubenswrapper[4973]: E0320 13:44:18.303304 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dbc7e778-9029-42ce-9a8e-e76636aad6a5" containerName="swift-ring-rebalance" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.303317 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="dbc7e778-9029-42ce-9a8e-e76636aad6a5" containerName="swift-ring-rebalance" Mar 20 13:44:18 crc kubenswrapper[4973]: 
E0320 13:44:18.303326 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e324e333-c4ed-44f5-abc5-0ee4083027eb" containerName="dnsmasq-dns" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.303349 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e324e333-c4ed-44f5-abc5-0ee4083027eb" containerName="dnsmasq-dns" Mar 20 13:44:18 crc kubenswrapper[4973]: E0320 13:44:18.303390 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e324e333-c4ed-44f5-abc5-0ee4083027eb" containerName="init" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.303397 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e324e333-c4ed-44f5-abc5-0ee4083027eb" containerName="init" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.303579 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="e324e333-c4ed-44f5-abc5-0ee4083027eb" containerName="dnsmasq-dns" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.303593 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="dbc7e778-9029-42ce-9a8e-e76636aad6a5" containerName="swift-ring-rebalance" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.305221 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-v59x5" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.310221 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-v59x5"] Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.333966 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx627\" (UniqueName: \"kubernetes.io/projected/99298e82-7167-49cb-ae27-e107a53c57d8-kube-api-access-cx627\") pod \"glance-db-create-v59x5\" (UID: \"99298e82-7167-49cb-ae27-e107a53c57d8\") " pod="openstack/glance-db-create-v59x5" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.334023 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99298e82-7167-49cb-ae27-e107a53c57d8-operator-scripts\") pod \"glance-db-create-v59x5\" (UID: \"99298e82-7167-49cb-ae27-e107a53c57d8\") " pod="openstack/glance-db-create-v59x5" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.399208 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-6399-account-create-update-tkjxp"] Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.401116 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6399-account-create-update-tkjxp" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.403543 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.409255 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6399-account-create-update-tkjxp"] Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.435329 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cx627\" (UniqueName: \"kubernetes.io/projected/99298e82-7167-49cb-ae27-e107a53c57d8-kube-api-access-cx627\") pod \"glance-db-create-v59x5\" (UID: \"99298e82-7167-49cb-ae27-e107a53c57d8\") " pod="openstack/glance-db-create-v59x5" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.435395 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99298e82-7167-49cb-ae27-e107a53c57d8-operator-scripts\") pod \"glance-db-create-v59x5\" (UID: \"99298e82-7167-49cb-ae27-e107a53c57d8\") " pod="openstack/glance-db-create-v59x5" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.435460 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvtbq\" (UniqueName: \"kubernetes.io/projected/3d64a6d2-5f83-480e-a594-7e633e0e0586-kube-api-access-bvtbq\") pod \"glance-6399-account-create-update-tkjxp\" (UID: \"3d64a6d2-5f83-480e-a594-7e633e0e0586\") " pod="openstack/glance-6399-account-create-update-tkjxp" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.435574 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d64a6d2-5f83-480e-a594-7e633e0e0586-operator-scripts\") pod \"glance-6399-account-create-update-tkjxp\" (UID: 
\"3d64a6d2-5f83-480e-a594-7e633e0e0586\") " pod="openstack/glance-6399-account-create-update-tkjxp" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.436317 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99298e82-7167-49cb-ae27-e107a53c57d8-operator-scripts\") pod \"glance-db-create-v59x5\" (UID: \"99298e82-7167-49cb-ae27-e107a53c57d8\") " pod="openstack/glance-db-create-v59x5" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.467773 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx627\" (UniqueName: \"kubernetes.io/projected/99298e82-7167-49cb-ae27-e107a53c57d8-kube-api-access-cx627\") pod \"glance-db-create-v59x5\" (UID: \"99298e82-7167-49cb-ae27-e107a53c57d8\") " pod="openstack/glance-db-create-v59x5" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.538012 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvtbq\" (UniqueName: \"kubernetes.io/projected/3d64a6d2-5f83-480e-a594-7e633e0e0586-kube-api-access-bvtbq\") pod \"glance-6399-account-create-update-tkjxp\" (UID: \"3d64a6d2-5f83-480e-a594-7e633e0e0586\") " pod="openstack/glance-6399-account-create-update-tkjxp" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.538502 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d64a6d2-5f83-480e-a594-7e633e0e0586-operator-scripts\") pod \"glance-6399-account-create-update-tkjxp\" (UID: \"3d64a6d2-5f83-480e-a594-7e633e0e0586\") " pod="openstack/glance-6399-account-create-update-tkjxp" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.539299 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d64a6d2-5f83-480e-a594-7e633e0e0586-operator-scripts\") pod \"glance-6399-account-create-update-tkjxp\" 
(UID: \"3d64a6d2-5f83-480e-a594-7e633e0e0586\") " pod="openstack/glance-6399-account-create-update-tkjxp" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.565029 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvtbq\" (UniqueName: \"kubernetes.io/projected/3d64a6d2-5f83-480e-a594-7e633e0e0586-kube-api-access-bvtbq\") pod \"glance-6399-account-create-update-tkjxp\" (UID: \"3d64a6d2-5f83-480e-a594-7e633e0e0586\") " pod="openstack/glance-6399-account-create-update-tkjxp" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.637442 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v59x5" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.721864 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6399-account-create-update-tkjxp" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.862147 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-bkrl2" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.952590 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rqph\" (UniqueName: \"kubernetes.io/projected/9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8-kube-api-access-5rqph\") pod \"9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8\" (UID: \"9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8\") " Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.952675 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8-operator-scripts\") pod \"9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8\" (UID: \"9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8\") " Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.954647 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8" (UID: "9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:18 crc kubenswrapper[4973]: I0320 13:44:18.964715 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8-kube-api-access-5rqph" (OuterVolumeSpecName: "kube-api-access-5rqph") pod "9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8" (UID: "9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8"). InnerVolumeSpecName "kube-api-access-5rqph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.068767 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rqph\" (UniqueName: \"kubernetes.io/projected/9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8-kube-api-access-5rqph\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.068806 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.138695 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-db-create-bkrl2" event={"ID":"9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8","Type":"ContainerDied","Data":"8b79f35a8376e89cdf00dce65fd6d6452a41aa72ce0ebd3d6a301b449f1882a0"} Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.139023 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b79f35a8376e89cdf00dce65fd6d6452a41aa72ce0ebd3d6a301b449f1882a0" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.139104 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-db-create-bkrl2" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.148203 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"642c51b4-2774-4114-808c-1fb722862437","Type":"ContainerStarted","Data":"97cecec69536e1f8243868aef7821cad1dcd935afc8c9c21dd2f6bb5fe575497"} Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.159294 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-wvx2d" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.171214 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5444n\" (UniqueName: \"kubernetes.io/projected/1df509cd-ad47-4fa5-86bf-c2dc341259b7-kube-api-access-5444n\") pod \"1df509cd-ad47-4fa5-86bf-c2dc341259b7\" (UID: \"1df509cd-ad47-4fa5-86bf-c2dc341259b7\") " Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.171308 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1df509cd-ad47-4fa5-86bf-c2dc341259b7-operator-scripts\") pod \"1df509cd-ad47-4fa5-86bf-c2dc341259b7\" (UID: \"1df509cd-ad47-4fa5-86bf-c2dc341259b7\") " Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.172764 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1df509cd-ad47-4fa5-86bf-c2dc341259b7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1df509cd-ad47-4fa5-86bf-c2dc341259b7" (UID: "1df509cd-ad47-4fa5-86bf-c2dc341259b7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.181661 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1df509cd-ad47-4fa5-86bf-c2dc341259b7-kube-api-access-5444n" (OuterVolumeSpecName: "kube-api-access-5444n") pod "1df509cd-ad47-4fa5-86bf-c2dc341259b7" (UID: "1df509cd-ad47-4fa5-86bf-c2dc341259b7"). InnerVolumeSpecName "kube-api-access-5444n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.231977 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ee19-account-create-update-jjnq9" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.272739 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdj2q\" (UniqueName: \"kubernetes.io/projected/a6efe097-248d-4d68-b2e0-172c2d005a17-kube-api-access-wdj2q\") pod \"a6efe097-248d-4d68-b2e0-172c2d005a17\" (UID: \"a6efe097-248d-4d68-b2e0-172c2d005a17\") " Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.273083 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6efe097-248d-4d68-b2e0-172c2d005a17-operator-scripts\") pod \"a6efe097-248d-4d68-b2e0-172c2d005a17\" (UID: \"a6efe097-248d-4d68-b2e0-172c2d005a17\") " Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.273612 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a6efe097-248d-4d68-b2e0-172c2d005a17-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a6efe097-248d-4d68-b2e0-172c2d005a17" (UID: "a6efe097-248d-4d68-b2e0-172c2d005a17"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.273690 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5444n\" (UniqueName: \"kubernetes.io/projected/1df509cd-ad47-4fa5-86bf-c2dc341259b7-kube-api-access-5444n\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.273707 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1df509cd-ad47-4fa5-86bf-c2dc341259b7-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.275811 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0ba4-account-create-update-lcrs9" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.276811 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6efe097-248d-4d68-b2e0-172c2d005a17-kube-api-access-wdj2q" (OuterVolumeSpecName: "kube-api-access-wdj2q") pod "a6efe097-248d-4d68-b2e0-172c2d005a17" (UID: "a6efe097-248d-4d68-b2e0-172c2d005a17"). InnerVolumeSpecName "kube-api-access-wdj2q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.309058 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-68e0-account-create-update-8dkxv" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.376501 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/707ca787-d762-44eb-932e-0130c091ae6a-operator-scripts\") pod \"707ca787-d762-44eb-932e-0130c091ae6a\" (UID: \"707ca787-d762-44eb-932e-0130c091ae6a\") " Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.376736 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n6bc\" (UniqueName: \"kubernetes.io/projected/e554d8c1-24d9-4dff-8495-2fb41208cdad-kube-api-access-9n6bc\") pod \"e554d8c1-24d9-4dff-8495-2fb41208cdad\" (UID: \"e554d8c1-24d9-4dff-8495-2fb41208cdad\") " Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.376787 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpxzc\" (UniqueName: \"kubernetes.io/projected/707ca787-d762-44eb-932e-0130c091ae6a-kube-api-access-tpxzc\") pod \"707ca787-d762-44eb-932e-0130c091ae6a\" (UID: \"707ca787-d762-44eb-932e-0130c091ae6a\") " Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.376858 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e554d8c1-24d9-4dff-8495-2fb41208cdad-operator-scripts\") pod \"e554d8c1-24d9-4dff-8495-2fb41208cdad\" (UID: \"e554d8c1-24d9-4dff-8495-2fb41208cdad\") " Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.377495 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a6efe097-248d-4d68-b2e0-172c2d005a17-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.377515 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdj2q\" (UniqueName: \"kubernetes.io/projected/a6efe097-248d-4d68-b2e0-172c2d005a17-kube-api-access-wdj2q\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.379236 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e554d8c1-24d9-4dff-8495-2fb41208cdad-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e554d8c1-24d9-4dff-8495-2fb41208cdad" (UID: "e554d8c1-24d9-4dff-8495-2fb41208cdad"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.379632 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/707ca787-d762-44eb-932e-0130c091ae6a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "707ca787-d762-44eb-932e-0130c091ae6a" (UID: "707ca787-d762-44eb-932e-0130c091ae6a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.386147 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/707ca787-d762-44eb-932e-0130c091ae6a-kube-api-access-tpxzc" (OuterVolumeSpecName: "kube-api-access-tpxzc") pod "707ca787-d762-44eb-932e-0130c091ae6a" (UID: "707ca787-d762-44eb-932e-0130c091ae6a"). InnerVolumeSpecName "kube-api-access-tpxzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.393553 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e554d8c1-24d9-4dff-8495-2fb41208cdad-kube-api-access-9n6bc" (OuterVolumeSpecName: "kube-api-access-9n6bc") pod "e554d8c1-24d9-4dff-8495-2fb41208cdad" (UID: "e554d8c1-24d9-4dff-8495-2fb41208cdad"). InnerVolumeSpecName "kube-api-access-9n6bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.422316 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-xbx75" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.490331 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/707ca787-d762-44eb-932e-0130c091ae6a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.490462 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n6bc\" (UniqueName: \"kubernetes.io/projected/e554d8c1-24d9-4dff-8495-2fb41208cdad-kube-api-access-9n6bc\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.490477 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpxzc\" (UniqueName: \"kubernetes.io/projected/707ca787-d762-44eb-932e-0130c091ae6a-kube-api-access-tpxzc\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.490491 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e554d8c1-24d9-4dff-8495-2fb41208cdad-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.602078 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpbtv\" (UniqueName: \"kubernetes.io/projected/33bc7d77-fed7-4177-83a1-6eff4b55ee6d-kube-api-access-wpbtv\") pod \"33bc7d77-fed7-4177-83a1-6eff4b55ee6d\" (UID: \"33bc7d77-fed7-4177-83a1-6eff4b55ee6d\") " Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.602171 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33bc7d77-fed7-4177-83a1-6eff4b55ee6d-operator-scripts\") pod \"33bc7d77-fed7-4177-83a1-6eff4b55ee6d\" (UID: \"33bc7d77-fed7-4177-83a1-6eff4b55ee6d\") " Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.603274 4973 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33bc7d77-fed7-4177-83a1-6eff4b55ee6d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "33bc7d77-fed7-4177-83a1-6eff4b55ee6d" (UID: "33bc7d77-fed7-4177-83a1-6eff4b55ee6d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.629608 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33bc7d77-fed7-4177-83a1-6eff4b55ee6d-kube-api-access-wpbtv" (OuterVolumeSpecName: "kube-api-access-wpbtv") pod "33bc7d77-fed7-4177-83a1-6eff4b55ee6d" (UID: "33bc7d77-fed7-4177-83a1-6eff4b55ee6d"). InnerVolumeSpecName "kube-api-access-wpbtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.705599 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpbtv\" (UniqueName: \"kubernetes.io/projected/33bc7d77-fed7-4177-83a1-6eff4b55ee6d-kube-api-access-wpbtv\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.705907 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33bc7d77-fed7-4177-83a1-6eff4b55ee6d-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.859633 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-v59x5"] Mar 20 13:44:19 crc kubenswrapper[4973]: W0320 13:44:19.872751 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod99298e82_7167_49cb_ae27_e107a53c57d8.slice/crio-820774501eaca5fa0665471391e800f809219fa8327ec6100c3718a4b4133312 WatchSource:0}: Error finding container 820774501eaca5fa0665471391e800f809219fa8327ec6100c3718a4b4133312: Status 404 returned error can't find the container with 
id 820774501eaca5fa0665471391e800f809219fa8327ec6100c3718a4b4133312 Mar 20 13:44:19 crc kubenswrapper[4973]: I0320 13:44:19.985181 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-6399-account-create-update-tkjxp"] Mar 20 13:44:19 crc kubenswrapper[4973]: W0320 13:44:19.993068 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3d64a6d2_5f83_480e_a594_7e633e0e0586.slice/crio-87c9e93a8fbc389f77315a9dedca3802584bbb67f43ed932587dc04cce09a8fc WatchSource:0}: Error finding container 87c9e93a8fbc389f77315a9dedca3802584bbb67f43ed932587dc04cce09a8fc: Status 404 returned error can't find the container with id 87c9e93a8fbc389f77315a9dedca3802584bbb67f43ed932587dc04cce09a8fc Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.012139 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.094273 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9dcsn" Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.181779 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-9dcsn" event={"ID":"db40f23c-3580-4532-8855-fb0d6e786747","Type":"ContainerDied","Data":"40596f8f693b6dfd9035c7282f1cb4266148850bd96fb19c0cd43b9c6477025c"} Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.181823 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40596f8f693b6dfd9035c7282f1cb4266148850bd96fb19c0cd43b9c6477025c" Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.181875 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-9dcsn" Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.187877 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-xbx75" Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.187960 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-xbx75" event={"ID":"33bc7d77-fed7-4177-83a1-6eff4b55ee6d","Type":"ContainerDied","Data":"4f125250e16156ccafaf676e8c0fcbae2d4345fcf2cd847572e5ece6030bed48"} Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.187984 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4f125250e16156ccafaf676e8c0fcbae2d4345fcf2cd847572e5ece6030bed48" Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.192970 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-wvx2d" Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.192969 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-wvx2d" event={"ID":"1df509cd-ad47-4fa5-86bf-c2dc341259b7","Type":"ContainerDied","Data":"acd8f4aca1d4b1111cda3c5e195e3e517307fc461a086f9952c0b24698f93de8"} Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.193073 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="acd8f4aca1d4b1111cda3c5e195e3e517307fc461a086f9952c0b24698f93de8" Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.196288 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-ee19-account-create-update-jjnq9" event={"ID":"a6efe097-248d-4d68-b2e0-172c2d005a17","Type":"ContainerDied","Data":"80e0208124460a11d6de2b432bfbe73e1a3861b077a951f39a7073d949cc9b8e"} Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.196317 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80e0208124460a11d6de2b432bfbe73e1a3861b077a951f39a7073d949cc9b8e" Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.196368 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-ee19-account-create-update-jjnq9" Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.198811 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6399-account-create-update-tkjxp" event={"ID":"3d64a6d2-5f83-480e-a594-7e633e0e0586","Type":"ContainerStarted","Data":"87c9e93a8fbc389f77315a9dedca3802584bbb67f43ed932587dc04cce09a8fc"} Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.205401 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v59x5" event={"ID":"99298e82-7167-49cb-ae27-e107a53c57d8","Type":"ContainerStarted","Data":"820774501eaca5fa0665471391e800f809219fa8327ec6100c3718a4b4133312"} Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.208775 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-68e0-account-create-update-8dkxv" event={"ID":"e554d8c1-24d9-4dff-8495-2fb41208cdad","Type":"ContainerDied","Data":"4c737851dd187ed813e99921d75d0df66c4ed0b1f038e7cdc4a46565be91c793"} Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.208813 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c737851dd187ed813e99921d75d0df66c4ed0b1f038e7cdc4a46565be91c793" Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.208876 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-68e0-account-create-update-8dkxv" Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.214833 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db40f23c-3580-4532-8855-fb0d6e786747-operator-scripts\") pod \"db40f23c-3580-4532-8855-fb0d6e786747\" (UID: \"db40f23c-3580-4532-8855-fb0d6e786747\") " Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.215008 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l8m7g\" (UniqueName: \"kubernetes.io/projected/db40f23c-3580-4532-8855-fb0d6e786747-kube-api-access-l8m7g\") pod \"db40f23c-3580-4532-8855-fb0d6e786747\" (UID: \"db40f23c-3580-4532-8855-fb0d6e786747\") " Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.216670 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db40f23c-3580-4532-8855-fb0d6e786747-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "db40f23c-3580-4532-8855-fb0d6e786747" (UID: "db40f23c-3580-4532-8855-fb0d6e786747"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.219227 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0ba4-account-create-update-lcrs9" event={"ID":"707ca787-d762-44eb-932e-0130c091ae6a","Type":"ContainerDied","Data":"9c2abd9d87008d2f3638cb6e10ef79297d88df6b9856625ef42ed2cde28d3573"} Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.219267 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c2abd9d87008d2f3638cb6e10ef79297d88df6b9856625ef42ed2cde28d3573" Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.219329 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0ba4-account-create-update-lcrs9" Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.248926 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db40f23c-3580-4532-8855-fb0d6e786747-kube-api-access-l8m7g" (OuterVolumeSpecName: "kube-api-access-l8m7g") pod "db40f23c-3580-4532-8855-fb0d6e786747" (UID: "db40f23c-3580-4532-8855-fb0d6e786747"). InnerVolumeSpecName "kube-api-access-l8m7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.317882 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/db40f23c-3580-4532-8855-fb0d6e786747-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:20 crc kubenswrapper[4973]: I0320 13:44:20.318813 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l8m7g\" (UniqueName: \"kubernetes.io/projected/db40f23c-3580-4532-8855-fb0d6e786747-kube-api-access-l8m7g\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.092761 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7cb5889db5-9jcwr" podUID="e324e333-c4ed-44f5-abc5-0ee4083027eb" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.147:5353: i/o timeout" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.101450 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf"] Mar 20 13:44:21 crc kubenswrapper[4973]: E0320 13:44:21.102237 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="707ca787-d762-44eb-932e-0130c091ae6a" containerName="mariadb-account-create-update" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.102323 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="707ca787-d762-44eb-932e-0130c091ae6a" 
containerName="mariadb-account-create-update" Mar 20 13:44:21 crc kubenswrapper[4973]: E0320 13:44:21.102440 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1df509cd-ad47-4fa5-86bf-c2dc341259b7" containerName="mariadb-database-create" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.102508 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="1df509cd-ad47-4fa5-86bf-c2dc341259b7" containerName="mariadb-database-create" Mar 20 13:44:21 crc kubenswrapper[4973]: E0320 13:44:21.102602 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6efe097-248d-4d68-b2e0-172c2d005a17" containerName="mariadb-account-create-update" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.102654 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6efe097-248d-4d68-b2e0-172c2d005a17" containerName="mariadb-account-create-update" Mar 20 13:44:21 crc kubenswrapper[4973]: E0320 13:44:21.102705 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e554d8c1-24d9-4dff-8495-2fb41208cdad" containerName="mariadb-account-create-update" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.102755 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e554d8c1-24d9-4dff-8495-2fb41208cdad" containerName="mariadb-account-create-update" Mar 20 13:44:21 crc kubenswrapper[4973]: E0320 13:44:21.102817 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db40f23c-3580-4532-8855-fb0d6e786747" containerName="mariadb-account-create-update" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.102908 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="db40f23c-3580-4532-8855-fb0d6e786747" containerName="mariadb-account-create-update" Mar 20 13:44:21 crc kubenswrapper[4973]: E0320 13:44:21.102966 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8" containerName="mariadb-database-create" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 
13:44:21.103028 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8" containerName="mariadb-database-create" Mar 20 13:44:21 crc kubenswrapper[4973]: E0320 13:44:21.103087 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33bc7d77-fed7-4177-83a1-6eff4b55ee6d" containerName="mariadb-database-create" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.103155 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="33bc7d77-fed7-4177-83a1-6eff4b55ee6d" containerName="mariadb-database-create" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.103478 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="707ca787-d762-44eb-932e-0130c091ae6a" containerName="mariadb-account-create-update" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.103572 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="33bc7d77-fed7-4177-83a1-6eff4b55ee6d" containerName="mariadb-database-create" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.103781 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="e554d8c1-24d9-4dff-8495-2fb41208cdad" containerName="mariadb-account-create-update" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.103855 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8" containerName="mariadb-database-create" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.103915 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="1df509cd-ad47-4fa5-86bf-c2dc341259b7" containerName="mariadb-database-create" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.104010 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6efe097-248d-4d68-b2e0-172c2d005a17" containerName="mariadb-account-create-update" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.104073 4973 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="db40f23c-3580-4532-8855-fb0d6e786747" containerName="mariadb-account-create-update" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.104870 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.127414 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf"] Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.134906 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88d9ea96-b45d-44d9-9aa0-60d2668a9a0c-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-7vzxf\" (UID: \"88d9ea96-b45d-44d9-9aa0-60d2668a9a0c\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.135225 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62dgq\" (UniqueName: \"kubernetes.io/projected/88d9ea96-b45d-44d9-9aa0-60d2668a9a0c-kube-api-access-62dgq\") pod \"mysqld-exporter-openstack-cell1-db-create-7vzxf\" (UID: \"88d9ea96-b45d-44d9-9aa0-60d2668a9a0c\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.235318 4973 generic.go:334] "Generic (PLEG): container finished" podID="3d64a6d2-5f83-480e-a594-7e633e0e0586" containerID="d244307f50c47c1f04e7c7e3897f5842a02a6df9aa1aef658d0d450bafe87e5f" exitCode=0 Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.235446 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6399-account-create-update-tkjxp" event={"ID":"3d64a6d2-5f83-480e-a594-7e633e0e0586","Type":"ContainerDied","Data":"d244307f50c47c1f04e7c7e3897f5842a02a6df9aa1aef658d0d450bafe87e5f"} Mar 20 13:44:21 crc 
kubenswrapper[4973]: I0320 13:44:21.236808 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88d9ea96-b45d-44d9-9aa0-60d2668a9a0c-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-7vzxf\" (UID: \"88d9ea96-b45d-44d9-9aa0-60d2668a9a0c\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.236911 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62dgq\" (UniqueName: \"kubernetes.io/projected/88d9ea96-b45d-44d9-9aa0-60d2668a9a0c-kube-api-access-62dgq\") pod \"mysqld-exporter-openstack-cell1-db-create-7vzxf\" (UID: \"88d9ea96-b45d-44d9-9aa0-60d2668a9a0c\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.238141 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88d9ea96-b45d-44d9-9aa0-60d2668a9a0c-operator-scripts\") pod \"mysqld-exporter-openstack-cell1-db-create-7vzxf\" (UID: \"88d9ea96-b45d-44d9-9aa0-60d2668a9a0c\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.241733 4973 generic.go:334] "Generic (PLEG): container finished" podID="99298e82-7167-49cb-ae27-e107a53c57d8" containerID="7d3afabbfc3c755ab5cba1e458d92066c4cc6d813750ceaa81aa3300c1767fa0" exitCode=0 Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.241772 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v59x5" event={"ID":"99298e82-7167-49cb-ae27-e107a53c57d8","Type":"ContainerDied","Data":"7d3afabbfc3c755ab5cba1e458d92066c4cc6d813750ceaa81aa3300c1767fa0"} Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.262146 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62dgq\" 
(UniqueName: \"kubernetes.io/projected/88d9ea96-b45d-44d9-9aa0-60d2668a9a0c-kube-api-access-62dgq\") pod \"mysqld-exporter-openstack-cell1-db-create-7vzxf\" (UID: \"88d9ea96-b45d-44d9-9aa0-60d2668a9a0c\") " pod="openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.327033 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-19e0-account-create-update-rkd5n"] Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.328840 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-19e0-account-create-update-rkd5n" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.333213 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-openstack-cell1-db-secret" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.340762 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-19e0-account-create-update-rkd5n"] Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.428877 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.441294 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4ce1c5d-178b-4adc-b4f5-7c354e2914d0-operator-scripts\") pod \"mysqld-exporter-19e0-account-create-update-rkd5n\" (UID: \"d4ce1c5d-178b-4adc-b4f5-7c354e2914d0\") " pod="openstack/mysqld-exporter-19e0-account-create-update-rkd5n" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.441438 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f82tb\" (UniqueName: \"kubernetes.io/projected/d4ce1c5d-178b-4adc-b4f5-7c354e2914d0-kube-api-access-f82tb\") pod \"mysqld-exporter-19e0-account-create-update-rkd5n\" (UID: \"d4ce1c5d-178b-4adc-b4f5-7c354e2914d0\") " pod="openstack/mysqld-exporter-19e0-account-create-update-rkd5n" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.499923 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-9dcsn"] Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.509777 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-9dcsn"] Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.543578 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4ce1c5d-178b-4adc-b4f5-7c354e2914d0-operator-scripts\") pod \"mysqld-exporter-19e0-account-create-update-rkd5n\" (UID: \"d4ce1c5d-178b-4adc-b4f5-7c354e2914d0\") " pod="openstack/mysqld-exporter-19e0-account-create-update-rkd5n" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.543678 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f82tb\" (UniqueName: 
\"kubernetes.io/projected/d4ce1c5d-178b-4adc-b4f5-7c354e2914d0-kube-api-access-f82tb\") pod \"mysqld-exporter-19e0-account-create-update-rkd5n\" (UID: \"d4ce1c5d-178b-4adc-b4f5-7c354e2914d0\") " pod="openstack/mysqld-exporter-19e0-account-create-update-rkd5n" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.544297 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4ce1c5d-178b-4adc-b4f5-7c354e2914d0-operator-scripts\") pod \"mysqld-exporter-19e0-account-create-update-rkd5n\" (UID: \"d4ce1c5d-178b-4adc-b4f5-7c354e2914d0\") " pod="openstack/mysqld-exporter-19e0-account-create-update-rkd5n" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.561788 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f82tb\" (UniqueName: \"kubernetes.io/projected/d4ce1c5d-178b-4adc-b4f5-7c354e2914d0-kube-api-access-f82tb\") pod \"mysqld-exporter-19e0-account-create-update-rkd5n\" (UID: \"d4ce1c5d-178b-4adc-b4f5-7c354e2914d0\") " pod="openstack/mysqld-exporter-19e0-account-create-update-rkd5n" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.649972 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-19e0-account-create-update-rkd5n" Mar 20 13:44:21 crc kubenswrapper[4973]: I0320 13:44:21.961106 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db40f23c-3580-4532-8855-fb0d6e786747" path="/var/lib/kubelet/pods/db40f23c-3580-4532-8855-fb0d6e786747/volumes" Mar 20 13:44:23 crc kubenswrapper[4973]: I0320 13:44:23.468052 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-v59x5" Mar 20 13:44:23 crc kubenswrapper[4973]: I0320 13:44:23.482210 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-6399-account-create-update-tkjxp" Mar 20 13:44:23 crc kubenswrapper[4973]: I0320 13:44:23.499815 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d64a6d2-5f83-480e-a594-7e633e0e0586-operator-scripts\") pod \"3d64a6d2-5f83-480e-a594-7e633e0e0586\" (UID: \"3d64a6d2-5f83-480e-a594-7e633e0e0586\") " Mar 20 13:44:23 crc kubenswrapper[4973]: I0320 13:44:23.500110 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvtbq\" (UniqueName: \"kubernetes.io/projected/3d64a6d2-5f83-480e-a594-7e633e0e0586-kube-api-access-bvtbq\") pod \"3d64a6d2-5f83-480e-a594-7e633e0e0586\" (UID: \"3d64a6d2-5f83-480e-a594-7e633e0e0586\") " Mar 20 13:44:23 crc kubenswrapper[4973]: I0320 13:44:23.500157 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99298e82-7167-49cb-ae27-e107a53c57d8-operator-scripts\") pod \"99298e82-7167-49cb-ae27-e107a53c57d8\" (UID: \"99298e82-7167-49cb-ae27-e107a53c57d8\") " Mar 20 13:44:23 crc kubenswrapper[4973]: I0320 13:44:23.500274 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx627\" (UniqueName: \"kubernetes.io/projected/99298e82-7167-49cb-ae27-e107a53c57d8-kube-api-access-cx627\") pod \"99298e82-7167-49cb-ae27-e107a53c57d8\" (UID: \"99298e82-7167-49cb-ae27-e107a53c57d8\") " Mar 20 13:44:23 crc kubenswrapper[4973]: I0320 13:44:23.502805 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99298e82-7167-49cb-ae27-e107a53c57d8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "99298e82-7167-49cb-ae27-e107a53c57d8" (UID: "99298e82-7167-49cb-ae27-e107a53c57d8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:23 crc kubenswrapper[4973]: I0320 13:44:23.503085 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d64a6d2-5f83-480e-a594-7e633e0e0586-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3d64a6d2-5f83-480e-a594-7e633e0e0586" (UID: "3d64a6d2-5f83-480e-a594-7e633e0e0586"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:23 crc kubenswrapper[4973]: I0320 13:44:23.509867 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d64a6d2-5f83-480e-a594-7e633e0e0586-kube-api-access-bvtbq" (OuterVolumeSpecName: "kube-api-access-bvtbq") pod "3d64a6d2-5f83-480e-a594-7e633e0e0586" (UID: "3d64a6d2-5f83-480e-a594-7e633e0e0586"). InnerVolumeSpecName "kube-api-access-bvtbq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:23 crc kubenswrapper[4973]: I0320 13:44:23.522719 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99298e82-7167-49cb-ae27-e107a53c57d8-kube-api-access-cx627" (OuterVolumeSpecName: "kube-api-access-cx627") pod "99298e82-7167-49cb-ae27-e107a53c57d8" (UID: "99298e82-7167-49cb-ae27-e107a53c57d8"). InnerVolumeSpecName "kube-api-access-cx627". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:23 crc kubenswrapper[4973]: I0320 13:44:23.602525 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cx627\" (UniqueName: \"kubernetes.io/projected/99298e82-7167-49cb-ae27-e107a53c57d8-kube-api-access-cx627\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:23 crc kubenswrapper[4973]: I0320 13:44:23.602556 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d64a6d2-5f83-480e-a594-7e633e0e0586-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:23 crc kubenswrapper[4973]: I0320 13:44:23.602565 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvtbq\" (UniqueName: \"kubernetes.io/projected/3d64a6d2-5f83-480e-a594-7e633e0e0586-kube-api-access-bvtbq\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:23 crc kubenswrapper[4973]: I0320 13:44:23.602573 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/99298e82-7167-49cb-ae27-e107a53c57d8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:23 crc kubenswrapper[4973]: I0320 13:44:23.642548 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-x2nll" podUID="f2c7b535-ad26-4bf4-848b-26890c0eb580" containerName="ovn-controller" probeResult="failure" output=< Mar 20 13:44:23 crc kubenswrapper[4973]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 13:44:23 crc kubenswrapper[4973]: > Mar 20 13:44:23 crc kubenswrapper[4973]: I0320 13:44:23.806568 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:44:23 crc kubenswrapper[4973]: 
I0320 13:44:23.812901 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/8184c0e7-f9ef-48a3-9461-5cc6c1188e6b-etc-swift\") pod \"swift-storage-0\" (UID: \"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b\") " pod="openstack/swift-storage-0" Mar 20 13:44:23 crc kubenswrapper[4973]: I0320 13:44:23.891597 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf"] Mar 20 13:44:23 crc kubenswrapper[4973]: W0320 13:44:23.998015 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4ce1c5d_178b_4adc_b4f5_7c354e2914d0.slice/crio-f6587c7ff4bb7d48df118f60025707a8bb1e3c7fd65c3559e10a5d220e318bc8 WatchSource:0}: Error finding container f6587c7ff4bb7d48df118f60025707a8bb1e3c7fd65c3559e10a5d220e318bc8: Status 404 returned error can't find the container with id f6587c7ff4bb7d48df118f60025707a8bb1e3c7fd65c3559e10a5d220e318bc8 Mar 20 13:44:23 crc kubenswrapper[4973]: I0320 13:44:23.998386 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-19e0-account-create-update-rkd5n"] Mar 20 13:44:24 crc kubenswrapper[4973]: I0320 13:44:24.004319 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 20 13:44:24 crc kubenswrapper[4973]: I0320 13:44:24.294420 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"642c51b4-2774-4114-808c-1fb722862437","Type":"ContainerStarted","Data":"750c5df52618bc5d32c078cdb27b56850d9e8d6bf24b361607e9d8944cae08c8"} Mar 20 13:44:24 crc kubenswrapper[4973]: I0320 13:44:24.301659 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-19e0-account-create-update-rkd5n" event={"ID":"d4ce1c5d-178b-4adc-b4f5-7c354e2914d0","Type":"ContainerStarted","Data":"32ff66beb5df2f942bd61fe0aae352bad22a5e8ca6e4333bccc86455dcb1e33e"} Mar 20 13:44:24 crc kubenswrapper[4973]: I0320 13:44:24.301726 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-19e0-account-create-update-rkd5n" event={"ID":"d4ce1c5d-178b-4adc-b4f5-7c354e2914d0","Type":"ContainerStarted","Data":"f6587c7ff4bb7d48df118f60025707a8bb1e3c7fd65c3559e10a5d220e318bc8"} Mar 20 13:44:24 crc kubenswrapper[4973]: I0320 13:44:24.303861 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf" event={"ID":"88d9ea96-b45d-44d9-9aa0-60d2668a9a0c","Type":"ContainerStarted","Data":"121d3840114c7ee194ce3cdda0b1be9791fbf779b319e5e7474aa6bab50bdb3a"} Mar 20 13:44:24 crc kubenswrapper[4973]: I0320 13:44:24.303943 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf" event={"ID":"88d9ea96-b45d-44d9-9aa0-60d2668a9a0c","Type":"ContainerStarted","Data":"28811e0bd9f92712407df9838810deacd8083585b5f3591ed073a0c5cd2cdaff"} Mar 20 13:44:24 crc kubenswrapper[4973]: I0320 13:44:24.307140 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-6399-account-create-update-tkjxp" 
event={"ID":"3d64a6d2-5f83-480e-a594-7e633e0e0586","Type":"ContainerDied","Data":"87c9e93a8fbc389f77315a9dedca3802584bbb67f43ed932587dc04cce09a8fc"} Mar 20 13:44:24 crc kubenswrapper[4973]: I0320 13:44:24.307172 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87c9e93a8fbc389f77315a9dedca3802584bbb67f43ed932587dc04cce09a8fc" Mar 20 13:44:24 crc kubenswrapper[4973]: I0320 13:44:24.307226 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-6399-account-create-update-tkjxp" Mar 20 13:44:24 crc kubenswrapper[4973]: I0320 13:44:24.309279 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-v59x5" event={"ID":"99298e82-7167-49cb-ae27-e107a53c57d8","Type":"ContainerDied","Data":"820774501eaca5fa0665471391e800f809219fa8327ec6100c3718a4b4133312"} Mar 20 13:44:24 crc kubenswrapper[4973]: I0320 13:44:24.309308 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="820774501eaca5fa0665471391e800f809219fa8327ec6100c3718a4b4133312" Mar 20 13:44:24 crc kubenswrapper[4973]: I0320 13:44:24.309370 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-v59x5" Mar 20 13:44:24 crc kubenswrapper[4973]: I0320 13:44:24.348760 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=19.004604227 podStartE2EDuration="1m4.348741408s" podCreationTimestamp="2026-03-20 13:43:20 +0000 UTC" firstStartedPulling="2026-03-20 13:43:38.303668545 +0000 UTC m=+1339.047338289" lastFinishedPulling="2026-03-20 13:44:23.647805726 +0000 UTC m=+1384.391475470" observedRunningTime="2026-03-20 13:44:24.340813692 +0000 UTC m=+1385.084483446" watchObservedRunningTime="2026-03-20 13:44:24.348741408 +0000 UTC m=+1385.092411152" Mar 20 13:44:24 crc kubenswrapper[4973]: I0320 13:44:24.377191 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf" podStartSLOduration=3.377155274 podStartE2EDuration="3.377155274s" podCreationTimestamp="2026-03-20 13:44:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:24.364840268 +0000 UTC m=+1385.108510012" watchObservedRunningTime="2026-03-20 13:44:24.377155274 +0000 UTC m=+1385.120825028" Mar 20 13:44:24 crc kubenswrapper[4973]: I0320 13:44:24.401565 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-19e0-account-create-update-rkd5n" podStartSLOduration=3.401542739 podStartE2EDuration="3.401542739s" podCreationTimestamp="2026-03-20 13:44:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:24.380986398 +0000 UTC m=+1385.124656142" watchObservedRunningTime="2026-03-20 13:44:24.401542739 +0000 UTC m=+1385.145212493" Mar 20 13:44:24 crc kubenswrapper[4973]: I0320 13:44:24.744508 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/swift-storage-0"] Mar 20 13:44:24 crc kubenswrapper[4973]: W0320 13:44:24.748506 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8184c0e7_f9ef_48a3_9461_5cc6c1188e6b.slice/crio-8a74075cea647fe40636a912d763025dba52ba58cf6b17827553d2ea9e440c33 WatchSource:0}: Error finding container 8a74075cea647fe40636a912d763025dba52ba58cf6b17827553d2ea9e440c33: Status 404 returned error can't find the container with id 8a74075cea647fe40636a912d763025dba52ba58cf6b17827553d2ea9e440c33 Mar 20 13:44:24 crc kubenswrapper[4973]: E0320 13:44:24.834188 4973 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4ce1c5d_178b_4adc_b4f5_7c354e2914d0.slice/crio-32ff66beb5df2f942bd61fe0aae352bad22a5e8ca6e4333bccc86455dcb1e33e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd4ce1c5d_178b_4adc_b4f5_7c354e2914d0.slice/crio-conmon-32ff66beb5df2f942bd61fe0aae352bad22a5e8ca6e4333bccc86455dcb1e33e.scope\": RecentStats: unable to find data in memory cache]" Mar 20 13:44:25 crc kubenswrapper[4973]: I0320 13:44:25.251475 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-w99bc"] Mar 20 13:44:25 crc kubenswrapper[4973]: E0320 13:44:25.252488 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d64a6d2-5f83-480e-a594-7e633e0e0586" containerName="mariadb-account-create-update" Mar 20 13:44:25 crc kubenswrapper[4973]: I0320 13:44:25.252544 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d64a6d2-5f83-480e-a594-7e633e0e0586" containerName="mariadb-account-create-update" Mar 20 13:44:25 crc kubenswrapper[4973]: E0320 13:44:25.252637 4973 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="99298e82-7167-49cb-ae27-e107a53c57d8" containerName="mariadb-database-create" Mar 20 13:44:25 crc kubenswrapper[4973]: I0320 13:44:25.252651 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="99298e82-7167-49cb-ae27-e107a53c57d8" containerName="mariadb-database-create" Mar 20 13:44:25 crc kubenswrapper[4973]: I0320 13:44:25.253033 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="99298e82-7167-49cb-ae27-e107a53c57d8" containerName="mariadb-database-create" Mar 20 13:44:25 crc kubenswrapper[4973]: I0320 13:44:25.253058 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d64a6d2-5f83-480e-a594-7e633e0e0586" containerName="mariadb-account-create-update" Mar 20 13:44:25 crc kubenswrapper[4973]: I0320 13:44:25.253848 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-w99bc" Mar 20 13:44:25 crc kubenswrapper[4973]: I0320 13:44:25.259892 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 13:44:25 crc kubenswrapper[4973]: I0320 13:44:25.263003 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-w99bc"] Mar 20 13:44:25 crc kubenswrapper[4973]: I0320 13:44:25.320467 4973 generic.go:334] "Generic (PLEG): container finished" podID="88d9ea96-b45d-44d9-9aa0-60d2668a9a0c" containerID="121d3840114c7ee194ce3cdda0b1be9791fbf779b319e5e7474aa6bab50bdb3a" exitCode=0 Mar 20 13:44:25 crc kubenswrapper[4973]: I0320 13:44:25.320794 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf" event={"ID":"88d9ea96-b45d-44d9-9aa0-60d2668a9a0c","Type":"ContainerDied","Data":"121d3840114c7ee194ce3cdda0b1be9791fbf779b319e5e7474aa6bab50bdb3a"} Mar 20 13:44:25 crc kubenswrapper[4973]: I0320 13:44:25.323050 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b","Type":"ContainerStarted","Data":"8a74075cea647fe40636a912d763025dba52ba58cf6b17827553d2ea9e440c33"} Mar 20 13:44:25 crc kubenswrapper[4973]: I0320 13:44:25.326506 4973 generic.go:334] "Generic (PLEG): container finished" podID="d4ce1c5d-178b-4adc-b4f5-7c354e2914d0" containerID="32ff66beb5df2f942bd61fe0aae352bad22a5e8ca6e4333bccc86455dcb1e33e" exitCode=0 Mar 20 13:44:25 crc kubenswrapper[4973]: I0320 13:44:25.328384 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-19e0-account-create-update-rkd5n" event={"ID":"d4ce1c5d-178b-4adc-b4f5-7c354e2914d0","Type":"ContainerDied","Data":"32ff66beb5df2f942bd61fe0aae352bad22a5e8ca6e4333bccc86455dcb1e33e"} Mar 20 13:44:25 crc kubenswrapper[4973]: I0320 13:44:25.404540 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="20780ec2-d338-45a4-9259-16a651e46e55" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.132:5671: connect: connection refused" Mar 20 13:44:25 crc kubenswrapper[4973]: I0320 13:44:25.442352 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc78n\" (UniqueName: \"kubernetes.io/projected/395a3c82-8435-4a03-bbb4-4289d1cf6b15-kube-api-access-pc78n\") pod \"root-account-create-update-w99bc\" (UID: \"395a3c82-8435-4a03-bbb4-4289d1cf6b15\") " pod="openstack/root-account-create-update-w99bc" Mar 20 13:44:25 crc kubenswrapper[4973]: I0320 13:44:25.442424 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/395a3c82-8435-4a03-bbb4-4289d1cf6b15-operator-scripts\") pod \"root-account-create-update-w99bc\" (UID: \"395a3c82-8435-4a03-bbb4-4289d1cf6b15\") " pod="openstack/root-account-create-update-w99bc" Mar 20 13:44:25 crc kubenswrapper[4973]: I0320 13:44:25.455537 4973 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="96de22e2-f61c-4f75-8faa-9a0591aa0f38" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.133:5671: connect: connection refused" Mar 20 13:44:25 crc kubenswrapper[4973]: I0320 13:44:25.476578 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="797b38f5-d9a7-4f82-bd12-e40e021ef28e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused" Mar 20 13:44:26 crc kubenswrapper[4973]: I0320 13:44:26.045758 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc78n\" (UniqueName: \"kubernetes.io/projected/395a3c82-8435-4a03-bbb4-4289d1cf6b15-kube-api-access-pc78n\") pod \"root-account-create-update-w99bc\" (UID: \"395a3c82-8435-4a03-bbb4-4289d1cf6b15\") " pod="openstack/root-account-create-update-w99bc" Mar 20 13:44:26 crc kubenswrapper[4973]: I0320 13:44:26.045835 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/395a3c82-8435-4a03-bbb4-4289d1cf6b15-operator-scripts\") pod \"root-account-create-update-w99bc\" (UID: \"395a3c82-8435-4a03-bbb4-4289d1cf6b15\") " pod="openstack/root-account-create-update-w99bc" Mar 20 13:44:26 crc kubenswrapper[4973]: I0320 13:44:26.046801 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/395a3c82-8435-4a03-bbb4-4289d1cf6b15-operator-scripts\") pod \"root-account-create-update-w99bc\" (UID: \"395a3c82-8435-4a03-bbb4-4289d1cf6b15\") " pod="openstack/root-account-create-update-w99bc" Mar 20 13:44:26 crc kubenswrapper[4973]: I0320 13:44:26.049227 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="4ed60638-5022-406b-b568-7fa0d6bf4ba8" containerName="rabbitmq" probeResult="failure" output="dial 
tcp 10.217.0.135:5671: connect: connection refused" Mar 20 13:44:26 crc kubenswrapper[4973]: I0320 13:44:26.110259 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc78n\" (UniqueName: \"kubernetes.io/projected/395a3c82-8435-4a03-bbb4-4289d1cf6b15-kube-api-access-pc78n\") pod \"root-account-create-update-w99bc\" (UID: \"395a3c82-8435-4a03-bbb4-4289d1cf6b15\") " pod="openstack/root-account-create-update-w99bc" Mar 20 13:44:26 crc kubenswrapper[4973]: I0320 13:44:26.204888 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-w99bc" Mar 20 13:44:26 crc kubenswrapper[4973]: I0320 13:44:26.381504 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.087141 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-w99bc"] Mar 20 13:44:27 crc kubenswrapper[4973]: W0320 13:44:27.096806 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod395a3c82_8435_4a03_bbb4_4289d1cf6b15.slice/crio-12b67527184aae5df1df5c153dd06695dbfd223ee2e68e8667988fd5300f222c WatchSource:0}: Error finding container 12b67527184aae5df1df5c153dd06695dbfd223ee2e68e8667988fd5300f222c: Status 404 returned error can't find the container with id 12b67527184aae5df1df5c153dd06695dbfd223ee2e68e8667988fd5300f222c Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.217656 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.370136 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf" 
event={"ID":"88d9ea96-b45d-44d9-9aa0-60d2668a9a0c","Type":"ContainerDied","Data":"28811e0bd9f92712407df9838810deacd8083585b5f3591ed073a0c5cd2cdaff"} Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.370815 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28811e0bd9f92712407df9838810deacd8083585b5f3591ed073a0c5cd2cdaff" Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.371812 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-w99bc" event={"ID":"395a3c82-8435-4a03-bbb4-4289d1cf6b15","Type":"ContainerStarted","Data":"12b67527184aae5df1df5c153dd06695dbfd223ee2e68e8667988fd5300f222c"} Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.373281 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-19e0-account-create-update-rkd5n" event={"ID":"d4ce1c5d-178b-4adc-b4f5-7c354e2914d0","Type":"ContainerDied","Data":"f6587c7ff4bb7d48df118f60025707a8bb1e3c7fd65c3559e10a5d220e318bc8"} Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.373329 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6587c7ff4bb7d48df118f60025707a8bb1e3c7fd65c3559e10a5d220e318bc8" Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.388515 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf" Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.468122 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-19e0-account-create-update-rkd5n" Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.581525 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88d9ea96-b45d-44d9-9aa0-60d2668a9a0c-operator-scripts\") pod \"88d9ea96-b45d-44d9-9aa0-60d2668a9a0c\" (UID: \"88d9ea96-b45d-44d9-9aa0-60d2668a9a0c\") " Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.581607 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62dgq\" (UniqueName: \"kubernetes.io/projected/88d9ea96-b45d-44d9-9aa0-60d2668a9a0c-kube-api-access-62dgq\") pod \"88d9ea96-b45d-44d9-9aa0-60d2668a9a0c\" (UID: \"88d9ea96-b45d-44d9-9aa0-60d2668a9a0c\") " Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.583150 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4ce1c5d-178b-4adc-b4f5-7c354e2914d0-operator-scripts\") pod \"d4ce1c5d-178b-4adc-b4f5-7c354e2914d0\" (UID: \"d4ce1c5d-178b-4adc-b4f5-7c354e2914d0\") " Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.583397 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f82tb\" (UniqueName: \"kubernetes.io/projected/d4ce1c5d-178b-4adc-b4f5-7c354e2914d0-kube-api-access-f82tb\") pod \"d4ce1c5d-178b-4adc-b4f5-7c354e2914d0\" (UID: \"d4ce1c5d-178b-4adc-b4f5-7c354e2914d0\") " Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.585825 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/88d9ea96-b45d-44d9-9aa0-60d2668a9a0c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "88d9ea96-b45d-44d9-9aa0-60d2668a9a0c" (UID: "88d9ea96-b45d-44d9-9aa0-60d2668a9a0c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.586074 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4ce1c5d-178b-4adc-b4f5-7c354e2914d0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d4ce1c5d-178b-4adc-b4f5-7c354e2914d0" (UID: "d4ce1c5d-178b-4adc-b4f5-7c354e2914d0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.589138 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4ce1c5d-178b-4adc-b4f5-7c354e2914d0-kube-api-access-f82tb" (OuterVolumeSpecName: "kube-api-access-f82tb") pod "d4ce1c5d-178b-4adc-b4f5-7c354e2914d0" (UID: "d4ce1c5d-178b-4adc-b4f5-7c354e2914d0"). InnerVolumeSpecName "kube-api-access-f82tb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.590392 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/88d9ea96-b45d-44d9-9aa0-60d2668a9a0c-kube-api-access-62dgq" (OuterVolumeSpecName: "kube-api-access-62dgq") pod "88d9ea96-b45d-44d9-9aa0-60d2668a9a0c" (UID: "88d9ea96-b45d-44d9-9aa0-60d2668a9a0c"). InnerVolumeSpecName "kube-api-access-62dgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.687153 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62dgq\" (UniqueName: \"kubernetes.io/projected/88d9ea96-b45d-44d9-9aa0-60d2668a9a0c-kube-api-access-62dgq\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.687198 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d4ce1c5d-178b-4adc-b4f5-7c354e2914d0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.687209 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f82tb\" (UniqueName: \"kubernetes.io/projected/d4ce1c5d-178b-4adc-b4f5-7c354e2914d0-kube-api-access-f82tb\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:27 crc kubenswrapper[4973]: I0320 13:44:27.687217 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/88d9ea96-b45d-44d9-9aa0-60d2668a9a0c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.386616 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b","Type":"ContainerStarted","Data":"f7af05ae1410a716631aaaf3303b86a84f7bd156172b20ecd1a2389cdacfc65b"} Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.387004 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b","Type":"ContainerStarted","Data":"0091e416aaa52455720fba6197e7be84c8a8deedb356c75b9b06effa38aa5436"} Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.387024 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b","Type":"ContainerStarted","Data":"1b52f0f3fe1339cf7c9bc275f87ca841988d46b506d7089e79ab9027dbf80b6a"} Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.387038 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b","Type":"ContainerStarted","Data":"2ec2cfb9f3bee8acdaf3e6b146c0667946a67b8c044497d36d3cd98d815c7141"} Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.388464 4973 generic.go:334] "Generic (PLEG): container finished" podID="395a3c82-8435-4a03-bbb4-4289d1cf6b15" containerID="f78778e5adb4b7c6e6b3e7e0d2b46e5e0e4199d4a34428851d16da0c275857b1" exitCode=0 Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.388584 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.388602 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-19e0-account-create-update-rkd5n" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.389030 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-w99bc" event={"ID":"395a3c82-8435-4a03-bbb4-4289d1cf6b15","Type":"ContainerDied","Data":"f78778e5adb4b7c6e6b3e7e0d2b46e5e0e4199d4a34428851d16da0c275857b1"} Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.559265 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-mfb4w"] Mar 20 13:44:28 crc kubenswrapper[4973]: E0320 13:44:28.559730 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="88d9ea96-b45d-44d9-9aa0-60d2668a9a0c" containerName="mariadb-database-create" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.559751 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="88d9ea96-b45d-44d9-9aa0-60d2668a9a0c" containerName="mariadb-database-create" Mar 20 13:44:28 crc kubenswrapper[4973]: E0320 13:44:28.559788 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4ce1c5d-178b-4adc-b4f5-7c354e2914d0" containerName="mariadb-account-create-update" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.559797 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4ce1c5d-178b-4adc-b4f5-7c354e2914d0" containerName="mariadb-account-create-update" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.559977 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="88d9ea96-b45d-44d9-9aa0-60d2668a9a0c" containerName="mariadb-database-create" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.559988 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4ce1c5d-178b-4adc-b4f5-7c354e2914d0" containerName="mariadb-account-create-update" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.560757 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-mfb4w" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.565689 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.565752 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tjqdw" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.575557 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-mfb4w"] Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.616760 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716148fc-095e-4811-8e79-53bf3e2cd53b-config-data\") pod \"glance-db-sync-mfb4w\" (UID: \"716148fc-095e-4811-8e79-53bf3e2cd53b\") " pod="openstack/glance-db-sync-mfb4w" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.616839 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grvhn\" (UniqueName: \"kubernetes.io/projected/716148fc-095e-4811-8e79-53bf3e2cd53b-kube-api-access-grvhn\") pod \"glance-db-sync-mfb4w\" (UID: \"716148fc-095e-4811-8e79-53bf3e2cd53b\") " pod="openstack/glance-db-sync-mfb4w" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.616896 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/716148fc-095e-4811-8e79-53bf3e2cd53b-db-sync-config-data\") pod \"glance-db-sync-mfb4w\" (UID: \"716148fc-095e-4811-8e79-53bf3e2cd53b\") " pod="openstack/glance-db-sync-mfb4w" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.616932 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/716148fc-095e-4811-8e79-53bf3e2cd53b-combined-ca-bundle\") pod \"glance-db-sync-mfb4w\" (UID: \"716148fc-095e-4811-8e79-53bf3e2cd53b\") " pod="openstack/glance-db-sync-mfb4w" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.638896 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-x2nll" podUID="f2c7b535-ad26-4bf4-848b-26890c0eb580" containerName="ovn-controller" probeResult="failure" output=< Mar 20 13:44:28 crc kubenswrapper[4973]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 20 13:44:28 crc kubenswrapper[4973]: > Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.696049 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.704639 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-gvrbc" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.719370 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716148fc-095e-4811-8e79-53bf3e2cd53b-config-data\") pod \"glance-db-sync-mfb4w\" (UID: \"716148fc-095e-4811-8e79-53bf3e2cd53b\") " pod="openstack/glance-db-sync-mfb4w" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.719438 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grvhn\" (UniqueName: \"kubernetes.io/projected/716148fc-095e-4811-8e79-53bf3e2cd53b-kube-api-access-grvhn\") pod \"glance-db-sync-mfb4w\" (UID: \"716148fc-095e-4811-8e79-53bf3e2cd53b\") " pod="openstack/glance-db-sync-mfb4w" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.719499 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/716148fc-095e-4811-8e79-53bf3e2cd53b-db-sync-config-data\") pod \"glance-db-sync-mfb4w\" (UID: \"716148fc-095e-4811-8e79-53bf3e2cd53b\") " pod="openstack/glance-db-sync-mfb4w" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.719938 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716148fc-095e-4811-8e79-53bf3e2cd53b-combined-ca-bundle\") pod \"glance-db-sync-mfb4w\" (UID: \"716148fc-095e-4811-8e79-53bf3e2cd53b\") " pod="openstack/glance-db-sync-mfb4w" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.726474 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716148fc-095e-4811-8e79-53bf3e2cd53b-config-data\") pod \"glance-db-sync-mfb4w\" (UID: \"716148fc-095e-4811-8e79-53bf3e2cd53b\") " pod="openstack/glance-db-sync-mfb4w" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.727028 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716148fc-095e-4811-8e79-53bf3e2cd53b-combined-ca-bundle\") pod \"glance-db-sync-mfb4w\" (UID: \"716148fc-095e-4811-8e79-53bf3e2cd53b\") " pod="openstack/glance-db-sync-mfb4w" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.728841 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/716148fc-095e-4811-8e79-53bf3e2cd53b-db-sync-config-data\") pod \"glance-db-sync-mfb4w\" (UID: \"716148fc-095e-4811-8e79-53bf3e2cd53b\") " pod="openstack/glance-db-sync-mfb4w" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.747766 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grvhn\" (UniqueName: \"kubernetes.io/projected/716148fc-095e-4811-8e79-53bf3e2cd53b-kube-api-access-grvhn\") pod \"glance-db-sync-mfb4w\" (UID: 
\"716148fc-095e-4811-8e79-53bf3e2cd53b\") " pod="openstack/glance-db-sync-mfb4w" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.878961 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mfb4w" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.962193 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-x2nll-config-kclgp"] Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.964323 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.966745 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 20 13:44:28 crc kubenswrapper[4973]: I0320 13:44:28.973328 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x2nll-config-kclgp"] Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.127945 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr996\" (UniqueName: \"kubernetes.io/projected/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-kube-api-access-tr996\") pod \"ovn-controller-x2nll-config-kclgp\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.128045 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-additional-scripts\") pod \"ovn-controller-x2nll-config-kclgp\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.128093 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" 
(UniqueName: \"kubernetes.io/host-path/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-var-run\") pod \"ovn-controller-x2nll-config-kclgp\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.128210 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-var-run-ovn\") pod \"ovn-controller-x2nll-config-kclgp\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.128290 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-var-log-ovn\") pod \"ovn-controller-x2nll-config-kclgp\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.128313 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-scripts\") pod \"ovn-controller-x2nll-config-kclgp\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.229788 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr996\" (UniqueName: \"kubernetes.io/projected/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-kube-api-access-tr996\") pod \"ovn-controller-x2nll-config-kclgp\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.230247 4973 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-additional-scripts\") pod \"ovn-controller-x2nll-config-kclgp\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.231282 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-additional-scripts\") pod \"ovn-controller-x2nll-config-kclgp\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.231393 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-var-run\") pod \"ovn-controller-x2nll-config-kclgp\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.231481 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-var-run-ovn\") pod \"ovn-controller-x2nll-config-kclgp\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.231575 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-var-log-ovn\") pod \"ovn-controller-x2nll-config-kclgp\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.231608 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-scripts\") pod \"ovn-controller-x2nll-config-kclgp\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.231945 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-var-run\") pod \"ovn-controller-x2nll-config-kclgp\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.232033 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-var-run-ovn\") pod \"ovn-controller-x2nll-config-kclgp\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.232081 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-var-log-ovn\") pod \"ovn-controller-x2nll-config-kclgp\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.233651 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-scripts\") pod \"ovn-controller-x2nll-config-kclgp\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.272760 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr996\" (UniqueName: 
\"kubernetes.io/projected/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-kube-api-access-tr996\") pod \"ovn-controller-x2nll-config-kclgp\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.297077 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:29 crc kubenswrapper[4973]: I0320 13:44:29.712850 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-mfb4w"] Mar 20 13:44:30 crc kubenswrapper[4973]: W0320 13:44:30.005477 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode7857fa0_a2dd_42e9_9626_63a4eb1fad3e.slice/crio-dbb32919b1304d85c2e31156216fc037082ac6370ed26e4d7ce6a3edddf69d1d WatchSource:0}: Error finding container dbb32919b1304d85c2e31156216fc037082ac6370ed26e4d7ce6a3edddf69d1d: Status 404 returned error can't find the container with id dbb32919b1304d85c2e31156216fc037082ac6370ed26e4d7ce6a3edddf69d1d Mar 20 13:44:30 crc kubenswrapper[4973]: I0320 13:44:30.032611 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-x2nll-config-kclgp"] Mar 20 13:44:30 crc kubenswrapper[4973]: I0320 13:44:30.037880 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-w99bc" Mar 20 13:44:30 crc kubenswrapper[4973]: I0320 13:44:30.057315 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/395a3c82-8435-4a03-bbb4-4289d1cf6b15-operator-scripts\") pod \"395a3c82-8435-4a03-bbb4-4289d1cf6b15\" (UID: \"395a3c82-8435-4a03-bbb4-4289d1cf6b15\") " Mar 20 13:44:30 crc kubenswrapper[4973]: I0320 13:44:30.057496 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc78n\" (UniqueName: \"kubernetes.io/projected/395a3c82-8435-4a03-bbb4-4289d1cf6b15-kube-api-access-pc78n\") pod \"395a3c82-8435-4a03-bbb4-4289d1cf6b15\" (UID: \"395a3c82-8435-4a03-bbb4-4289d1cf6b15\") " Mar 20 13:44:30 crc kubenswrapper[4973]: I0320 13:44:30.060227 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/395a3c82-8435-4a03-bbb4-4289d1cf6b15-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "395a3c82-8435-4a03-bbb4-4289d1cf6b15" (UID: "395a3c82-8435-4a03-bbb4-4289d1cf6b15"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:30 crc kubenswrapper[4973]: I0320 13:44:30.089838 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/395a3c82-8435-4a03-bbb4-4289d1cf6b15-kube-api-access-pc78n" (OuterVolumeSpecName: "kube-api-access-pc78n") pod "395a3c82-8435-4a03-bbb4-4289d1cf6b15" (UID: "395a3c82-8435-4a03-bbb4-4289d1cf6b15"). InnerVolumeSpecName "kube-api-access-pc78n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:30 crc kubenswrapper[4973]: I0320 13:44:30.160426 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc78n\" (UniqueName: \"kubernetes.io/projected/395a3c82-8435-4a03-bbb4-4289d1cf6b15-kube-api-access-pc78n\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:30 crc kubenswrapper[4973]: I0320 13:44:30.160461 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/395a3c82-8435-4a03-bbb4-4289d1cf6b15-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:30 crc kubenswrapper[4973]: I0320 13:44:30.420401 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-w99bc" event={"ID":"395a3c82-8435-4a03-bbb4-4289d1cf6b15","Type":"ContainerDied","Data":"12b67527184aae5df1df5c153dd06695dbfd223ee2e68e8667988fd5300f222c"} Mar 20 13:44:30 crc kubenswrapper[4973]: I0320 13:44:30.420511 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12b67527184aae5df1df5c153dd06695dbfd223ee2e68e8667988fd5300f222c" Mar 20 13:44:30 crc kubenswrapper[4973]: I0320 13:44:30.420575 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-w99bc" Mar 20 13:44:30 crc kubenswrapper[4973]: I0320 13:44:30.423481 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x2nll-config-kclgp" event={"ID":"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e","Type":"ContainerStarted","Data":"9757315f6b18fc3dd71beee72648b81288d616265cb9aba22d703be83307bf4b"} Mar 20 13:44:30 crc kubenswrapper[4973]: I0320 13:44:30.423535 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x2nll-config-kclgp" event={"ID":"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e","Type":"ContainerStarted","Data":"dbb32919b1304d85c2e31156216fc037082ac6370ed26e4d7ce6a3edddf69d1d"} Mar 20 13:44:30 crc kubenswrapper[4973]: I0320 13:44:30.426471 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mfb4w" event={"ID":"716148fc-095e-4811-8e79-53bf3e2cd53b","Type":"ContainerStarted","Data":"68d4095d38164bc2c6126c49b9dd033002967f8fab653f013e394f2d1a239f1c"} Mar 20 13:44:30 crc kubenswrapper[4973]: I0320 13:44:30.445613 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-x2nll-config-kclgp" podStartSLOduration=2.44559498 podStartE2EDuration="2.44559498s" podCreationTimestamp="2026-03-20 13:44:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:30.44088074 +0000 UTC m=+1391.184550494" watchObservedRunningTime="2026-03-20 13:44:30.44559498 +0000 UTC m=+1391.189264724" Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.440129 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b","Type":"ContainerStarted","Data":"291a4db9dd352263bffe630cd19ee5083d3861863746d393e39f0f726ed24100"} Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.440749 4973 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/swift-storage-0" event={"ID":"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b","Type":"ContainerStarted","Data":"d471814fe4c6976cb3500261942950832618fdd99ed367b3f780c69b7add2107"} Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.442586 4973 generic.go:334] "Generic (PLEG): container finished" podID="e7857fa0-a2dd-42e9-9626-63a4eb1fad3e" containerID="9757315f6b18fc3dd71beee72648b81288d616265cb9aba22d703be83307bf4b" exitCode=0 Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.442633 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-x2nll-config-kclgp" event={"ID":"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e","Type":"ContainerDied","Data":"9757315f6b18fc3dd71beee72648b81288d616265cb9aba22d703be83307bf4b"} Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.606020 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"] Mar 20 13:44:31 crc kubenswrapper[4973]: E0320 13:44:31.607071 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="395a3c82-8435-4a03-bbb4-4289d1cf6b15" containerName="mariadb-account-create-update" Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.607089 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="395a3c82-8435-4a03-bbb4-4289d1cf6b15" containerName="mariadb-account-create-update" Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.607458 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="395a3c82-8435-4a03-bbb4-4289d1cf6b15" containerName="mariadb-account-create-update" Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.608436 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.612776 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data" Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.621389 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.670464 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-w99bc"] Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.695579 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-w99bc"] Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.708097 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312b29b4-b820-4dcc-bed0-62c42937d544-config-data\") pod \"mysqld-exporter-0\" (UID: \"312b29b4-b820-4dcc-bed0-62c42937d544\") " pod="openstack/mysqld-exporter-0" Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.708181 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p729\" (UniqueName: \"kubernetes.io/projected/312b29b4-b820-4dcc-bed0-62c42937d544-kube-api-access-6p729\") pod \"mysqld-exporter-0\" (UID: \"312b29b4-b820-4dcc-bed0-62c42937d544\") " pod="openstack/mysqld-exporter-0" Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.708219 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312b29b4-b820-4dcc-bed0-62c42937d544-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"312b29b4-b820-4dcc-bed0-62c42937d544\") " pod="openstack/mysqld-exporter-0" Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.810079 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312b29b4-b820-4dcc-bed0-62c42937d544-config-data\") pod \"mysqld-exporter-0\" (UID: \"312b29b4-b820-4dcc-bed0-62c42937d544\") " pod="openstack/mysqld-exporter-0" Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.810163 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p729\" (UniqueName: \"kubernetes.io/projected/312b29b4-b820-4dcc-bed0-62c42937d544-kube-api-access-6p729\") pod \"mysqld-exporter-0\" (UID: \"312b29b4-b820-4dcc-bed0-62c42937d544\") " pod="openstack/mysqld-exporter-0" Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.810204 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312b29b4-b820-4dcc-bed0-62c42937d544-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"312b29b4-b820-4dcc-bed0-62c42937d544\") " pod="openstack/mysqld-exporter-0" Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.819698 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312b29b4-b820-4dcc-bed0-62c42937d544-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"312b29b4-b820-4dcc-bed0-62c42937d544\") " pod="openstack/mysqld-exporter-0" Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.820141 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312b29b4-b820-4dcc-bed0-62c42937d544-config-data\") pod \"mysqld-exporter-0\" (UID: \"312b29b4-b820-4dcc-bed0-62c42937d544\") " pod="openstack/mysqld-exporter-0" Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.828171 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p729\" (UniqueName: \"kubernetes.io/projected/312b29b4-b820-4dcc-bed0-62c42937d544-kube-api-access-6p729\") pod 
\"mysqld-exporter-0\" (UID: \"312b29b4-b820-4dcc-bed0-62c42937d544\") " pod="openstack/mysqld-exporter-0" Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.941478 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 20 13:44:31 crc kubenswrapper[4973]: I0320 13:44:31.976450 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="395a3c82-8435-4a03-bbb4-4289d1cf6b15" path="/var/lib/kubelet/pods/395a3c82-8435-4a03-bbb4-4289d1cf6b15/volumes" Mar 20 13:44:32 crc kubenswrapper[4973]: I0320 13:44:32.473775 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b","Type":"ContainerStarted","Data":"63fefb63aaaac56b2617a7a625e9b7e4a632d223035d06f53072ce42622d64ca"} Mar 20 13:44:32 crc kubenswrapper[4973]: I0320 13:44:32.474143 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b","Type":"ContainerStarted","Data":"90f1a3571fb40a6fb4ba438c6d9e86d7d9e9ea1c2ff0d4f2ac04ad882f2612e8"} Mar 20 13:44:32 crc kubenswrapper[4973]: I0320 13:44:32.511606 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 20 13:44:32 crc kubenswrapper[4973]: I0320 13:44:32.964365 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.038855 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-var-log-ovn\") pod \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.038963 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tr996\" (UniqueName: \"kubernetes.io/projected/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-kube-api-access-tr996\") pod \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.039104 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-var-run-ovn\") pod \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.039265 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-additional-scripts\") pod \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.039284 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-var-run\") pod \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.039317 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-scripts\") pod \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\" (UID: \"e7857fa0-a2dd-42e9-9626-63a4eb1fad3e\") " Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.039497 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "e7857fa0-a2dd-42e9-9626-63a4eb1fad3e" (UID: "e7857fa0-a2dd-42e9-9626-63a4eb1fad3e"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.039556 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "e7857fa0-a2dd-42e9-9626-63a4eb1fad3e" (UID: "e7857fa0-a2dd-42e9-9626-63a4eb1fad3e"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.041071 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "e7857fa0-a2dd-42e9-9626-63a4eb1fad3e" (UID: "e7857fa0-a2dd-42e9-9626-63a4eb1fad3e"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.041149 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-var-run" (OuterVolumeSpecName: "var-run") pod "e7857fa0-a2dd-42e9-9626-63a4eb1fad3e" (UID: "e7857fa0-a2dd-42e9-9626-63a4eb1fad3e"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.041860 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-scripts" (OuterVolumeSpecName: "scripts") pod "e7857fa0-a2dd-42e9-9626-63a4eb1fad3e" (UID: "e7857fa0-a2dd-42e9-9626-63a4eb1fad3e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.042383 4973 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.042975 4973 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-additional-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.043239 4973 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-var-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.043494 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.043569 4973 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.045452 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-kube-api-access-tr996" (OuterVolumeSpecName: "kube-api-access-tr996") pod "e7857fa0-a2dd-42e9-9626-63a4eb1fad3e" (UID: "e7857fa0-a2dd-42e9-9626-63a4eb1fad3e"). InnerVolumeSpecName "kube-api-access-tr996". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.091469 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-x2nll-config-kclgp"] Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.103221 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-x2nll-config-kclgp"] Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.145535 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tr996\" (UniqueName: \"kubernetes.io/projected/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e-kube-api-access-tr996\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.485950 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbb32919b1304d85c2e31156216fc037082ac6370ed26e4d7ce6a3edddf69d1d" Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.485960 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-x2nll-config-kclgp" Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.487534 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"312b29b4-b820-4dcc-bed0-62c42937d544","Type":"ContainerStarted","Data":"c95ee0e5a0fd2fbd22e6d527fb55aba4f84fdf29621cd292e973b57367547d06"} Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.664864 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-x2nll" Mar 20 13:44:33 crc kubenswrapper[4973]: I0320 13:44:33.965741 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7857fa0-a2dd-42e9-9626-63a4eb1fad3e" path="/var/lib/kubelet/pods/e7857fa0-a2dd-42e9-9626-63a4eb1fad3e/volumes" Mar 20 13:44:34 crc kubenswrapper[4973]: I0320 13:44:34.515750 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b","Type":"ContainerStarted","Data":"71fb5ca4299fe7cd7a9197c1fc5fdd776e5646e995c8a8dd9932f56bfce3e7ca"} Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.292975 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-fv44x"] Mar 20 13:44:35 crc kubenswrapper[4973]: E0320 13:44:35.294396 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7857fa0-a2dd-42e9-9626-63a4eb1fad3e" containerName="ovn-config" Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.294418 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7857fa0-a2dd-42e9-9626-63a4eb1fad3e" containerName="ovn-config" Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.294642 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7857fa0-a2dd-42e9-9626-63a4eb1fad3e" containerName="ovn-config" Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.295954 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fv44x" Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.299498 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.311619 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fv44x"] Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.398039 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c65r8\" (UniqueName: \"kubernetes.io/projected/5b26bd9a-6875-4bed-8e7b-f73de3d70427-kube-api-access-c65r8\") pod \"root-account-create-update-fv44x\" (UID: \"5b26bd9a-6875-4bed-8e7b-f73de3d70427\") " pod="openstack/root-account-create-update-fv44x" Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.398286 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b26bd9a-6875-4bed-8e7b-f73de3d70427-operator-scripts\") pod \"root-account-create-update-fv44x\" (UID: \"5b26bd9a-6875-4bed-8e7b-f73de3d70427\") " pod="openstack/root-account-create-update-fv44x" Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.404776 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.456448 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.480558 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-1" Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.500609 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c65r8\" (UniqueName: 
\"kubernetes.io/projected/5b26bd9a-6875-4bed-8e7b-f73de3d70427-kube-api-access-c65r8\") pod \"root-account-create-update-fv44x\" (UID: \"5b26bd9a-6875-4bed-8e7b-f73de3d70427\") " pod="openstack/root-account-create-update-fv44x" Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.500853 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b26bd9a-6875-4bed-8e7b-f73de3d70427-operator-scripts\") pod \"root-account-create-update-fv44x\" (UID: \"5b26bd9a-6875-4bed-8e7b-f73de3d70427\") " pod="openstack/root-account-create-update-fv44x" Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.501616 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b26bd9a-6875-4bed-8e7b-f73de3d70427-operator-scripts\") pod \"root-account-create-update-fv44x\" (UID: \"5b26bd9a-6875-4bed-8e7b-f73de3d70427\") " pod="openstack/root-account-create-update-fv44x" Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.530069 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c65r8\" (UniqueName: \"kubernetes.io/projected/5b26bd9a-6875-4bed-8e7b-f73de3d70427-kube-api-access-c65r8\") pod \"root-account-create-update-fv44x\" (UID: \"5b26bd9a-6875-4bed-8e7b-f73de3d70427\") " pod="openstack/root-account-create-update-fv44x" Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.547602 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"312b29b4-b820-4dcc-bed0-62c42937d544","Type":"ContainerStarted","Data":"c95c1def070c1897894cc1a793da1f6d9c833ffd4b974d392025d896ae849f69"} Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.600828 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=2.650022573 podStartE2EDuration="4.600804809s" podCreationTimestamp="2026-03-20 
13:44:31 +0000 UTC" firstStartedPulling="2026-03-20 13:44:32.531783181 +0000 UTC m=+1393.275452925" lastFinishedPulling="2026-03-20 13:44:34.482565397 +0000 UTC m=+1395.226235161" observedRunningTime="2026-03-20 13:44:35.580044763 +0000 UTC m=+1396.323714507" watchObservedRunningTime="2026-03-20 13:44:35.600804809 +0000 UTC m=+1396.344474553" Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.601937 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b","Type":"ContainerStarted","Data":"41e34c2efead8423ce1ce2a29adff1c0675aa2a86e2830237e3a18136411aace"} Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.602052 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b","Type":"ContainerStarted","Data":"00a86bf0c65b256ddfe55c68f16d99254ef970a3312a3943c8b1ceea8c0f2683"} Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.617710 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-fv44x" Mar 20 13:44:35 crc kubenswrapper[4973]: I0320 13:44:35.790884 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="4ed60638-5022-406b-b568-7fa0d6bf4ba8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: connect: connection refused" Mar 20 13:44:36 crc kubenswrapper[4973]: I0320 13:44:36.326357 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-fv44x"] Mar 20 13:44:36 crc kubenswrapper[4973]: I0320 13:44:36.613977 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fv44x" event={"ID":"5b26bd9a-6875-4bed-8e7b-f73de3d70427","Type":"ContainerStarted","Data":"310970b4d6f35411af5ccf038f1e1ee5f497226276d4aac33e30cd6fbc0d34af"} Mar 20 13:44:36 crc kubenswrapper[4973]: I0320 13:44:36.614315 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fv44x" event={"ID":"5b26bd9a-6875-4bed-8e7b-f73de3d70427","Type":"ContainerStarted","Data":"4b077f978b5859647a18b6451cf81cb64ea2c21b97db106e175d78fca4ac32aa"} Mar 20 13:44:36 crc kubenswrapper[4973]: I0320 13:44:36.627578 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b","Type":"ContainerStarted","Data":"592836e4c00dee22699ac514e7ab39919f7e5ebc78df381744bab4e2c678fdb1"} Mar 20 13:44:36 crc kubenswrapper[4973]: I0320 13:44:36.627622 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b","Type":"ContainerStarted","Data":"7020161df413fb3961384727ad0c854532c47c5c0c04914803294712b6cc2e86"} Mar 20 13:44:36 crc kubenswrapper[4973]: I0320 13:44:36.640954 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/root-account-create-update-fv44x" 
podStartSLOduration=1.64093424 podStartE2EDuration="1.64093424s" podCreationTimestamp="2026-03-20 13:44:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:36.640333153 +0000 UTC m=+1397.384002907" watchObservedRunningTime="2026-03-20 13:44:36.64093424 +0000 UTC m=+1397.384603984" Mar 20 13:44:37 crc kubenswrapper[4973]: I0320 13:44:37.218328 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 20 13:44:37 crc kubenswrapper[4973]: I0320 13:44:37.221464 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 20 13:44:37 crc kubenswrapper[4973]: I0320 13:44:37.667026 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b","Type":"ContainerStarted","Data":"6803cfedd53a434d5efdb39350f498710dc4bf0152d9efccaf568f8a73392fdc"} Mar 20 13:44:37 crc kubenswrapper[4973]: I0320 13:44:37.667092 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"8184c0e7-f9ef-48a3-9461-5cc6c1188e6b","Type":"ContainerStarted","Data":"6cde0e6d47a3d513ce035db013fcb43028ef3c066ef8e0cd03b6bed679590c1c"} Mar 20 13:44:37 crc kubenswrapper[4973]: I0320 13:44:37.678596 4973 generic.go:334] "Generic (PLEG): container finished" podID="5b26bd9a-6875-4bed-8e7b-f73de3d70427" containerID="310970b4d6f35411af5ccf038f1e1ee5f497226276d4aac33e30cd6fbc0d34af" exitCode=0 Mar 20 13:44:37 crc kubenswrapper[4973]: I0320 13:44:37.678668 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fv44x" event={"ID":"5b26bd9a-6875-4bed-8e7b-f73de3d70427","Type":"ContainerDied","Data":"310970b4d6f35411af5ccf038f1e1ee5f497226276d4aac33e30cd6fbc0d34af"} Mar 20 13:44:37 crc kubenswrapper[4973]: I0320 13:44:37.681413 4973 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Mar 20 13:44:37 crc kubenswrapper[4973]: I0320 13:44:37.717851 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=39.017267354 podStartE2EDuration="47.717831653s" podCreationTimestamp="2026-03-20 13:43:50 +0000 UTC" firstStartedPulling="2026-03-20 13:44:24.751051279 +0000 UTC m=+1385.494721023" lastFinishedPulling="2026-03-20 13:44:33.451615578 +0000 UTC m=+1394.195285322" observedRunningTime="2026-03-20 13:44:37.712119247 +0000 UTC m=+1398.455788991" watchObservedRunningTime="2026-03-20 13:44:37.717831653 +0000 UTC m=+1398.461501397" Mar 20 13:44:37 crc kubenswrapper[4973]: I0320 13:44:37.780679 4973 scope.go:117] "RemoveContainer" containerID="096f3595763b234e154ad2f6f4f5fa489d0bc31073cd4df710239496060c67e4" Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.226653 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-wgtqz"] Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.230644 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.233077 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.247099 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-wgtqz"]
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.274859 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-wgtqz\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.275609 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-wgtqz\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.275701 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-config\") pod \"dnsmasq-dns-77585f5f8c-wgtqz\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.275886 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc7rh\" (UniqueName: \"kubernetes.io/projected/86ba366f-247a-4630-8fa6-196198d8aec7-kube-api-access-xc7rh\") pod \"dnsmasq-dns-77585f5f8c-wgtqz\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.276003 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-wgtqz\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.277811 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-wgtqz\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.379785 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-wgtqz\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.379856 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-wgtqz\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.379902 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-config\") pod \"dnsmasq-dns-77585f5f8c-wgtqz\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.379939 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc7rh\" (UniqueName: \"kubernetes.io/projected/86ba366f-247a-4630-8fa6-196198d8aec7-kube-api-access-xc7rh\") pod \"dnsmasq-dns-77585f5f8c-wgtqz\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.380096 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-wgtqz\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.380128 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-wgtqz\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.381206 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-dns-swift-storage-0\") pod \"dnsmasq-dns-77585f5f8c-wgtqz\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.382737 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-ovsdbserver-nb\") pod \"dnsmasq-dns-77585f5f8c-wgtqz\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.383181 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-config\") pod \"dnsmasq-dns-77585f5f8c-wgtqz\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.383306 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-ovsdbserver-sb\") pod \"dnsmasq-dns-77585f5f8c-wgtqz\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.383352 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-dns-svc\") pod \"dnsmasq-dns-77585f5f8c-wgtqz\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.405545 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc7rh\" (UniqueName: \"kubernetes.io/projected/86ba366f-247a-4630-8fa6-196198d8aec7-kube-api-access-xc7rh\") pod \"dnsmasq-dns-77585f5f8c-wgtqz\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:38 crc kubenswrapper[4973]: I0320 13:44:38.558158 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:39 crc kubenswrapper[4973]: I0320 13:44:39.114956 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-wgtqz"]
Mar 20 13:44:39 crc kubenswrapper[4973]: I0320 13:44:39.293196 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fv44x"
Mar 20 13:44:39 crc kubenswrapper[4973]: I0320 13:44:39.433857 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c65r8\" (UniqueName: \"kubernetes.io/projected/5b26bd9a-6875-4bed-8e7b-f73de3d70427-kube-api-access-c65r8\") pod \"5b26bd9a-6875-4bed-8e7b-f73de3d70427\" (UID: \"5b26bd9a-6875-4bed-8e7b-f73de3d70427\") "
Mar 20 13:44:39 crc kubenswrapper[4973]: I0320 13:44:39.434067 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b26bd9a-6875-4bed-8e7b-f73de3d70427-operator-scripts\") pod \"5b26bd9a-6875-4bed-8e7b-f73de3d70427\" (UID: \"5b26bd9a-6875-4bed-8e7b-f73de3d70427\") "
Mar 20 13:44:39 crc kubenswrapper[4973]: I0320 13:44:39.435291 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b26bd9a-6875-4bed-8e7b-f73de3d70427-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5b26bd9a-6875-4bed-8e7b-f73de3d70427" (UID: "5b26bd9a-6875-4bed-8e7b-f73de3d70427"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:44:39 crc kubenswrapper[4973]: I0320 13:44:39.439650 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b26bd9a-6875-4bed-8e7b-f73de3d70427-kube-api-access-c65r8" (OuterVolumeSpecName: "kube-api-access-c65r8") pod "5b26bd9a-6875-4bed-8e7b-f73de3d70427" (UID: "5b26bd9a-6875-4bed-8e7b-f73de3d70427"). InnerVolumeSpecName "kube-api-access-c65r8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:44:39 crc kubenswrapper[4973]: I0320 13:44:39.536937 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5b26bd9a-6875-4bed-8e7b-f73de3d70427-operator-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:39 crc kubenswrapper[4973]: I0320 13:44:39.536975 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c65r8\" (UniqueName: \"kubernetes.io/projected/5b26bd9a-6875-4bed-8e7b-f73de3d70427-kube-api-access-c65r8\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:39 crc kubenswrapper[4973]: I0320 13:44:39.713043 4973 generic.go:334] "Generic (PLEG): container finished" podID="86ba366f-247a-4630-8fa6-196198d8aec7" containerID="5b15ddb77213cb46099b1880baa1f7699211e31f7d30a7d59159593f8d8ffc93" exitCode=0
Mar 20 13:44:39 crc kubenswrapper[4973]: I0320 13:44:39.713119 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz" event={"ID":"86ba366f-247a-4630-8fa6-196198d8aec7","Type":"ContainerDied","Data":"5b15ddb77213cb46099b1880baa1f7699211e31f7d30a7d59159593f8d8ffc93"}
Mar 20 13:44:39 crc kubenswrapper[4973]: I0320 13:44:39.713143 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz" event={"ID":"86ba366f-247a-4630-8fa6-196198d8aec7","Type":"ContainerStarted","Data":"ce4cc5705c246c0732cd3085297b51ce303e226385309cd32693c47f580867ad"}
Mar 20 13:44:39 crc kubenswrapper[4973]: I0320 13:44:39.715078 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-fv44x" event={"ID":"5b26bd9a-6875-4bed-8e7b-f73de3d70427","Type":"ContainerDied","Data":"4b077f978b5859647a18b6451cf81cb64ea2c21b97db106e175d78fca4ac32aa"}
Mar 20 13:44:39 crc kubenswrapper[4973]: I0320 13:44:39.715128 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b077f978b5859647a18b6451cf81cb64ea2c21b97db106e175d78fca4ac32aa"
Mar 20 13:44:39 crc kubenswrapper[4973]: I0320 13:44:39.715134 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-fv44x"
Mar 20 13:44:40 crc kubenswrapper[4973]: I0320 13:44:40.275432 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 13:44:40 crc kubenswrapper[4973]: I0320 13:44:40.275924 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="642c51b4-2774-4114-808c-1fb722862437" containerName="prometheus" containerID="cri-o://2e35fac61505298631322828fcdedf86c60d2615539c2e83fe79fc0f40f3ac65" gracePeriod=600
Mar 20 13:44:40 crc kubenswrapper[4973]: I0320 13:44:40.275988 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="642c51b4-2774-4114-808c-1fb722862437" containerName="config-reloader" containerID="cri-o://97cecec69536e1f8243868aef7821cad1dcd935afc8c9c21dd2f6bb5fe575497" gracePeriod=600
Mar 20 13:44:40 crc kubenswrapper[4973]: I0320 13:44:40.276019 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/prometheus-metric-storage-0" podUID="642c51b4-2774-4114-808c-1fb722862437" containerName="thanos-sidecar" containerID="cri-o://750c5df52618bc5d32c078cdb27b56850d9e8d6bf24b361607e9d8944cae08c8" gracePeriod=600
Mar 20 13:44:40 crc kubenswrapper[4973]: E0320 13:44:40.633986 4973 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod642c51b4_2774_4114_808c_1fb722862437.slice/crio-conmon-2e35fac61505298631322828fcdedf86c60d2615539c2e83fe79fc0f40f3ac65.scope\": RecentStats: unable to find data in memory cache]"
Mar 20 13:44:40 crc kubenswrapper[4973]: I0320 13:44:40.729357 4973 generic.go:334] "Generic (PLEG): container finished" podID="642c51b4-2774-4114-808c-1fb722862437" containerID="750c5df52618bc5d32c078cdb27b56850d9e8d6bf24b361607e9d8944cae08c8" exitCode=0
Mar 20 13:44:40 crc kubenswrapper[4973]: I0320 13:44:40.729415 4973 generic.go:334] "Generic (PLEG): container finished" podID="642c51b4-2774-4114-808c-1fb722862437" containerID="97cecec69536e1f8243868aef7821cad1dcd935afc8c9c21dd2f6bb5fe575497" exitCode=0
Mar 20 13:44:40 crc kubenswrapper[4973]: I0320 13:44:40.729440 4973 generic.go:334] "Generic (PLEG): container finished" podID="642c51b4-2774-4114-808c-1fb722862437" containerID="2e35fac61505298631322828fcdedf86c60d2615539c2e83fe79fc0f40f3ac65" exitCode=0
Mar 20 13:44:40 crc kubenswrapper[4973]: I0320 13:44:40.729462 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"642c51b4-2774-4114-808c-1fb722862437","Type":"ContainerDied","Data":"750c5df52618bc5d32c078cdb27b56850d9e8d6bf24b361607e9d8944cae08c8"}
Mar 20 13:44:40 crc kubenswrapper[4973]: I0320 13:44:40.729489 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"642c51b4-2774-4114-808c-1fb722862437","Type":"ContainerDied","Data":"97cecec69536e1f8243868aef7821cad1dcd935afc8c9c21dd2f6bb5fe575497"}
Mar 20 13:44:40 crc kubenswrapper[4973]: I0320 13:44:40.729528 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"642c51b4-2774-4114-808c-1fb722862437","Type":"ContainerDied","Data":"2e35fac61505298631322828fcdedf86c60d2615539c2e83fe79fc0f40f3ac65"}
Mar 20 13:44:41 crc kubenswrapper[4973]: I0320 13:44:41.695498 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-fv44x"]
Mar 20 13:44:41 crc kubenswrapper[4973]: I0320 13:44:41.706915 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-fv44x"]
Mar 20 13:44:41 crc kubenswrapper[4973]: I0320 13:44:41.970544 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b26bd9a-6875-4bed-8e7b-f73de3d70427" path="/var/lib/kubelet/pods/5b26bd9a-6875-4bed-8e7b-f73de3d70427/volumes"
Mar 20 13:44:42 crc kubenswrapper[4973]: I0320 13:44:42.219194 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="642c51b4-2774-4114-808c-1fb722862437" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.142:9090/-/ready\": dial tcp 10.217.0.142:9090: connect: connection refused"
Mar 20 13:44:43 crc kubenswrapper[4973]: I0320 13:44:43.320361 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:44:43 crc kubenswrapper[4973]: I0320 13:44:43.320431 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:44:43 crc kubenswrapper[4973]: I0320 13:44:43.320480 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx"
Mar 20 13:44:43 crc kubenswrapper[4973]: I0320 13:44:43.321192 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"96fa5ee864c868aadb2da3b33886e9ad0c244086b57b687cf9a31b416fea8562"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 13:44:43 crc kubenswrapper[4973]: I0320 13:44:43.321290 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" containerID="cri-o://96fa5ee864c868aadb2da3b33886e9ad0c244086b57b687cf9a31b416fea8562" gracePeriod=600
Mar 20 13:44:43 crc kubenswrapper[4973]: I0320 13:44:43.771059 4973 generic.go:334] "Generic (PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="96fa5ee864c868aadb2da3b33886e9ad0c244086b57b687cf9a31b416fea8562" exitCode=0
Mar 20 13:44:43 crc kubenswrapper[4973]: I0320 13:44:43.771130 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"96fa5ee864c868aadb2da3b33886e9ad0c244086b57b687cf9a31b416fea8562"}
Mar 20 13:44:43 crc kubenswrapper[4973]: I0320 13:44:43.771465 4973 scope.go:117] "RemoveContainer" containerID="6ee146100b8d3ae20a6493daed451ee0c8c9f7d655dfaba5ce2b9446864d5f7d"
Mar 20 13:44:45 crc kubenswrapper[4973]: I0320 13:44:45.792150 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.123134 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-wdm6w"]
Mar 20 13:44:46 crc kubenswrapper[4973]: E0320 13:44:46.123682 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b26bd9a-6875-4bed-8e7b-f73de3d70427" containerName="mariadb-account-create-update"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.123699 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b26bd9a-6875-4bed-8e7b-f73de3d70427" containerName="mariadb-account-create-update"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.123922 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b26bd9a-6875-4bed-8e7b-f73de3d70427" containerName="mariadb-account-create-update"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.124659 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wdm6w"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.194519 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-wdm6w"]
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.221820 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gzdq\" (UniqueName: \"kubernetes.io/projected/0bb79520-00fa-4626-9fef-c7250fccd210-kube-api-access-2gzdq\") pod \"cinder-db-create-wdm6w\" (UID: \"0bb79520-00fa-4626-9fef-c7250fccd210\") " pod="openstack/cinder-db-create-wdm6w"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.221962 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bb79520-00fa-4626-9fef-c7250fccd210-operator-scripts\") pod \"cinder-db-create-wdm6w\" (UID: \"0bb79520-00fa-4626-9fef-c7250fccd210\") " pod="openstack/cinder-db-create-wdm6w"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.275051 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-a222-account-create-update-bsrjg"]
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.279060 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a222-account-create-update-bsrjg"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.286992 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.296021 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a222-account-create-update-bsrjg"]
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.325875 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gzdq\" (UniqueName: \"kubernetes.io/projected/0bb79520-00fa-4626-9fef-c7250fccd210-kube-api-access-2gzdq\") pod \"cinder-db-create-wdm6w\" (UID: \"0bb79520-00fa-4626-9fef-c7250fccd210\") " pod="openstack/cinder-db-create-wdm6w"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.325949 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bb79520-00fa-4626-9fef-c7250fccd210-operator-scripts\") pod \"cinder-db-create-wdm6w\" (UID: \"0bb79520-00fa-4626-9fef-c7250fccd210\") " pod="openstack/cinder-db-create-wdm6w"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.326864 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bb79520-00fa-4626-9fef-c7250fccd210-operator-scripts\") pod \"cinder-db-create-wdm6w\" (UID: \"0bb79520-00fa-4626-9fef-c7250fccd210\") " pod="openstack/cinder-db-create-wdm6w"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.382075 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gzdq\" (UniqueName: \"kubernetes.io/projected/0bb79520-00fa-4626-9fef-c7250fccd210-kube-api-access-2gzdq\") pod \"cinder-db-create-wdm6w\" (UID: \"0bb79520-00fa-4626-9fef-c7250fccd210\") " pod="openstack/cinder-db-create-wdm6w"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.400854 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-rwnmf"]
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.402754 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-rwnmf"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.415447 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-rwnmf"]
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.427826 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkfvj\" (UniqueName: \"kubernetes.io/projected/2d0fe71e-2a81-41d4-86d6-dd4345597c6e-kube-api-access-nkfvj\") pod \"cinder-a222-account-create-update-bsrjg\" (UID: \"2d0fe71e-2a81-41d4-86d6-dd4345597c6e\") " pod="openstack/cinder-a222-account-create-update-bsrjg"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.428022 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d0fe71e-2a81-41d4-86d6-dd4345597c6e-operator-scripts\") pod \"cinder-a222-account-create-update-bsrjg\" (UID: \"2d0fe71e-2a81-41d4-86d6-dd4345597c6e\") " pod="openstack/cinder-a222-account-create-update-bsrjg"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.445730 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wdm6w"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.480456 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-0791-account-create-update-rz2hx"]
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.482513 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-0791-account-create-update-rz2hx"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.489091 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.493960 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-0791-account-create-update-rz2hx"]
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.530159 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkfvj\" (UniqueName: \"kubernetes.io/projected/2d0fe71e-2a81-41d4-86d6-dd4345597c6e-kube-api-access-nkfvj\") pod \"cinder-a222-account-create-update-bsrjg\" (UID: \"2d0fe71e-2a81-41d4-86d6-dd4345597c6e\") " pod="openstack/cinder-a222-account-create-update-bsrjg"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.530255 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7rxm\" (UniqueName: \"kubernetes.io/projected/0850cf2e-8755-4161-97ea-a1507ef8f2fb-kube-api-access-v7rxm\") pod \"heat-db-create-rwnmf\" (UID: \"0850cf2e-8755-4161-97ea-a1507ef8f2fb\") " pod="openstack/heat-db-create-rwnmf"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.530307 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0850cf2e-8755-4161-97ea-a1507ef8f2fb-operator-scripts\") pod \"heat-db-create-rwnmf\" (UID: \"0850cf2e-8755-4161-97ea-a1507ef8f2fb\") " pod="openstack/heat-db-create-rwnmf"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.530381 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d0fe71e-2a81-41d4-86d6-dd4345597c6e-operator-scripts\") pod \"cinder-a222-account-create-update-bsrjg\" (UID: \"2d0fe71e-2a81-41d4-86d6-dd4345597c6e\") " pod="openstack/cinder-a222-account-create-update-bsrjg"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.531151 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d0fe71e-2a81-41d4-86d6-dd4345597c6e-operator-scripts\") pod \"cinder-a222-account-create-update-bsrjg\" (UID: \"2d0fe71e-2a81-41d4-86d6-dd4345597c6e\") " pod="openstack/cinder-a222-account-create-update-bsrjg"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.551168 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkfvj\" (UniqueName: \"kubernetes.io/projected/2d0fe71e-2a81-41d4-86d6-dd4345597c6e-kube-api-access-nkfvj\") pod \"cinder-a222-account-create-update-bsrjg\" (UID: \"2d0fe71e-2a81-41d4-86d6-dd4345597c6e\") " pod="openstack/cinder-a222-account-create-update-bsrjg"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.553694 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-bhvvf"]
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.555518 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bhvvf"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.565551 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bhvvf"]
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.625594 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a222-account-create-update-bsrjg"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.628517 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-8fxhw"]
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.633915 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8fxhw"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.634778 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7rxm\" (UniqueName: \"kubernetes.io/projected/0850cf2e-8755-4161-97ea-a1507ef8f2fb-kube-api-access-v7rxm\") pod \"heat-db-create-rwnmf\" (UID: \"0850cf2e-8755-4161-97ea-a1507ef8f2fb\") " pod="openstack/heat-db-create-rwnmf"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.634887 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0850cf2e-8755-4161-97ea-a1507ef8f2fb-operator-scripts\") pod \"heat-db-create-rwnmf\" (UID: \"0850cf2e-8755-4161-97ea-a1507ef8f2fb\") " pod="openstack/heat-db-create-rwnmf"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.634926 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzngg\" (UniqueName: \"kubernetes.io/projected/a26bab7b-118f-46a6-8809-8f7b8b7a83d4-kube-api-access-xzngg\") pod \"heat-0791-account-create-update-rz2hx\" (UID: \"a26bab7b-118f-46a6-8809-8f7b8b7a83d4\") " pod="openstack/heat-0791-account-create-update-rz2hx"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.634946 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a26bab7b-118f-46a6-8809-8f7b8b7a83d4-operator-scripts\") pod \"heat-0791-account-create-update-rz2hx\" (UID: \"a26bab7b-118f-46a6-8809-8f7b8b7a83d4\") " pod="openstack/heat-0791-account-create-update-rz2hx"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.635740 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0850cf2e-8755-4161-97ea-a1507ef8f2fb-operator-scripts\") pod \"heat-db-create-rwnmf\" (UID: \"0850cf2e-8755-4161-97ea-a1507ef8f2fb\") " pod="openstack/heat-db-create-rwnmf"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.654186 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-c2db-account-create-update-4m67k"]
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.655947 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c2db-account-create-update-4m67k"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.661177 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.664764 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8fxhw"]
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.696631 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c2db-account-create-update-4m67k"]
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.699549 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7rxm\" (UniqueName: \"kubernetes.io/projected/0850cf2e-8755-4161-97ea-a1507ef8f2fb-kube-api-access-v7rxm\") pod \"heat-db-create-rwnmf\" (UID: \"0850cf2e-8755-4161-97ea-a1507ef8f2fb\") " pod="openstack/heat-db-create-rwnmf"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.737467 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-844tx\" (UniqueName: \"kubernetes.io/projected/a14f07ce-f8ea-4af0-bf01-9ad6b1af917a-kube-api-access-844tx\") pod \"barbican-c2db-account-create-update-4m67k\" (UID: \"a14f07ce-f8ea-4af0-bf01-9ad6b1af917a\") " pod="openstack/barbican-c2db-account-create-update-4m67k"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.737684 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzngg\" (UniqueName: \"kubernetes.io/projected/a26bab7b-118f-46a6-8809-8f7b8b7a83d4-kube-api-access-xzngg\") pod \"heat-0791-account-create-update-rz2hx\" (UID: \"a26bab7b-118f-46a6-8809-8f7b8b7a83d4\") " pod="openstack/heat-0791-account-create-update-rz2hx"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.737712 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a26bab7b-118f-46a6-8809-8f7b8b7a83d4-operator-scripts\") pod \"heat-0791-account-create-update-rz2hx\" (UID: \"a26bab7b-118f-46a6-8809-8f7b8b7a83d4\") " pod="openstack/heat-0791-account-create-update-rz2hx"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.737744 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqdmg\" (UniqueName: \"kubernetes.io/projected/1286f2f0-7c61-4ba2-b2a6-3ec09e602af9-kube-api-access-bqdmg\") pod \"barbican-db-create-bhvvf\" (UID: \"1286f2f0-7c61-4ba2-b2a6-3ec09e602af9\") " pod="openstack/barbican-db-create-bhvvf"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.737766 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1286f2f0-7c61-4ba2-b2a6-3ec09e602af9-operator-scripts\") pod \"barbican-db-create-bhvvf\" (UID: \"1286f2f0-7c61-4ba2-b2a6-3ec09e602af9\") " pod="openstack/barbican-db-create-bhvvf"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.737812 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a14f07ce-f8ea-4af0-bf01-9ad6b1af917a-operator-scripts\") pod \"barbican-c2db-account-create-update-4m67k\" (UID: \"a14f07ce-f8ea-4af0-bf01-9ad6b1af917a\") " pod="openstack/barbican-c2db-account-create-update-4m67k"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.737845 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zx9t\" (UniqueName: \"kubernetes.io/projected/024e6737-2a16-43e9-99f4-62d2b39df77b-kube-api-access-6zx9t\") pod \"neutron-db-create-8fxhw\" (UID: \"024e6737-2a16-43e9-99f4-62d2b39df77b\") " pod="openstack/neutron-db-create-8fxhw"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.737866 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/024e6737-2a16-43e9-99f4-62d2b39df77b-operator-scripts\") pod \"neutron-db-create-8fxhw\" (UID: \"024e6737-2a16-43e9-99f4-62d2b39df77b\") " pod="openstack/neutron-db-create-8fxhw"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.740909 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a26bab7b-118f-46a6-8809-8f7b8b7a83d4-operator-scripts\") pod \"heat-0791-account-create-update-rz2hx\" (UID: \"a26bab7b-118f-46a6-8809-8f7b8b7a83d4\") " pod="openstack/heat-0791-account-create-update-rz2hx"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.746537 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-rwnmf"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.762591 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-1972-account-create-update-l6mqj"]
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.764010 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1972-account-create-update-l6mqj"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.768914 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.769906 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1972-account-create-update-l6mqj"]
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.774970 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzngg\" (UniqueName: \"kubernetes.io/projected/a26bab7b-118f-46a6-8809-8f7b8b7a83d4-kube-api-access-xzngg\") pod \"heat-0791-account-create-update-rz2hx\" (UID: \"a26bab7b-118f-46a6-8809-8f7b8b7a83d4\") " pod="openstack/heat-0791-account-create-update-rz2hx"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.813643 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-0791-account-create-update-rz2hx"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.832160 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-2k8xz"]
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.834023 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2k8xz"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.839178 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.840974 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79l52\" (UniqueName: \"kubernetes.io/projected/be6b9024-516f-484d-94ff-391bf79246bf-kube-api-access-79l52\") pod \"neutron-1972-account-create-update-l6mqj\" (UID: \"be6b9024-516f-484d-94ff-391bf79246bf\") " pod="openstack/neutron-1972-account-create-update-l6mqj"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.841073 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqdmg\" (UniqueName: \"kubernetes.io/projected/1286f2f0-7c61-4ba2-b2a6-3ec09e602af9-kube-api-access-bqdmg\") pod \"barbican-db-create-bhvvf\" (UID: \"1286f2f0-7c61-4ba2-b2a6-3ec09e602af9\") " pod="openstack/barbican-db-create-bhvvf"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.841106 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1286f2f0-7c61-4ba2-b2a6-3ec09e602af9-operator-scripts\") pod \"barbican-db-create-bhvvf\" (UID: \"1286f2f0-7c61-4ba2-b2a6-3ec09e602af9\") " pod="openstack/barbican-db-create-bhvvf"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.841128 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be6b9024-516f-484d-94ff-391bf79246bf-operator-scripts\") pod \"neutron-1972-account-create-update-l6mqj\" (UID: \"be6b9024-516f-484d-94ff-391bf79246bf\") " pod="openstack/neutron-1972-account-create-update-l6mqj"
Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.841205 4973
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a14f07ce-f8ea-4af0-bf01-9ad6b1af917a-operator-scripts\") pod \"barbican-c2db-account-create-update-4m67k\" (UID: \"a14f07ce-f8ea-4af0-bf01-9ad6b1af917a\") " pod="openstack/barbican-c2db-account-create-update-4m67k" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.841239 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zx9t\" (UniqueName: \"kubernetes.io/projected/024e6737-2a16-43e9-99f4-62d2b39df77b-kube-api-access-6zx9t\") pod \"neutron-db-create-8fxhw\" (UID: \"024e6737-2a16-43e9-99f4-62d2b39df77b\") " pod="openstack/neutron-db-create-8fxhw" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.841256 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/024e6737-2a16-43e9-99f4-62d2b39df77b-operator-scripts\") pod \"neutron-db-create-8fxhw\" (UID: \"024e6737-2a16-43e9-99f4-62d2b39df77b\") " pod="openstack/neutron-db-create-8fxhw" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.841303 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-844tx\" (UniqueName: \"kubernetes.io/projected/a14f07ce-f8ea-4af0-bf01-9ad6b1af917a-kube-api-access-844tx\") pod \"barbican-c2db-account-create-update-4m67k\" (UID: \"a14f07ce-f8ea-4af0-bf01-9ad6b1af917a\") " pod="openstack/barbican-c2db-account-create-update-4m67k" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.842680 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1286f2f0-7c61-4ba2-b2a6-3ec09e602af9-operator-scripts\") pod \"barbican-db-create-bhvvf\" (UID: \"1286f2f0-7c61-4ba2-b2a6-3ec09e602af9\") " pod="openstack/barbican-db-create-bhvvf" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.843576 
4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a14f07ce-f8ea-4af0-bf01-9ad6b1af917a-operator-scripts\") pod \"barbican-c2db-account-create-update-4m67k\" (UID: \"a14f07ce-f8ea-4af0-bf01-9ad6b1af917a\") " pod="openstack/barbican-c2db-account-create-update-4m67k" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.844149 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/024e6737-2a16-43e9-99f4-62d2b39df77b-operator-scripts\") pod \"neutron-db-create-8fxhw\" (UID: \"024e6737-2a16-43e9-99f4-62d2b39df77b\") " pod="openstack/neutron-db-create-8fxhw" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.860296 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2k8xz"] Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.874525 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-844tx\" (UniqueName: \"kubernetes.io/projected/a14f07ce-f8ea-4af0-bf01-9ad6b1af917a-kube-api-access-844tx\") pod \"barbican-c2db-account-create-update-4m67k\" (UID: \"a14f07ce-f8ea-4af0-bf01-9ad6b1af917a\") " pod="openstack/barbican-c2db-account-create-update-4m67k" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.874574 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zx9t\" (UniqueName: \"kubernetes.io/projected/024e6737-2a16-43e9-99f4-62d2b39df77b-kube-api-access-6zx9t\") pod \"neutron-db-create-8fxhw\" (UID: \"024e6737-2a16-43e9-99f4-62d2b39df77b\") " pod="openstack/neutron-db-create-8fxhw" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.878052 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqdmg\" (UniqueName: \"kubernetes.io/projected/1286f2f0-7c61-4ba2-b2a6-3ec09e602af9-kube-api-access-bqdmg\") pod \"barbican-db-create-bhvvf\" (UID: 
\"1286f2f0-7c61-4ba2-b2a6-3ec09e602af9\") " pod="openstack/barbican-db-create-bhvvf" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.940975 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bhvvf" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.943537 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be6b9024-516f-484d-94ff-391bf79246bf-operator-scripts\") pod \"neutron-1972-account-create-update-l6mqj\" (UID: \"be6b9024-516f-484d-94ff-391bf79246bf\") " pod="openstack/neutron-1972-account-create-update-l6mqj" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.943696 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdvnj\" (UniqueName: \"kubernetes.io/projected/b1f86541-a1d0-4a26-80c2-be736fa62eb9-kube-api-access-bdvnj\") pod \"root-account-create-update-2k8xz\" (UID: \"b1f86541-a1d0-4a26-80c2-be736fa62eb9\") " pod="openstack/root-account-create-update-2k8xz" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.943750 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1f86541-a1d0-4a26-80c2-be736fa62eb9-operator-scripts\") pod \"root-account-create-update-2k8xz\" (UID: \"b1f86541-a1d0-4a26-80c2-be736fa62eb9\") " pod="openstack/root-account-create-update-2k8xz" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.943805 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79l52\" (UniqueName: \"kubernetes.io/projected/be6b9024-516f-484d-94ff-391bf79246bf-kube-api-access-79l52\") pod \"neutron-1972-account-create-update-l6mqj\" (UID: \"be6b9024-516f-484d-94ff-391bf79246bf\") " pod="openstack/neutron-1972-account-create-update-l6mqj" Mar 20 13:44:46 crc kubenswrapper[4973]: 
I0320 13:44:46.945183 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be6b9024-516f-484d-94ff-391bf79246bf-operator-scripts\") pod \"neutron-1972-account-create-update-l6mqj\" (UID: \"be6b9024-516f-484d-94ff-391bf79246bf\") " pod="openstack/neutron-1972-account-create-update-l6mqj" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.959446 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8fxhw" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.960002 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-n2wvb"] Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.969740 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-n2wvb"] Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.969843 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-n2wvb" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.972710 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.973180 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79l52\" (UniqueName: \"kubernetes.io/projected/be6b9024-516f-484d-94ff-391bf79246bf-kube-api-access-79l52\") pod \"neutron-1972-account-create-update-l6mqj\" (UID: \"be6b9024-516f-484d-94ff-391bf79246bf\") " pod="openstack/neutron-1972-account-create-update-l6mqj" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.976675 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.976777 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.976920 
4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nvzpw" Mar 20 13:44:46 crc kubenswrapper[4973]: I0320 13:44:46.978865 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c2db-account-create-update-4m67k" Mar 20 13:44:47 crc kubenswrapper[4973]: I0320 13:44:47.048601 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1f86541-a1d0-4a26-80c2-be736fa62eb9-operator-scripts\") pod \"root-account-create-update-2k8xz\" (UID: \"b1f86541-a1d0-4a26-80c2-be736fa62eb9\") " pod="openstack/root-account-create-update-2k8xz" Mar 20 13:44:47 crc kubenswrapper[4973]: I0320 13:44:47.048967 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm2n4\" (UniqueName: \"kubernetes.io/projected/0f86183c-8d71-4d52-860b-9579ba761393-kube-api-access-qm2n4\") pod \"keystone-db-sync-n2wvb\" (UID: \"0f86183c-8d71-4d52-860b-9579ba761393\") " pod="openstack/keystone-db-sync-n2wvb" Mar 20 13:44:47 crc kubenswrapper[4973]: I0320 13:44:47.049171 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f86183c-8d71-4d52-860b-9579ba761393-config-data\") pod \"keystone-db-sync-n2wvb\" (UID: \"0f86183c-8d71-4d52-860b-9579ba761393\") " pod="openstack/keystone-db-sync-n2wvb" Mar 20 13:44:47 crc kubenswrapper[4973]: I0320 13:44:47.049314 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1f86541-a1d0-4a26-80c2-be736fa62eb9-operator-scripts\") pod \"root-account-create-update-2k8xz\" (UID: \"b1f86541-a1d0-4a26-80c2-be736fa62eb9\") " pod="openstack/root-account-create-update-2k8xz" Mar 20 13:44:47 crc kubenswrapper[4973]: I0320 13:44:47.049533 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f86183c-8d71-4d52-860b-9579ba761393-combined-ca-bundle\") pod \"keystone-db-sync-n2wvb\" (UID: \"0f86183c-8d71-4d52-860b-9579ba761393\") " pod="openstack/keystone-db-sync-n2wvb" Mar 20 13:44:47 crc kubenswrapper[4973]: I0320 13:44:47.049774 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdvnj\" (UniqueName: \"kubernetes.io/projected/b1f86541-a1d0-4a26-80c2-be736fa62eb9-kube-api-access-bdvnj\") pod \"root-account-create-update-2k8xz\" (UID: \"b1f86541-a1d0-4a26-80c2-be736fa62eb9\") " pod="openstack/root-account-create-update-2k8xz" Mar 20 13:44:47 crc kubenswrapper[4973]: I0320 13:44:47.072954 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdvnj\" (UniqueName: \"kubernetes.io/projected/b1f86541-a1d0-4a26-80c2-be736fa62eb9-kube-api-access-bdvnj\") pod \"root-account-create-update-2k8xz\" (UID: \"b1f86541-a1d0-4a26-80c2-be736fa62eb9\") " pod="openstack/root-account-create-update-2k8xz" Mar 20 13:44:47 crc kubenswrapper[4973]: I0320 13:44:47.128879 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-1972-account-create-update-l6mqj" Mar 20 13:44:47 crc kubenswrapper[4973]: I0320 13:44:47.151797 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qm2n4\" (UniqueName: \"kubernetes.io/projected/0f86183c-8d71-4d52-860b-9579ba761393-kube-api-access-qm2n4\") pod \"keystone-db-sync-n2wvb\" (UID: \"0f86183c-8d71-4d52-860b-9579ba761393\") " pod="openstack/keystone-db-sync-n2wvb" Mar 20 13:44:47 crc kubenswrapper[4973]: I0320 13:44:47.151880 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f86183c-8d71-4d52-860b-9579ba761393-config-data\") pod \"keystone-db-sync-n2wvb\" (UID: \"0f86183c-8d71-4d52-860b-9579ba761393\") " pod="openstack/keystone-db-sync-n2wvb" Mar 20 13:44:47 crc kubenswrapper[4973]: I0320 13:44:47.151927 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f86183c-8d71-4d52-860b-9579ba761393-combined-ca-bundle\") pod \"keystone-db-sync-n2wvb\" (UID: \"0f86183c-8d71-4d52-860b-9579ba761393\") " pod="openstack/keystone-db-sync-n2wvb" Mar 20 13:44:47 crc kubenswrapper[4973]: I0320 13:44:47.155984 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f86183c-8d71-4d52-860b-9579ba761393-config-data\") pod \"keystone-db-sync-n2wvb\" (UID: \"0f86183c-8d71-4d52-860b-9579ba761393\") " pod="openstack/keystone-db-sync-n2wvb" Mar 20 13:44:47 crc kubenswrapper[4973]: I0320 13:44:47.156704 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f86183c-8d71-4d52-860b-9579ba761393-combined-ca-bundle\") pod \"keystone-db-sync-n2wvb\" (UID: \"0f86183c-8d71-4d52-860b-9579ba761393\") " pod="openstack/keystone-db-sync-n2wvb" Mar 20 13:44:47 crc kubenswrapper[4973]: 
I0320 13:44:47.178052 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm2n4\" (UniqueName: \"kubernetes.io/projected/0f86183c-8d71-4d52-860b-9579ba761393-kube-api-access-qm2n4\") pod \"keystone-db-sync-n2wvb\" (UID: \"0f86183c-8d71-4d52-860b-9579ba761393\") " pod="openstack/keystone-db-sync-n2wvb" Mar 20 13:44:47 crc kubenswrapper[4973]: I0320 13:44:47.212055 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2k8xz" Mar 20 13:44:47 crc kubenswrapper[4973]: I0320 13:44:47.218840 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="642c51b4-2774-4114-808c-1fb722862437" containerName="prometheus" probeResult="failure" output="Get \"http://10.217.0.142:9090/-/ready\": dial tcp 10.217.0.142:9090: connect: connection refused" Mar 20 13:44:47 crc kubenswrapper[4973]: I0320 13:44:47.293122 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-n2wvb" Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.807368 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.888600 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"642c51b4-2774-4114-808c-1fb722862437","Type":"ContainerDied","Data":"8b1e9828042ee52bb66810736a8d1325dd64830aeaedfb2a3ebe46bbc8f62a7f"} Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.888666 4973 scope.go:117] "RemoveContainer" containerID="750c5df52618bc5d32c078cdb27b56850d9e8d6bf24b361607e9d8944cae08c8" Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.888841 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/prometheus-metric-storage-0" Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.907416 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/642c51b4-2774-4114-808c-1fb722862437-prometheus-metric-storage-rulefiles-1\") pod \"642c51b4-2774-4114-808c-1fb722862437\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.907469 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/642c51b4-2774-4114-808c-1fb722862437-thanos-prometheus-http-client-file\") pod \"642c51b4-2774-4114-808c-1fb722862437\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.907561 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/642c51b4-2774-4114-808c-1fb722862437-prometheus-metric-storage-rulefiles-0\") pod \"642c51b4-2774-4114-808c-1fb722862437\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.907691 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-db\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38da6b63-5881-4301-941a-df15b37fb4c2\") pod \"642c51b4-2774-4114-808c-1fb722862437\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.907728 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpb49\" (UniqueName: \"kubernetes.io/projected/642c51b4-2774-4114-808c-1fb722862437-kube-api-access-gpb49\") pod \"642c51b4-2774-4114-808c-1fb722862437\" (UID: 
\"642c51b4-2774-4114-808c-1fb722862437\") " Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.907838 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/642c51b4-2774-4114-808c-1fb722862437-config\") pod \"642c51b4-2774-4114-808c-1fb722862437\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.907863 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/642c51b4-2774-4114-808c-1fb722862437-prometheus-metric-storage-rulefiles-2\") pod \"642c51b4-2774-4114-808c-1fb722862437\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.907885 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/642c51b4-2774-4114-808c-1fb722862437-web-config\") pod \"642c51b4-2774-4114-808c-1fb722862437\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.907905 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/642c51b4-2774-4114-808c-1fb722862437-tls-assets\") pod \"642c51b4-2774-4114-808c-1fb722862437\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.907927 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/642c51b4-2774-4114-808c-1fb722862437-config-out\") pod \"642c51b4-2774-4114-808c-1fb722862437\" (UID: \"642c51b4-2774-4114-808c-1fb722862437\") " Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.910828 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/642c51b4-2774-4114-808c-1fb722862437-prometheus-metric-storage-rulefiles-1" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-1") pod "642c51b4-2774-4114-808c-1fb722862437" (UID: "642c51b4-2774-4114-808c-1fb722862437"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.910948 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/642c51b4-2774-4114-808c-1fb722862437-prometheus-metric-storage-rulefiles-0" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-0") pod "642c51b4-2774-4114-808c-1fb722862437" (UID: "642c51b4-2774-4114-808c-1fb722862437"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.913756 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/642c51b4-2774-4114-808c-1fb722862437-prometheus-metric-storage-rulefiles-2" (OuterVolumeSpecName: "prometheus-metric-storage-rulefiles-2") pod "642c51b4-2774-4114-808c-1fb722862437" (UID: "642c51b4-2774-4114-808c-1fb722862437"). InnerVolumeSpecName "prometheus-metric-storage-rulefiles-2". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.919249 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/642c51b4-2774-4114-808c-1fb722862437-kube-api-access-gpb49" (OuterVolumeSpecName: "kube-api-access-gpb49") pod "642c51b4-2774-4114-808c-1fb722862437" (UID: "642c51b4-2774-4114-808c-1fb722862437"). InnerVolumeSpecName "kube-api-access-gpb49". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.929318 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/642c51b4-2774-4114-808c-1fb722862437-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "642c51b4-2774-4114-808c-1fb722862437" (UID: "642c51b4-2774-4114-808c-1fb722862437"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.934876 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/642c51b4-2774-4114-808c-1fb722862437-config-out" (OuterVolumeSpecName: "config-out") pod "642c51b4-2774-4114-808c-1fb722862437" (UID: "642c51b4-2774-4114-808c-1fb722862437"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.935051 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/642c51b4-2774-4114-808c-1fb722862437-config" (OuterVolumeSpecName: "config") pod "642c51b4-2774-4114-808c-1fb722862437" (UID: "642c51b4-2774-4114-808c-1fb722862437"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.952259 4973 scope.go:117] "RemoveContainer" containerID="97cecec69536e1f8243868aef7821cad1dcd935afc8c9c21dd2f6bb5fe575497" Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.954323 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/642c51b4-2774-4114-808c-1fb722862437-web-config" (OuterVolumeSpecName: "web-config") pod "642c51b4-2774-4114-808c-1fb722862437" (UID: "642c51b4-2774-4114-808c-1fb722862437"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.963829 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38da6b63-5881-4301-941a-df15b37fb4c2" (OuterVolumeSpecName: "prometheus-metric-storage-db") pod "642c51b4-2774-4114-808c-1fb722862437" (UID: "642c51b4-2774-4114-808c-1fb722862437"). InnerVolumeSpecName "pvc-38da6b63-5881-4301-941a-df15b37fb4c2". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 13:44:48 crc kubenswrapper[4973]: I0320 13:44:48.971201 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/642c51b4-2774-4114-808c-1fb722862437-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "642c51b4-2774-4114-808c-1fb722862437" (UID: "642c51b4-2774-4114-808c-1fb722862437"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.011603 4973 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/642c51b4-2774-4114-808c-1fb722862437-web-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.011903 4973 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/642c51b4-2774-4114-808c-1fb722862437-tls-assets\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.011914 4973 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/642c51b4-2774-4114-808c-1fb722862437-config-out\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.011929 4973 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: 
\"kubernetes.io/configmap/642c51b4-2774-4114-808c-1fb722862437-prometheus-metric-storage-rulefiles-1\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.011943 4973 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/642c51b4-2774-4114-808c-1fb722862437-thanos-prometheus-http-client-file\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.011957 4973 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/642c51b4-2774-4114-808c-1fb722862437-prometheus-metric-storage-rulefiles-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.011997 4973 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-38da6b63-5881-4301-941a-df15b37fb4c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38da6b63-5881-4301-941a-df15b37fb4c2\") on node \"crc\" " Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.012013 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpb49\" (UniqueName: \"kubernetes.io/projected/642c51b4-2774-4114-808c-1fb722862437-kube-api-access-gpb49\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.012024 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/642c51b4-2774-4114-808c-1fb722862437-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.012035 4973 reconciler_common.go:293] "Volume detached for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/642c51b4-2774-4114-808c-1fb722862437-prometheus-metric-storage-rulefiles-2\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.018059 4973 
scope.go:117] "RemoveContainer" containerID="2e35fac61505298631322828fcdedf86c60d2615539c2e83fe79fc0f40f3ac65"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.023200 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-bhvvf"]
Mar 20 13:44:49 crc kubenswrapper[4973]: W0320 13:44:49.027989 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1286f2f0_7c61_4ba2_b2a6_3ec09e602af9.slice/crio-f65553944961dc872304df9a7852e9eaf49e0f4f52ffa3d9dba214d539725937 WatchSource:0}: Error finding container f65553944961dc872304df9a7852e9eaf49e0f4f52ffa3d9dba214d539725937: Status 404 returned error can't find the container with id f65553944961dc872304df9a7852e9eaf49e0f4f52ffa3d9dba214d539725937
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.072918 4973 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.073054 4973 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-38da6b63-5881-4301-941a-df15b37fb4c2" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38da6b63-5881-4301-941a-df15b37fb4c2") on node "crc"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.119985 4973 reconciler_common.go:293] "Volume detached for volume \"pvc-38da6b63-5881-4301-941a-df15b37fb4c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38da6b63-5881-4301-941a-df15b37fb4c2\") on node \"crc\" DevicePath \"\""
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.123183 4973 scope.go:117] "RemoveContainer" containerID="85fdf994d82452b154747e1cea74fc781b71f5de0915d878d8caa20f3828abc5"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.235302 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.246103 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.267590 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 13:44:49 crc kubenswrapper[4973]: E0320 13:44:49.268326 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="642c51b4-2774-4114-808c-1fb722862437" containerName="prometheus"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.268365 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="642c51b4-2774-4114-808c-1fb722862437" containerName="prometheus"
Mar 20 13:44:49 crc kubenswrapper[4973]: E0320 13:44:49.268378 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="642c51b4-2774-4114-808c-1fb722862437" containerName="thanos-sidecar"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.268386 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="642c51b4-2774-4114-808c-1fb722862437" containerName="thanos-sidecar"
Mar 20 13:44:49 crc kubenswrapper[4973]: E0320 13:44:49.268399 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="642c51b4-2774-4114-808c-1fb722862437" containerName="config-reloader"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.268406 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="642c51b4-2774-4114-808c-1fb722862437" containerName="config-reloader"
Mar 20 13:44:49 crc kubenswrapper[4973]: E0320 13:44:49.268434 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="642c51b4-2774-4114-808c-1fb722862437" containerName="init-config-reloader"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.268442 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="642c51b4-2774-4114-808c-1fb722862437" containerName="init-config-reloader"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.268661 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="642c51b4-2774-4114-808c-1fb722862437" containerName="thanos-sidecar"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.268683 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="642c51b4-2774-4114-808c-1fb722862437" containerName="config-reloader"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.268699 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="642c51b4-2774-4114-808c-1fb722862437" containerName="prometheus"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.272157 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.286180 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.286316 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-metric-storage-prometheus-svc"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.286427 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-1"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.286554 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.286729 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.286904 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-2"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.287009 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-rhx8x"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.287218 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.289473 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.314063 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.424418 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-wdm6w"]
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.428554 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b51c1ea9-b42f-47a5-8f74-164a29b2d036-config\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.428584 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/b51c1ea9-b42f-47a5-8f74-164a29b2d036-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.428620 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b51c1ea9-b42f-47a5-8f74-164a29b2d036-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.428650 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqstn\" (UniqueName: \"kubernetes.io/projected/b51c1ea9-b42f-47a5-8f74-164a29b2d036-kube-api-access-rqstn\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.428675 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b51c1ea9-b42f-47a5-8f74-164a29b2d036-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.428693 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b51c1ea9-b42f-47a5-8f74-164a29b2d036-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.428715 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b51c1ea9-b42f-47a5-8f74-164a29b2d036-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.433697 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b51c1ea9-b42f-47a5-8f74-164a29b2d036-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.433794 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b51c1ea9-b42f-47a5-8f74-164a29b2d036-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.434158 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-38da6b63-5881-4301-941a-df15b37fb4c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38da6b63-5881-4301-941a-df15b37fb4c2\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.434231 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/b51c1ea9-b42f-47a5-8f74-164a29b2d036-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.434355 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b51c1ea9-b42f-47a5-8f74-164a29b2d036-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.434440 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b51c1ea9-b42f-47a5-8f74-164a29b2d036-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.545646 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b51c1ea9-b42f-47a5-8f74-164a29b2d036-config\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.545715 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/b51c1ea9-b42f-47a5-8f74-164a29b2d036-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.545854 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b51c1ea9-b42f-47a5-8f74-164a29b2d036-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.546559 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqstn\" (UniqueName: \"kubernetes.io/projected/b51c1ea9-b42f-47a5-8f74-164a29b2d036-kube-api-access-rqstn\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.547126 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b51c1ea9-b42f-47a5-8f74-164a29b2d036-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.547208 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b51c1ea9-b42f-47a5-8f74-164a29b2d036-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.547260 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b51c1ea9-b42f-47a5-8f74-164a29b2d036-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.547776 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b51c1ea9-b42f-47a5-8f74-164a29b2d036-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.559886 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b51c1ea9-b42f-47a5-8f74-164a29b2d036-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.560016 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-38da6b63-5881-4301-941a-df15b37fb4c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38da6b63-5881-4301-941a-df15b37fb4c2\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.560101 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/b51c1ea9-b42f-47a5-8f74-164a29b2d036-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.560155 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b51c1ea9-b42f-47a5-8f74-164a29b2d036-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.560273 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b51c1ea9-b42f-47a5-8f74-164a29b2d036-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.547034 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-1\" (UniqueName: \"kubernetes.io/configmap/b51c1ea9-b42f-47a5-8f74-164a29b2d036-prometheus-metric-storage-rulefiles-1\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.563900 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b51c1ea9-b42f-47a5-8f74-164a29b2d036-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.564524 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b51c1ea9-b42f-47a5-8f74-164a29b2d036-config\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.564829 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b51c1ea9-b42f-47a5-8f74-164a29b2d036-secret-combined-ca-bundle\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.565161 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-2\" (UniqueName: \"kubernetes.io/configmap/b51c1ea9-b42f-47a5-8f74-164a29b2d036-prometheus-metric-storage-rulefiles-2\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.565494 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b51c1ea9-b42f-47a5-8f74-164a29b2d036-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.567680 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b51c1ea9-b42f-47a5-8f74-164a29b2d036-web-config-tls-secret-key-cert-metric-storage-promethe-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.570628 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.570666 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-38da6b63-5881-4301-941a-df15b37fb4c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38da6b63-5881-4301-941a-df15b37fb4c2\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9fc2caf26902e4acb0c152b4b6832175eae9a7bf975678e3cfa2391449f57e79/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.570934 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\" (UniqueName: \"kubernetes.io/secret/b51c1ea9-b42f-47a5-8f74-164a29b2d036-web-config-tls-secret-cert-cert-metric-storage-prometh-dc638c2d\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.571012 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b51c1ea9-b42f-47a5-8f74-164a29b2d036-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.571844 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b51c1ea9-b42f-47a5-8f74-164a29b2d036-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.583626 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-1972-account-create-update-l6mqj"]
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.608065 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqstn\" (UniqueName: \"kubernetes.io/projected/b51c1ea9-b42f-47a5-8f74-164a29b2d036-kube-api-access-rqstn\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.613473 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b51c1ea9-b42f-47a5-8f74-164a29b2d036-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.615173 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-n2wvb"]
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.693115 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-a222-account-create-update-bsrjg"]
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.711555 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-8fxhw"]
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.731654 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-2k8xz"]
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.732626 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-38da6b63-5881-4301-941a-df15b37fb4c2\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-38da6b63-5881-4301-941a-df15b37fb4c2\") pod \"prometheus-metric-storage-0\" (UID: \"b51c1ea9-b42f-47a5-8f74-164a29b2d036\") " pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.742219 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-rwnmf"]
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.750905 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-c2db-account-create-update-4m67k"]
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.829883 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-0791-account-create-update-rz2hx"]
Mar 20 13:44:49 crc kubenswrapper[4973]: W0320 13:44:49.835568 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda26bab7b_118f_46a6_8809_8f7b8b7a83d4.slice/crio-bfa659cacc1611128a977273aaf5b0d61ef472a2596d5bc48ae7c8feea775f99 WatchSource:0}: Error finding container bfa659cacc1611128a977273aaf5b0d61ef472a2596d5bc48ae7c8feea775f99: Status 404 returned error can't find the container with id bfa659cacc1611128a977273aaf5b0d61ef472a2596d5bc48ae7c8feea775f99
Mar 20 13:44:49 crc kubenswrapper[4973]: I0320 13:44:49.956227 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Mar 20 13:44:50 crc kubenswrapper[4973]: I0320 13:44:50.018993 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="642c51b4-2774-4114-808c-1fb722862437" path="/var/lib/kubelet/pods/642c51b4-2774-4114-808c-1fb722862437/volumes"
Mar 20 13:44:50 crc kubenswrapper[4973]: I0320 13:44:50.021982 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1972-account-create-update-l6mqj" event={"ID":"be6b9024-516f-484d-94ff-391bf79246bf","Type":"ContainerStarted","Data":"c16bf435dcf8cac007facfdf6121aec37fb7ae426b150d3d21f8ee1cf1a24334"}
Mar 20 13:44:50 crc kubenswrapper[4973]: I0320 13:44:50.022043 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-0791-account-create-update-rz2hx" event={"ID":"a26bab7b-118f-46a6-8809-8f7b8b7a83d4","Type":"ContainerStarted","Data":"bfa659cacc1611128a977273aaf5b0d61ef472a2596d5bc48ae7c8feea775f99"}
Mar 20 13:44:50 crc kubenswrapper[4973]: I0320 13:44:50.022056 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c2db-account-create-update-4m67k" event={"ID":"a14f07ce-f8ea-4af0-bf01-9ad6b1af917a","Type":"ContainerStarted","Data":"0d9c3e78726e87a448ac751c85dfd1187d99af0cb9be659ae6ebb70c21c0ce48"}
Mar 20 13:44:50 crc kubenswrapper[4973]: I0320 13:44:50.022067 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-rwnmf" event={"ID":"0850cf2e-8755-4161-97ea-a1507ef8f2fb","Type":"ContainerStarted","Data":"f0b4bd1c2a135a081761b9b96f880b44a989565991da24d72b6d8feaa956b285"}
Mar 20 13:44:50 crc kubenswrapper[4973]: I0320 13:44:50.022078 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8fxhw" event={"ID":"024e6737-2a16-43e9-99f4-62d2b39df77b","Type":"ContainerStarted","Data":"379fe10d7c234c96460f2dd12640f1fb51394b12d165dd303537232eca18b9fe"}
Mar 20 13:44:50 crc kubenswrapper[4973]: I0320 13:44:50.022091 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2k8xz" event={"ID":"b1f86541-a1d0-4a26-80c2-be736fa62eb9","Type":"ContainerStarted","Data":"da9f75f57b3dffff6d027d4fc68cc4a50da45fc10e646b9e4b294c98255ab131"}
Mar 20 13:44:50 crc kubenswrapper[4973]: I0320 13:44:50.022103 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e"}
Mar 20 13:44:50 crc kubenswrapper[4973]: I0320 13:44:50.022124 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bhvvf" event={"ID":"1286f2f0-7c61-4ba2-b2a6-3ec09e602af9","Type":"ContainerStarted","Data":"c1f1c397ae71d75e71535afe1687540c2e9c8ae5090a077a9a71530258b81724"}
Mar 20 13:44:50 crc kubenswrapper[4973]: I0320 13:44:50.022135 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bhvvf" event={"ID":"1286f2f0-7c61-4ba2-b2a6-3ec09e602af9","Type":"ContainerStarted","Data":"f65553944961dc872304df9a7852e9eaf49e0f4f52ffa3d9dba214d539725937"}
Mar 20 13:44:50 crc kubenswrapper[4973]: I0320 13:44:50.022144 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mfb4w" event={"ID":"716148fc-095e-4811-8e79-53bf3e2cd53b","Type":"ContainerStarted","Data":"07e4031eb51a244b66dff2c2092044894ab5856be895615cfe8e14f94283f0d9"}
Mar 20 13:44:50 crc kubenswrapper[4973]: I0320 13:44:50.024081 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wdm6w" event={"ID":"0bb79520-00fa-4626-9fef-c7250fccd210","Type":"ContainerStarted","Data":"f2c6a5b5eb747c059098e13e81f0eb4a93aa737e4a72b2d2ed66852cabb84ef3"}
Mar 20 13:44:50 crc kubenswrapper[4973]: I0320 13:44:50.025746 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a222-account-create-update-bsrjg" event={"ID":"2d0fe71e-2a81-41d4-86d6-dd4345597c6e","Type":"ContainerStarted","Data":"e325c6d2c2fa62aa194dc8fde5154fb7dde26f4b7943fbea14bbe0a3f210eb2c"}
Mar 20 13:44:50 crc kubenswrapper[4973]: I0320 13:44:50.028321 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz" event={"ID":"86ba366f-247a-4630-8fa6-196198d8aec7","Type":"ContainerStarted","Data":"ecf6f5bd6ab1ed88304d3614f92656f95c74d4951b1d66a11edcc5ff9ac2c49f"}
Mar 20 13:44:50 crc kubenswrapper[4973]: I0320 13:44:50.029225 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz"
Mar 20 13:44:50 crc kubenswrapper[4973]: I0320 13:44:50.030900 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-n2wvb" event={"ID":"0f86183c-8d71-4d52-860b-9579ba761393","Type":"ContainerStarted","Data":"98ab39e50c683760d613b3a87b957af9de07174e2e57de59e3ff265011ca8d3e"}
Mar 20 13:44:50 crc kubenswrapper[4973]: I0320 13:44:50.215717 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz" podStartSLOduration=12.215695398 podStartE2EDuration="12.215695398s" podCreationTimestamp="2026-03-20 13:44:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:44:50.210178507 +0000 UTC m=+1410.953848291" watchObservedRunningTime="2026-03-20 13:44:50.215695398 +0000 UTC m=+1410.959365142"
Mar 20 13:44:50 crc kubenswrapper[4973]: I0320 13:44:50.236001 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-mfb4w" podStartSLOduration=3.769316929 podStartE2EDuration="22.235983341s" podCreationTimestamp="2026-03-20 13:44:28 +0000 UTC" firstStartedPulling="2026-03-20 13:44:29.738446408 +0000 UTC m=+1390.482116162" lastFinishedPulling="2026-03-20 13:44:48.20511282 +0000 UTC m=+1408.948782574" observedRunningTime="2026-03-20 13:44:50.231377036 +0000 UTC m=+1410.975046780" watchObservedRunningTime="2026-03-20 13:44:50.235983341 +0000 UTC m=+1410.979653085"
Mar 20 13:44:50 crc kubenswrapper[4973]: I0320 13:44:50.583072 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Mar 20 13:44:50 crc kubenswrapper[4973]: W0320 13:44:50.609052 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb51c1ea9_b42f_47a5_8f74_164a29b2d036.slice/crio-99876c4ae27b4aa2457aa62d70447bc040cca5daadc2e852f2e0729bddf7302c WatchSource:0}: Error finding container 99876c4ae27b4aa2457aa62d70447bc040cca5daadc2e852f2e0729bddf7302c: Status 404 returned error can't find the container with id 99876c4ae27b4aa2457aa62d70447bc040cca5daadc2e852f2e0729bddf7302c
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.048855 4973 generic.go:334] "Generic (PLEG): container finished" podID="a14f07ce-f8ea-4af0-bf01-9ad6b1af917a" containerID="ea23f03170bee7e6382632749bd595b9a1313dbc7c2fdb63027f117566faa374" exitCode=0
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.049142 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c2db-account-create-update-4m67k" event={"ID":"a14f07ce-f8ea-4af0-bf01-9ad6b1af917a","Type":"ContainerDied","Data":"ea23f03170bee7e6382632749bd595b9a1313dbc7c2fdb63027f117566faa374"}
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.052612 4973 generic.go:334] "Generic (PLEG): container finished" podID="be6b9024-516f-484d-94ff-391bf79246bf" containerID="891e47aad9dc99c16ff5048519990252c678f6ef510adb6cc13cbbca6ed4996e" exitCode=0
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.052732 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1972-account-create-update-l6mqj" event={"ID":"be6b9024-516f-484d-94ff-391bf79246bf","Type":"ContainerDied","Data":"891e47aad9dc99c16ff5048519990252c678f6ef510adb6cc13cbbca6ed4996e"}
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.054721 4973 generic.go:334] "Generic (PLEG): container finished" podID="0850cf2e-8755-4161-97ea-a1507ef8f2fb" containerID="ce47bf39326f73808f09bc041516cf0ea5cdc99e2bf914c2642e2a3f56df6cd1" exitCode=0
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.054794 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-rwnmf" event={"ID":"0850cf2e-8755-4161-97ea-a1507ef8f2fb","Type":"ContainerDied","Data":"ce47bf39326f73808f09bc041516cf0ea5cdc99e2bf914c2642e2a3f56df6cd1"}
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.056674 4973 generic.go:334] "Generic (PLEG): container finished" podID="1286f2f0-7c61-4ba2-b2a6-3ec09e602af9" containerID="c1f1c397ae71d75e71535afe1687540c2e9c8ae5090a077a9a71530258b81724" exitCode=0
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.056740 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bhvvf" event={"ID":"1286f2f0-7c61-4ba2-b2a6-3ec09e602af9","Type":"ContainerDied","Data":"c1f1c397ae71d75e71535afe1687540c2e9c8ae5090a077a9a71530258b81724"}
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.078630 4973 generic.go:334] "Generic (PLEG): container finished" podID="024e6737-2a16-43e9-99f4-62d2b39df77b" containerID="d2e5fe46f3780bfdf625f4f386b47b8f5ae67dd453e25cdd4cb37a9b57cf00b4" exitCode=0
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.078760 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8fxhw" event={"ID":"024e6737-2a16-43e9-99f4-62d2b39df77b","Type":"ContainerDied","Data":"d2e5fe46f3780bfdf625f4f386b47b8f5ae67dd453e25cdd4cb37a9b57cf00b4"}
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.092842 4973 generic.go:334] "Generic (PLEG): container finished" podID="a26bab7b-118f-46a6-8809-8f7b8b7a83d4" containerID="d7e7456f83f9f22b10988cfe4a6c2c043104d4c1a7ce31cb7f90947f74e32844" exitCode=0
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.092994 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-0791-account-create-update-rz2hx" event={"ID":"a26bab7b-118f-46a6-8809-8f7b8b7a83d4","Type":"ContainerDied","Data":"d7e7456f83f9f22b10988cfe4a6c2c043104d4c1a7ce31cb7f90947f74e32844"}
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.103071 4973 generic.go:334] "Generic (PLEG): container finished" podID="2d0fe71e-2a81-41d4-86d6-dd4345597c6e" containerID="4f344edff595c94c1d1e635246cd7485e152fc7b7aeabbf8c6b77c50458e3271" exitCode=0
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.103161 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a222-account-create-update-bsrjg" event={"ID":"2d0fe71e-2a81-41d4-86d6-dd4345597c6e","Type":"ContainerDied","Data":"4f344edff595c94c1d1e635246cd7485e152fc7b7aeabbf8c6b77c50458e3271"}
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.109155 4973 generic.go:334] "Generic (PLEG): container finished" podID="b1f86541-a1d0-4a26-80c2-be736fa62eb9" containerID="f874485f813e1778c4784fe82ad905aac8cafec2fd4161b9d2d30540d5bf6d1c" exitCode=0
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.109382 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2k8xz" event={"ID":"b1f86541-a1d0-4a26-80c2-be736fa62eb9","Type":"ContainerDied","Data":"f874485f813e1778c4784fe82ad905aac8cafec2fd4161b9d2d30540d5bf6d1c"}
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.121624 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b51c1ea9-b42f-47a5-8f74-164a29b2d036","Type":"ContainerStarted","Data":"99876c4ae27b4aa2457aa62d70447bc040cca5daadc2e852f2e0729bddf7302c"}
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.124043 4973 generic.go:334] "Generic (PLEG): container finished" podID="0bb79520-00fa-4626-9fef-c7250fccd210" containerID="eb741610b3702bfcf6600e5e5bb46025d3e9cdefd0ac87dee15660383f37d5de" exitCode=0
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.124283 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wdm6w" event={"ID":"0bb79520-00fa-4626-9fef-c7250fccd210","Type":"ContainerDied","Data":"eb741610b3702bfcf6600e5e5bb46025d3e9cdefd0ac87dee15660383f37d5de"}
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.536766 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-bhvvf"
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.637154 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1286f2f0-7c61-4ba2-b2a6-3ec09e602af9-operator-scripts\") pod \"1286f2f0-7c61-4ba2-b2a6-3ec09e602af9\" (UID: \"1286f2f0-7c61-4ba2-b2a6-3ec09e602af9\") "
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.637373 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqdmg\" (UniqueName: \"kubernetes.io/projected/1286f2f0-7c61-4ba2-b2a6-3ec09e602af9-kube-api-access-bqdmg\") pod \"1286f2f0-7c61-4ba2-b2a6-3ec09e602af9\" (UID: \"1286f2f0-7c61-4ba2-b2a6-3ec09e602af9\") "
Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.638154 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1286f2f0-7c61-4ba2-b2a6-3ec09e602af9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1286f2f0-7c61-4ba2-b2a6-3ec09e602af9" (UID: "1286f2f0-7c61-4ba2-b2a6-3ec09e602af9"). InnerVolumeSpecName "operator-scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.647733 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1286f2f0-7c61-4ba2-b2a6-3ec09e602af9-kube-api-access-bqdmg" (OuterVolumeSpecName: "kube-api-access-bqdmg") pod "1286f2f0-7c61-4ba2-b2a6-3ec09e602af9" (UID: "1286f2f0-7c61-4ba2-b2a6-3ec09e602af9"). InnerVolumeSpecName "kube-api-access-bqdmg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.740933 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1286f2f0-7c61-4ba2-b2a6-3ec09e602af9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:51 crc kubenswrapper[4973]: I0320 13:44:51.740969 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqdmg\" (UniqueName: \"kubernetes.io/projected/1286f2f0-7c61-4ba2-b2a6-3ec09e602af9-kube-api-access-bqdmg\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:52 crc kubenswrapper[4973]: I0320 13:44:52.144511 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-bhvvf" Mar 20 13:44:52 crc kubenswrapper[4973]: I0320 13:44:52.144742 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-bhvvf" event={"ID":"1286f2f0-7c61-4ba2-b2a6-3ec09e602af9","Type":"ContainerDied","Data":"f65553944961dc872304df9a7852e9eaf49e0f4f52ffa3d9dba214d539725937"} Mar 20 13:44:52 crc kubenswrapper[4973]: I0320 13:44:52.145452 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f65553944961dc872304df9a7852e9eaf49e0f4f52ffa3d9dba214d539725937" Mar 20 13:44:53 crc kubenswrapper[4973]: I0320 13:44:53.561516 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz" Mar 20 13:44:53 crc kubenswrapper[4973]: I0320 13:44:53.624743 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-j6blp"] Mar 20 13:44:53 crc kubenswrapper[4973]: I0320 13:44:53.625050 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-698758b865-j6blp" podUID="74c96d00-cb67-4436-a620-31f29aa6e358" containerName="dnsmasq-dns" containerID="cri-o://d29b755af738150402a347d490bd2b8a5cf113faa9c2a9102a321ef3bb04c4cb" gracePeriod=10 Mar 20 13:44:54 crc kubenswrapper[4973]: I0320 13:44:54.179806 4973 generic.go:334] "Generic (PLEG): container finished" podID="74c96d00-cb67-4436-a620-31f29aa6e358" containerID="d29b755af738150402a347d490bd2b8a5cf113faa9c2a9102a321ef3bb04c4cb" exitCode=0 Mar 20 13:44:54 crc kubenswrapper[4973]: I0320 13:44:54.179921 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-j6blp" event={"ID":"74c96d00-cb67-4436-a620-31f29aa6e358","Type":"ContainerDied","Data":"d29b755af738150402a347d490bd2b8a5cf113faa9c2a9102a321ef3bb04c4cb"} Mar 20 13:44:54 crc kubenswrapper[4973]: I0320 13:44:54.901543 4973 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/dnsmasq-dns-698758b865-j6blp" podUID="74c96d00-cb67-4436-a620-31f29aa6e358" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.153:5353: connect: connection refused" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.192067 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b51c1ea9-b42f-47a5-8f74-164a29b2d036","Type":"ContainerStarted","Data":"ebc9281b2b80117f4382093b724e1f8c0c0fc13127e86d6c823e07ca459e9ac3"} Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.692808 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-rwnmf" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.754197 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-a222-account-create-update-bsrjg" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.779233 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7rxm\" (UniqueName: \"kubernetes.io/projected/0850cf2e-8755-4161-97ea-a1507ef8f2fb-kube-api-access-v7rxm\") pod \"0850cf2e-8755-4161-97ea-a1507ef8f2fb\" (UID: \"0850cf2e-8755-4161-97ea-a1507ef8f2fb\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.779563 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0850cf2e-8755-4161-97ea-a1507ef8f2fb-operator-scripts\") pod \"0850cf2e-8755-4161-97ea-a1507ef8f2fb\" (UID: \"0850cf2e-8755-4161-97ea-a1507ef8f2fb\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.781625 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0850cf2e-8755-4161-97ea-a1507ef8f2fb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0850cf2e-8755-4161-97ea-a1507ef8f2fb" (UID: "0850cf2e-8755-4161-97ea-a1507ef8f2fb"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.784094 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-8fxhw" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.787269 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0850cf2e-8755-4161-97ea-a1507ef8f2fb-kube-api-access-v7rxm" (OuterVolumeSpecName: "kube-api-access-v7rxm") pod "0850cf2e-8755-4161-97ea-a1507ef8f2fb" (UID: "0850cf2e-8755-4161-97ea-a1507ef8f2fb"). InnerVolumeSpecName "kube-api-access-v7rxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.853434 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1972-account-create-update-l6mqj" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.861950 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c2db-account-create-update-4m67k" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.876186 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-wdm6w" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.881542 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d0fe71e-2a81-41d4-86d6-dd4345597c6e-operator-scripts\") pod \"2d0fe71e-2a81-41d4-86d6-dd4345597c6e\" (UID: \"2d0fe71e-2a81-41d4-86d6-dd4345597c6e\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.881851 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/024e6737-2a16-43e9-99f4-62d2b39df77b-operator-scripts\") pod \"024e6737-2a16-43e9-99f4-62d2b39df77b\" (UID: \"024e6737-2a16-43e9-99f4-62d2b39df77b\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.881959 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zx9t\" (UniqueName: \"kubernetes.io/projected/024e6737-2a16-43e9-99f4-62d2b39df77b-kube-api-access-6zx9t\") pod \"024e6737-2a16-43e9-99f4-62d2b39df77b\" (UID: \"024e6737-2a16-43e9-99f4-62d2b39df77b\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.882226 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkfvj\" (UniqueName: \"kubernetes.io/projected/2d0fe71e-2a81-41d4-86d6-dd4345597c6e-kube-api-access-nkfvj\") pod \"2d0fe71e-2a81-41d4-86d6-dd4345597c6e\" (UID: \"2d0fe71e-2a81-41d4-86d6-dd4345597c6e\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.883308 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0850cf2e-8755-4161-97ea-a1507ef8f2fb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.883327 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7rxm\" (UniqueName: 
\"kubernetes.io/projected/0850cf2e-8755-4161-97ea-a1507ef8f2fb-kube-api-access-v7rxm\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.883822 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d0fe71e-2a81-41d4-86d6-dd4345597c6e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2d0fe71e-2a81-41d4-86d6-dd4345597c6e" (UID: "2d0fe71e-2a81-41d4-86d6-dd4345597c6e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.886692 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/024e6737-2a16-43e9-99f4-62d2b39df77b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "024e6737-2a16-43e9-99f4-62d2b39df77b" (UID: "024e6737-2a16-43e9-99f4-62d2b39df77b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.889755 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d0fe71e-2a81-41d4-86d6-dd4345597c6e-kube-api-access-nkfvj" (OuterVolumeSpecName: "kube-api-access-nkfvj") pod "2d0fe71e-2a81-41d4-86d6-dd4345597c6e" (UID: "2d0fe71e-2a81-41d4-86d6-dd4345597c6e"). InnerVolumeSpecName "kube-api-access-nkfvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.890365 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/024e6737-2a16-43e9-99f4-62d2b39df77b-kube-api-access-6zx9t" (OuterVolumeSpecName: "kube-api-access-6zx9t") pod "024e6737-2a16-43e9-99f4-62d2b39df77b" (UID: "024e6737-2a16-43e9-99f4-62d2b39df77b"). InnerVolumeSpecName "kube-api-access-6zx9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.901004 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2k8xz" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.906571 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-0791-account-create-update-rz2hx" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.912911 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.985071 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-dns-svc\") pod \"74c96d00-cb67-4436-a620-31f29aa6e358\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.985127 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzngg\" (UniqueName: \"kubernetes.io/projected/a26bab7b-118f-46a6-8809-8f7b8b7a83d4-kube-api-access-xzngg\") pod \"a26bab7b-118f-46a6-8809-8f7b8b7a83d4\" (UID: \"a26bab7b-118f-46a6-8809-8f7b8b7a83d4\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.985168 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be6b9024-516f-484d-94ff-391bf79246bf-operator-scripts\") pod \"be6b9024-516f-484d-94ff-391bf79246bf\" (UID: \"be6b9024-516f-484d-94ff-391bf79246bf\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.985228 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a14f07ce-f8ea-4af0-bf01-9ad6b1af917a-operator-scripts\") pod 
\"a14f07ce-f8ea-4af0-bf01-9ad6b1af917a\" (UID: \"a14f07ce-f8ea-4af0-bf01-9ad6b1af917a\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.985256 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bb79520-00fa-4626-9fef-c7250fccd210-operator-scripts\") pod \"0bb79520-00fa-4626-9fef-c7250fccd210\" (UID: \"0bb79520-00fa-4626-9fef-c7250fccd210\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.985366 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1f86541-a1d0-4a26-80c2-be736fa62eb9-operator-scripts\") pod \"b1f86541-a1d0-4a26-80c2-be736fa62eb9\" (UID: \"b1f86541-a1d0-4a26-80c2-be736fa62eb9\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.985457 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-844tx\" (UniqueName: \"kubernetes.io/projected/a14f07ce-f8ea-4af0-bf01-9ad6b1af917a-kube-api-access-844tx\") pod \"a14f07ce-f8ea-4af0-bf01-9ad6b1af917a\" (UID: \"a14f07ce-f8ea-4af0-bf01-9ad6b1af917a\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.985510 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79l52\" (UniqueName: \"kubernetes.io/projected/be6b9024-516f-484d-94ff-391bf79246bf-kube-api-access-79l52\") pod \"be6b9024-516f-484d-94ff-391bf79246bf\" (UID: \"be6b9024-516f-484d-94ff-391bf79246bf\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.985587 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bdvnj\" (UniqueName: \"kubernetes.io/projected/b1f86541-a1d0-4a26-80c2-be736fa62eb9-kube-api-access-bdvnj\") pod \"b1f86541-a1d0-4a26-80c2-be736fa62eb9\" (UID: \"b1f86541-a1d0-4a26-80c2-be736fa62eb9\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.985637 4973 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-config\") pod \"74c96d00-cb67-4436-a620-31f29aa6e358\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.985706 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dkzh\" (UniqueName: \"kubernetes.io/projected/74c96d00-cb67-4436-a620-31f29aa6e358-kube-api-access-6dkzh\") pod \"74c96d00-cb67-4436-a620-31f29aa6e358\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.985730 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-ovsdbserver-sb\") pod \"74c96d00-cb67-4436-a620-31f29aa6e358\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.985784 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gzdq\" (UniqueName: \"kubernetes.io/projected/0bb79520-00fa-4626-9fef-c7250fccd210-kube-api-access-2gzdq\") pod \"0bb79520-00fa-4626-9fef-c7250fccd210\" (UID: \"0bb79520-00fa-4626-9fef-c7250fccd210\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.985801 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-ovsdbserver-nb\") pod \"74c96d00-cb67-4436-a620-31f29aa6e358\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.985826 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a26bab7b-118f-46a6-8809-8f7b8b7a83d4-operator-scripts\") pod 
\"a26bab7b-118f-46a6-8809-8f7b8b7a83d4\" (UID: \"a26bab7b-118f-46a6-8809-8f7b8b7a83d4\") " Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.986464 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkfvj\" (UniqueName: \"kubernetes.io/projected/2d0fe71e-2a81-41d4-86d6-dd4345597c6e-kube-api-access-nkfvj\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.986481 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2d0fe71e-2a81-41d4-86d6-dd4345597c6e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.986491 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/024e6737-2a16-43e9-99f4-62d2b39df77b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.986501 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zx9t\" (UniqueName: \"kubernetes.io/projected/024e6737-2a16-43e9-99f4-62d2b39df77b-kube-api-access-6zx9t\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.987011 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a26bab7b-118f-46a6-8809-8f7b8b7a83d4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a26bab7b-118f-46a6-8809-8f7b8b7a83d4" (UID: "a26bab7b-118f-46a6-8809-8f7b8b7a83d4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.987988 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be6b9024-516f-484d-94ff-391bf79246bf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be6b9024-516f-484d-94ff-391bf79246bf" (UID: "be6b9024-516f-484d-94ff-391bf79246bf"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.988391 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a14f07ce-f8ea-4af0-bf01-9ad6b1af917a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a14f07ce-f8ea-4af0-bf01-9ad6b1af917a" (UID: "a14f07ce-f8ea-4af0-bf01-9ad6b1af917a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.988718 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1f86541-a1d0-4a26-80c2-be736fa62eb9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b1f86541-a1d0-4a26-80c2-be736fa62eb9" (UID: "b1f86541-a1d0-4a26-80c2-be736fa62eb9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.990673 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bb79520-00fa-4626-9fef-c7250fccd210-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0bb79520-00fa-4626-9fef-c7250fccd210" (UID: "0bb79520-00fa-4626-9fef-c7250fccd210"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:55 crc kubenswrapper[4973]: I0320 13:44:55.991249 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a26bab7b-118f-46a6-8809-8f7b8b7a83d4-kube-api-access-xzngg" (OuterVolumeSpecName: "kube-api-access-xzngg") pod "a26bab7b-118f-46a6-8809-8f7b8b7a83d4" (UID: "a26bab7b-118f-46a6-8809-8f7b8b7a83d4"). InnerVolumeSpecName "kube-api-access-xzngg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:55.999959 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1f86541-a1d0-4a26-80c2-be736fa62eb9-kube-api-access-bdvnj" (OuterVolumeSpecName: "kube-api-access-bdvnj") pod "b1f86541-a1d0-4a26-80c2-be736fa62eb9" (UID: "b1f86541-a1d0-4a26-80c2-be736fa62eb9"). InnerVolumeSpecName "kube-api-access-bdvnj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.015978 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be6b9024-516f-484d-94ff-391bf79246bf-kube-api-access-79l52" (OuterVolumeSpecName: "kube-api-access-79l52") pod "be6b9024-516f-484d-94ff-391bf79246bf" (UID: "be6b9024-516f-484d-94ff-391bf79246bf"). InnerVolumeSpecName "kube-api-access-79l52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.028533 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/74c96d00-cb67-4436-a620-31f29aa6e358-kube-api-access-6dkzh" (OuterVolumeSpecName: "kube-api-access-6dkzh") pod "74c96d00-cb67-4436-a620-31f29aa6e358" (UID: "74c96d00-cb67-4436-a620-31f29aa6e358"). InnerVolumeSpecName "kube-api-access-6dkzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.028656 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a14f07ce-f8ea-4af0-bf01-9ad6b1af917a-kube-api-access-844tx" (OuterVolumeSpecName: "kube-api-access-844tx") pod "a14f07ce-f8ea-4af0-bf01-9ad6b1af917a" (UID: "a14f07ce-f8ea-4af0-bf01-9ad6b1af917a"). InnerVolumeSpecName "kube-api-access-844tx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.028886 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bb79520-00fa-4626-9fef-c7250fccd210-kube-api-access-2gzdq" (OuterVolumeSpecName: "kube-api-access-2gzdq") pod "0bb79520-00fa-4626-9fef-c7250fccd210" (UID: "0bb79520-00fa-4626-9fef-c7250fccd210"). InnerVolumeSpecName "kube-api-access-2gzdq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.045725 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-config" (OuterVolumeSpecName: "config") pod "74c96d00-cb67-4436-a620-31f29aa6e358" (UID: "74c96d00-cb67-4436-a620-31f29aa6e358"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.048067 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "74c96d00-cb67-4436-a620-31f29aa6e358" (UID: "74c96d00-cb67-4436-a620-31f29aa6e358"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.077257 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "74c96d00-cb67-4436-a620-31f29aa6e358" (UID: "74c96d00-cb67-4436-a620-31f29aa6e358"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.086877 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "74c96d00-cb67-4436-a620-31f29aa6e358" (UID: "74c96d00-cb67-4436-a620-31f29aa6e358"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.088132 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-ovsdbserver-sb\") pod \"74c96d00-cb67-4436-a620-31f29aa6e358\" (UID: \"74c96d00-cb67-4436-a620-31f29aa6e358\") " Mar 20 13:44:56 crc kubenswrapper[4973]: W0320 13:44:56.088297 4973 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/74c96d00-cb67-4436-a620-31f29aa6e358/volumes/kubernetes.io~configmap/ovsdbserver-sb Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.088318 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "74c96d00-cb67-4436-a620-31f29aa6e358" (UID: "74c96d00-cb67-4436-a620-31f29aa6e358"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.089190 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b1f86541-a1d0-4a26-80c2-be736fa62eb9-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.089290 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-844tx\" (UniqueName: \"kubernetes.io/projected/a14f07ce-f8ea-4af0-bf01-9ad6b1af917a-kube-api-access-844tx\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.089392 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79l52\" (UniqueName: \"kubernetes.io/projected/be6b9024-516f-484d-94ff-391bf79246bf-kube-api-access-79l52\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.089484 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bdvnj\" (UniqueName: \"kubernetes.io/projected/b1f86541-a1d0-4a26-80c2-be736fa62eb9-kube-api-access-bdvnj\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.089545 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.089599 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dkzh\" (UniqueName: \"kubernetes.io/projected/74c96d00-cb67-4436-a620-31f29aa6e358-kube-api-access-6dkzh\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.089658 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:56 crc 
kubenswrapper[4973]: I0320 13:44:56.089712 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gzdq\" (UniqueName: \"kubernetes.io/projected/0bb79520-00fa-4626-9fef-c7250fccd210-kube-api-access-2gzdq\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.089838 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.089931 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a26bab7b-118f-46a6-8809-8f7b8b7a83d4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.089988 4973 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/74c96d00-cb67-4436-a620-31f29aa6e358-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.090049 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzngg\" (UniqueName: \"kubernetes.io/projected/a26bab7b-118f-46a6-8809-8f7b8b7a83d4-kube-api-access-xzngg\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.090103 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be6b9024-516f-484d-94ff-391bf79246bf-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.090161 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a14f07ce-f8ea-4af0-bf01-9ad6b1af917a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.090236 4973 reconciler_common.go:293] "Volume detached for 
volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0bb79520-00fa-4626-9fef-c7250fccd210-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.202931 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-2k8xz" event={"ID":"b1f86541-a1d0-4a26-80c2-be736fa62eb9","Type":"ContainerDied","Data":"da9f75f57b3dffff6d027d4fc68cc4a50da45fc10e646b9e4b294c98255ab131"} Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.202977 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da9f75f57b3dffff6d027d4fc68cc4a50da45fc10e646b9e4b294c98255ab131" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.203241 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-2k8xz" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.204599 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-rwnmf" event={"ID":"0850cf2e-8755-4161-97ea-a1507ef8f2fb","Type":"ContainerDied","Data":"f0b4bd1c2a135a081761b9b96f880b44a989565991da24d72b6d8feaa956b285"} Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.204660 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0b4bd1c2a135a081761b9b96f880b44a989565991da24d72b6d8feaa956b285" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.204747 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-rwnmf" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.206416 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-0791-account-create-update-rz2hx" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.206573 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-0791-account-create-update-rz2hx" event={"ID":"a26bab7b-118f-46a6-8809-8f7b8b7a83d4","Type":"ContainerDied","Data":"bfa659cacc1611128a977273aaf5b0d61ef472a2596d5bc48ae7c8feea775f99"} Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.206613 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bfa659cacc1611128a977273aaf5b0d61ef472a2596d5bc48ae7c8feea775f99" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.208602 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-c2db-account-create-update-4m67k" event={"ID":"a14f07ce-f8ea-4af0-bf01-9ad6b1af917a","Type":"ContainerDied","Data":"0d9c3e78726e87a448ac751c85dfd1187d99af0cb9be659ae6ebb70c21c0ce48"} Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.208681 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-c2db-account-create-update-4m67k" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.208699 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d9c3e78726e87a448ac751c85dfd1187d99af0cb9be659ae6ebb70c21c0ce48" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.210435 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-a222-account-create-update-bsrjg" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.210471 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-a222-account-create-update-bsrjg" event={"ID":"2d0fe71e-2a81-41d4-86d6-dd4345597c6e","Type":"ContainerDied","Data":"e325c6d2c2fa62aa194dc8fde5154fb7dde26f4b7943fbea14bbe0a3f210eb2c"} Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.210534 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e325c6d2c2fa62aa194dc8fde5154fb7dde26f4b7943fbea14bbe0a3f210eb2c" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.214319 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-n2wvb" event={"ID":"0f86183c-8d71-4d52-860b-9579ba761393","Type":"ContainerStarted","Data":"65371447e17e8ee321413c1c298e0361c8d1fff86683a74c4b2db804417c3dec"} Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.216187 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-wdm6w" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.216467 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-wdm6w" event={"ID":"0bb79520-00fa-4626-9fef-c7250fccd210","Type":"ContainerDied","Data":"f2c6a5b5eb747c059098e13e81f0eb4a93aa737e4a72b2d2ed66852cabb84ef3"} Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.216488 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2c6a5b5eb747c059098e13e81f0eb4a93aa737e4a72b2d2ed66852cabb84ef3" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.218146 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-8fxhw" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.218168 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-8fxhw" event={"ID":"024e6737-2a16-43e9-99f4-62d2b39df77b","Type":"ContainerDied","Data":"379fe10d7c234c96460f2dd12640f1fb51394b12d165dd303537232eca18b9fe"} Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.218205 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="379fe10d7c234c96460f2dd12640f1fb51394b12d165dd303537232eca18b9fe" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.219968 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-1972-account-create-update-l6mqj" event={"ID":"be6b9024-516f-484d-94ff-391bf79246bf","Type":"ContainerDied","Data":"c16bf435dcf8cac007facfdf6121aec37fb7ae426b150d3d21f8ee1cf1a24334"} Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.220000 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c16bf435dcf8cac007facfdf6121aec37fb7ae426b150d3d21f8ee1cf1a24334" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.220049 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-1972-account-create-update-l6mqj" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.224229 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-698758b865-j6blp" event={"ID":"74c96d00-cb67-4436-a620-31f29aa6e358","Type":"ContainerDied","Data":"f253fa348636fa0b871e6f71a7d260f293a88d1749aeb7e5b8aacd0e1912e7fe"} Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.224296 4973 scope.go:117] "RemoveContainer" containerID="d29b755af738150402a347d490bd2b8a5cf113faa9c2a9102a321ef3bb04c4cb" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.224395 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-698758b865-j6blp" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.243011 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-n2wvb" podStartSLOduration=4.175391458 podStartE2EDuration="10.242991821s" podCreationTimestamp="2026-03-20 13:44:46 +0000 UTC" firstStartedPulling="2026-03-20 13:44:49.508652369 +0000 UTC m=+1410.252322113" lastFinishedPulling="2026-03-20 13:44:55.576252732 +0000 UTC m=+1416.319922476" observedRunningTime="2026-03-20 13:44:56.227529579 +0000 UTC m=+1416.971199313" watchObservedRunningTime="2026-03-20 13:44:56.242991821 +0000 UTC m=+1416.986661555" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.273683 4973 scope.go:117] "RemoveContainer" containerID="3e9afe9d9b18aa750a8743da1a3b1c54904b1d5358c06debce83cdafa90239a5" Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.291836 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-698758b865-j6blp"] Mar 20 13:44:56 crc kubenswrapper[4973]: I0320 13:44:56.304722 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-698758b865-j6blp"] Mar 20 13:44:57 crc kubenswrapper[4973]: I0320 13:44:57.963068 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="74c96d00-cb67-4436-a620-31f29aa6e358" path="/var/lib/kubelet/pods/74c96d00-cb67-4436-a620-31f29aa6e358/volumes" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.134134 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z"] Mar 20 13:45:00 crc kubenswrapper[4973]: E0320 13:45:00.135484 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be6b9024-516f-484d-94ff-391bf79246bf" containerName="mariadb-account-create-update" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.135502 4973 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="be6b9024-516f-484d-94ff-391bf79246bf" containerName="mariadb-account-create-update" Mar 20 13:45:00 crc kubenswrapper[4973]: E0320 13:45:00.135520 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bb79520-00fa-4626-9fef-c7250fccd210" containerName="mariadb-database-create" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.135533 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bb79520-00fa-4626-9fef-c7250fccd210" containerName="mariadb-database-create" Mar 20 13:45:00 crc kubenswrapper[4973]: E0320 13:45:00.135555 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a14f07ce-f8ea-4af0-bf01-9ad6b1af917a" containerName="mariadb-account-create-update" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.135563 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a14f07ce-f8ea-4af0-bf01-9ad6b1af917a" containerName="mariadb-account-create-update" Mar 20 13:45:00 crc kubenswrapper[4973]: E0320 13:45:00.135578 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="024e6737-2a16-43e9-99f4-62d2b39df77b" containerName="mariadb-database-create" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.135587 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="024e6737-2a16-43e9-99f4-62d2b39df77b" containerName="mariadb-database-create" Mar 20 13:45:00 crc kubenswrapper[4973]: E0320 13:45:00.135598 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c96d00-cb67-4436-a620-31f29aa6e358" containerName="dnsmasq-dns" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.135605 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c96d00-cb67-4436-a620-31f29aa6e358" containerName="dnsmasq-dns" Mar 20 13:45:00 crc kubenswrapper[4973]: E0320 13:45:00.135616 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1286f2f0-7c61-4ba2-b2a6-3ec09e602af9" containerName="mariadb-database-create" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 
13:45:00.135623 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="1286f2f0-7c61-4ba2-b2a6-3ec09e602af9" containerName="mariadb-database-create" Mar 20 13:45:00 crc kubenswrapper[4973]: E0320 13:45:00.135636 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a26bab7b-118f-46a6-8809-8f7b8b7a83d4" containerName="mariadb-account-create-update" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.135644 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a26bab7b-118f-46a6-8809-8f7b8b7a83d4" containerName="mariadb-account-create-update" Mar 20 13:45:00 crc kubenswrapper[4973]: E0320 13:45:00.135653 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1f86541-a1d0-4a26-80c2-be736fa62eb9" containerName="mariadb-account-create-update" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.135660 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1f86541-a1d0-4a26-80c2-be736fa62eb9" containerName="mariadb-account-create-update" Mar 20 13:45:00 crc kubenswrapper[4973]: E0320 13:45:00.135674 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="74c96d00-cb67-4436-a620-31f29aa6e358" containerName="init" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.135681 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="74c96d00-cb67-4436-a620-31f29aa6e358" containerName="init" Mar 20 13:45:00 crc kubenswrapper[4973]: E0320 13:45:00.135707 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0850cf2e-8755-4161-97ea-a1507ef8f2fb" containerName="mariadb-database-create" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.135718 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0850cf2e-8755-4161-97ea-a1507ef8f2fb" containerName="mariadb-database-create" Mar 20 13:45:00 crc kubenswrapper[4973]: E0320 13:45:00.135742 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d0fe71e-2a81-41d4-86d6-dd4345597c6e" 
containerName="mariadb-account-create-update" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.135751 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d0fe71e-2a81-41d4-86d6-dd4345597c6e" containerName="mariadb-account-create-update" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.135974 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="0850cf2e-8755-4161-97ea-a1507ef8f2fb" containerName="mariadb-database-create" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.135992 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="a14f07ce-f8ea-4af0-bf01-9ad6b1af917a" containerName="mariadb-account-create-update" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.136002 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="a26bab7b-118f-46a6-8809-8f7b8b7a83d4" containerName="mariadb-account-create-update" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.136012 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d0fe71e-2a81-41d4-86d6-dd4345597c6e" containerName="mariadb-account-create-update" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.136025 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1f86541-a1d0-4a26-80c2-be736fa62eb9" containerName="mariadb-account-create-update" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.136043 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="024e6737-2a16-43e9-99f4-62d2b39df77b" containerName="mariadb-database-create" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.136055 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="1286f2f0-7c61-4ba2-b2a6-3ec09e602af9" containerName="mariadb-database-create" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.136065 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="74c96d00-cb67-4436-a620-31f29aa6e358" containerName="dnsmasq-dns" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 
13:45:00.136076 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bb79520-00fa-4626-9fef-c7250fccd210" containerName="mariadb-database-create" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.136095 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="be6b9024-516f-484d-94ff-391bf79246bf" containerName="mariadb-account-create-update" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.136973 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.141048 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.141329 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.145464 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z"] Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.262249 4973 generic.go:334] "Generic (PLEG): container finished" podID="716148fc-095e-4811-8e79-53bf3e2cd53b" containerID="07e4031eb51a244b66dff2c2092044894ab5856be895615cfe8e14f94283f0d9" exitCode=0 Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.262300 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mfb4w" event={"ID":"716148fc-095e-4811-8e79-53bf3e2cd53b","Type":"ContainerDied","Data":"07e4031eb51a244b66dff2c2092044894ab5856be895615cfe8e14f94283f0d9"} Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.277924 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/9c64034f-6ded-417d-8b24-6e7f779cabca-config-volume\") pod \"collect-profiles-29566905-blc9z\" (UID: \"9c64034f-6ded-417d-8b24-6e7f779cabca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.277969 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c64034f-6ded-417d-8b24-6e7f779cabca-secret-volume\") pod \"collect-profiles-29566905-blc9z\" (UID: \"9c64034f-6ded-417d-8b24-6e7f779cabca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.278111 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsf5h\" (UniqueName: \"kubernetes.io/projected/9c64034f-6ded-417d-8b24-6e7f779cabca-kube-api-access-xsf5h\") pod \"collect-profiles-29566905-blc9z\" (UID: \"9c64034f-6ded-417d-8b24-6e7f779cabca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.380303 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c64034f-6ded-417d-8b24-6e7f779cabca-secret-volume\") pod \"collect-profiles-29566905-blc9z\" (UID: \"9c64034f-6ded-417d-8b24-6e7f779cabca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.380673 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c64034f-6ded-417d-8b24-6e7f779cabca-config-volume\") pod \"collect-profiles-29566905-blc9z\" (UID: \"9c64034f-6ded-417d-8b24-6e7f779cabca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z" Mar 20 13:45:00 crc 
kubenswrapper[4973]: I0320 13:45:00.381684 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c64034f-6ded-417d-8b24-6e7f779cabca-config-volume\") pod \"collect-profiles-29566905-blc9z\" (UID: \"9c64034f-6ded-417d-8b24-6e7f779cabca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.382132 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsf5h\" (UniqueName: \"kubernetes.io/projected/9c64034f-6ded-417d-8b24-6e7f779cabca-kube-api-access-xsf5h\") pod \"collect-profiles-29566905-blc9z\" (UID: \"9c64034f-6ded-417d-8b24-6e7f779cabca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.386693 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c64034f-6ded-417d-8b24-6e7f779cabca-secret-volume\") pod \"collect-profiles-29566905-blc9z\" (UID: \"9c64034f-6ded-417d-8b24-6e7f779cabca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.397682 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsf5h\" (UniqueName: \"kubernetes.io/projected/9c64034f-6ded-417d-8b24-6e7f779cabca-kube-api-access-xsf5h\") pod \"collect-profiles-29566905-blc9z\" (UID: \"9c64034f-6ded-417d-8b24-6e7f779cabca\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z" Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.468215 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z" Mar 20 13:45:00 crc kubenswrapper[4973]: W0320 13:45:00.936647 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c64034f_6ded_417d_8b24_6e7f779cabca.slice/crio-cc62bfd9598cd68265951dca20dd001bcc5c0538e4fe5112234f82733e3ed58b WatchSource:0}: Error finding container cc62bfd9598cd68265951dca20dd001bcc5c0538e4fe5112234f82733e3ed58b: Status 404 returned error can't find the container with id cc62bfd9598cd68265951dca20dd001bcc5c0538e4fe5112234f82733e3ed58b Mar 20 13:45:00 crc kubenswrapper[4973]: I0320 13:45:00.939262 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z"] Mar 20 13:45:01 crc kubenswrapper[4973]: I0320 13:45:01.277416 4973 generic.go:334] "Generic (PLEG): container finished" podID="b51c1ea9-b42f-47a5-8f74-164a29b2d036" containerID="ebc9281b2b80117f4382093b724e1f8c0c0fc13127e86d6c823e07ca459e9ac3" exitCode=0 Mar 20 13:45:01 crc kubenswrapper[4973]: I0320 13:45:01.277508 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b51c1ea9-b42f-47a5-8f74-164a29b2d036","Type":"ContainerDied","Data":"ebc9281b2b80117f4382093b724e1f8c0c0fc13127e86d6c823e07ca459e9ac3"} Mar 20 13:45:01 crc kubenswrapper[4973]: I0320 13:45:01.282380 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z" event={"ID":"9c64034f-6ded-417d-8b24-6e7f779cabca","Type":"ContainerStarted","Data":"214278786dd60cfbce2505c105d7fbd9130f972bb06bc4b9c2de5e8cec410167"} Mar 20 13:45:01 crc kubenswrapper[4973]: I0320 13:45:01.282478 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z" 
event={"ID":"9c64034f-6ded-417d-8b24-6e7f779cabca","Type":"ContainerStarted","Data":"cc62bfd9598cd68265951dca20dd001bcc5c0538e4fe5112234f82733e3ed58b"} Mar 20 13:45:01 crc kubenswrapper[4973]: I0320 13:45:01.905864 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-mfb4w" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.018742 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716148fc-095e-4811-8e79-53bf3e2cd53b-config-data\") pod \"716148fc-095e-4811-8e79-53bf3e2cd53b\" (UID: \"716148fc-095e-4811-8e79-53bf3e2cd53b\") " Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.018788 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grvhn\" (UniqueName: \"kubernetes.io/projected/716148fc-095e-4811-8e79-53bf3e2cd53b-kube-api-access-grvhn\") pod \"716148fc-095e-4811-8e79-53bf3e2cd53b\" (UID: \"716148fc-095e-4811-8e79-53bf3e2cd53b\") " Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.018854 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716148fc-095e-4811-8e79-53bf3e2cd53b-combined-ca-bundle\") pod \"716148fc-095e-4811-8e79-53bf3e2cd53b\" (UID: \"716148fc-095e-4811-8e79-53bf3e2cd53b\") " Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.018936 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/716148fc-095e-4811-8e79-53bf3e2cd53b-db-sync-config-data\") pod \"716148fc-095e-4811-8e79-53bf3e2cd53b\" (UID: \"716148fc-095e-4811-8e79-53bf3e2cd53b\") " Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.023126 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/716148fc-095e-4811-8e79-53bf3e2cd53b-db-sync-config-data" 
(OuterVolumeSpecName: "db-sync-config-data") pod "716148fc-095e-4811-8e79-53bf3e2cd53b" (UID: "716148fc-095e-4811-8e79-53bf3e2cd53b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.023193 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/716148fc-095e-4811-8e79-53bf3e2cd53b-kube-api-access-grvhn" (OuterVolumeSpecName: "kube-api-access-grvhn") pod "716148fc-095e-4811-8e79-53bf3e2cd53b" (UID: "716148fc-095e-4811-8e79-53bf3e2cd53b"). InnerVolumeSpecName "kube-api-access-grvhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.052644 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/716148fc-095e-4811-8e79-53bf3e2cd53b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "716148fc-095e-4811-8e79-53bf3e2cd53b" (UID: "716148fc-095e-4811-8e79-53bf3e2cd53b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.075993 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/716148fc-095e-4811-8e79-53bf3e2cd53b-config-data" (OuterVolumeSpecName: "config-data") pod "716148fc-095e-4811-8e79-53bf3e2cd53b" (UID: "716148fc-095e-4811-8e79-53bf3e2cd53b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.123014 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/716148fc-095e-4811-8e79-53bf3e2cd53b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.123062 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grvhn\" (UniqueName: \"kubernetes.io/projected/716148fc-095e-4811-8e79-53bf3e2cd53b-kube-api-access-grvhn\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.123076 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/716148fc-095e-4811-8e79-53bf3e2cd53b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.123087 4973 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/716148fc-095e-4811-8e79-53bf3e2cd53b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.295778 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-mfb4w" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.295815 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-mfb4w" event={"ID":"716148fc-095e-4811-8e79-53bf3e2cd53b","Type":"ContainerDied","Data":"68d4095d38164bc2c6126c49b9dd033002967f8fab653f013e394f2d1a239f1c"} Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.295862 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68d4095d38164bc2c6126c49b9dd033002967f8fab653f013e394f2d1a239f1c" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.299372 4973 generic.go:334] "Generic (PLEG): container finished" podID="9c64034f-6ded-417d-8b24-6e7f779cabca" containerID="214278786dd60cfbce2505c105d7fbd9130f972bb06bc4b9c2de5e8cec410167" exitCode=0 Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.299474 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z" event={"ID":"9c64034f-6ded-417d-8b24-6e7f779cabca","Type":"ContainerDied","Data":"214278786dd60cfbce2505c105d7fbd9130f972bb06bc4b9c2de5e8cec410167"} Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.301795 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b51c1ea9-b42f-47a5-8f74-164a29b2d036","Type":"ContainerStarted","Data":"bfa76964ee4103e5ebd143cb3c4dd42c6a8a7bbcf36c864d950ab5fb7d8234b4"} Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.712958 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-xc4vf"] Mar 20 13:45:02 crc kubenswrapper[4973]: E0320 13:45:02.713517 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="716148fc-095e-4811-8e79-53bf3e2cd53b" containerName="glance-db-sync" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.713535 4973 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="716148fc-095e-4811-8e79-53bf3e2cd53b" containerName="glance-db-sync" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.713766 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="716148fc-095e-4811-8e79-53bf3e2cd53b" containerName="glance-db-sync" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.714891 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.725578 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-xc4vf"] Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.850892 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-xc4vf\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") " pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.850944 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-config\") pod \"dnsmasq-dns-7ff5475cc9-xc4vf\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") " pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.851002 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp54h\" (UniqueName: \"kubernetes.io/projected/4acc56cd-01f7-4a24-bfba-7d7a52396699-kube-api-access-tp54h\") pod \"dnsmasq-dns-7ff5475cc9-xc4vf\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") " pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.851028 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-xc4vf\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") " pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.851172 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-xc4vf\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") " pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.851392 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-xc4vf\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") " pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.952828 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-xc4vf\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") " pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.952971 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-xc4vf\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") " pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.953034 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-config\") pod \"dnsmasq-dns-7ff5475cc9-xc4vf\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") " pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.953096 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp54h\" (UniqueName: \"kubernetes.io/projected/4acc56cd-01f7-4a24-bfba-7d7a52396699-kube-api-access-tp54h\") pod \"dnsmasq-dns-7ff5475cc9-xc4vf\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") " pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.953121 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-xc4vf\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") " pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.953151 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-xc4vf\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") " pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.954121 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-ovsdbserver-nb\") pod \"dnsmasq-dns-7ff5475cc9-xc4vf\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") " pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.954175 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-ovsdbserver-sb\") pod \"dnsmasq-dns-7ff5475cc9-xc4vf\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") " pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.954242 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-dns-swift-storage-0\") pod \"dnsmasq-dns-7ff5475cc9-xc4vf\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") " pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.954549 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-config\") pod \"dnsmasq-dns-7ff5475cc9-xc4vf\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") " pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.954874 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-dns-svc\") pod \"dnsmasq-dns-7ff5475cc9-xc4vf\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") " pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:02 crc kubenswrapper[4973]: I0320 13:45:02.970850 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp54h\" (UniqueName: \"kubernetes.io/projected/4acc56cd-01f7-4a24-bfba-7d7a52396699-kube-api-access-tp54h\") pod \"dnsmasq-dns-7ff5475cc9-xc4vf\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") " pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:03 crc kubenswrapper[4973]: I0320 13:45:03.038318 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:03 crc kubenswrapper[4973]: I0320 13:45:03.564588 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-xc4vf"] Mar 20 13:45:03 crc kubenswrapper[4973]: I0320 13:45:03.851053 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z" Mar 20 13:45:03 crc kubenswrapper[4973]: I0320 13:45:03.988247 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c64034f-6ded-417d-8b24-6e7f779cabca-secret-volume\") pod \"9c64034f-6ded-417d-8b24-6e7f779cabca\" (UID: \"9c64034f-6ded-417d-8b24-6e7f779cabca\") " Mar 20 13:45:03 crc kubenswrapper[4973]: I0320 13:45:03.988298 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xsf5h\" (UniqueName: \"kubernetes.io/projected/9c64034f-6ded-417d-8b24-6e7f779cabca-kube-api-access-xsf5h\") pod \"9c64034f-6ded-417d-8b24-6e7f779cabca\" (UID: \"9c64034f-6ded-417d-8b24-6e7f779cabca\") " Mar 20 13:45:03 crc kubenswrapper[4973]: I0320 13:45:03.989489 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c64034f-6ded-417d-8b24-6e7f779cabca-config-volume\") pod \"9c64034f-6ded-417d-8b24-6e7f779cabca\" (UID: \"9c64034f-6ded-417d-8b24-6e7f779cabca\") " Mar 20 13:45:03 crc kubenswrapper[4973]: I0320 13:45:03.991252 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c64034f-6ded-417d-8b24-6e7f779cabca-config-volume" (OuterVolumeSpecName: "config-volume") pod "9c64034f-6ded-417d-8b24-6e7f779cabca" (UID: "9c64034f-6ded-417d-8b24-6e7f779cabca"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:04 crc kubenswrapper[4973]: I0320 13:45:04.000314 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9c64034f-6ded-417d-8b24-6e7f779cabca-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9c64034f-6ded-417d-8b24-6e7f779cabca" (UID: "9c64034f-6ded-417d-8b24-6e7f779cabca"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:04 crc kubenswrapper[4973]: I0320 13:45:04.083029 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c64034f-6ded-417d-8b24-6e7f779cabca-kube-api-access-xsf5h" (OuterVolumeSpecName: "kube-api-access-xsf5h") pod "9c64034f-6ded-417d-8b24-6e7f779cabca" (UID: "9c64034f-6ded-417d-8b24-6e7f779cabca"). InnerVolumeSpecName "kube-api-access-xsf5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:04 crc kubenswrapper[4973]: I0320 13:45:04.093445 4973 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9c64034f-6ded-417d-8b24-6e7f779cabca-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:04 crc kubenswrapper[4973]: I0320 13:45:04.093498 4973 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9c64034f-6ded-417d-8b24-6e7f779cabca-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:04 crc kubenswrapper[4973]: I0320 13:45:04.093510 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xsf5h\" (UniqueName: \"kubernetes.io/projected/9c64034f-6ded-417d-8b24-6e7f779cabca-kube-api-access-xsf5h\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:04 crc kubenswrapper[4973]: I0320 13:45:04.326943 4973 generic.go:334] "Generic (PLEG): container finished" podID="4acc56cd-01f7-4a24-bfba-7d7a52396699" 
containerID="604a9792b07ffe1149da89915adb5fa17ea784d431d826a0d28c8831564b2320" exitCode=0 Mar 20 13:45:04 crc kubenswrapper[4973]: I0320 13:45:04.327011 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" event={"ID":"4acc56cd-01f7-4a24-bfba-7d7a52396699","Type":"ContainerDied","Data":"604a9792b07ffe1149da89915adb5fa17ea784d431d826a0d28c8831564b2320"} Mar 20 13:45:04 crc kubenswrapper[4973]: I0320 13:45:04.327300 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" event={"ID":"4acc56cd-01f7-4a24-bfba-7d7a52396699","Type":"ContainerStarted","Data":"6ce2ac951e21f2b9c9985b23e16690f41b03decd9c42a194c80fa48b610e3007"} Mar 20 13:45:04 crc kubenswrapper[4973]: I0320 13:45:04.330253 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z" event={"ID":"9c64034f-6ded-417d-8b24-6e7f779cabca","Type":"ContainerDied","Data":"cc62bfd9598cd68265951dca20dd001bcc5c0538e4fe5112234f82733e3ed58b"} Mar 20 13:45:04 crc kubenswrapper[4973]: I0320 13:45:04.330286 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc62bfd9598cd68265951dca20dd001bcc5c0538e4fe5112234f82733e3ed58b" Mar 20 13:45:04 crc kubenswrapper[4973]: I0320 13:45:04.330387 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z" Mar 20 13:45:05 crc kubenswrapper[4973]: I0320 13:45:05.346017 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" event={"ID":"4acc56cd-01f7-4a24-bfba-7d7a52396699","Type":"ContainerStarted","Data":"73ca2c4107abeb6bb6a53097b736eca760f316eb29dac86c7688d3004dd1d9bd"} Mar 20 13:45:05 crc kubenswrapper[4973]: I0320 13:45:05.346851 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:05 crc kubenswrapper[4973]: I0320 13:45:05.350538 4973 generic.go:334] "Generic (PLEG): container finished" podID="0f86183c-8d71-4d52-860b-9579ba761393" containerID="65371447e17e8ee321413c1c298e0361c8d1fff86683a74c4b2db804417c3dec" exitCode=0 Mar 20 13:45:05 crc kubenswrapper[4973]: I0320 13:45:05.350645 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-n2wvb" event={"ID":"0f86183c-8d71-4d52-860b-9579ba761393","Type":"ContainerDied","Data":"65371447e17e8ee321413c1c298e0361c8d1fff86683a74c4b2db804417c3dec"} Mar 20 13:45:05 crc kubenswrapper[4973]: I0320 13:45:05.354295 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b51c1ea9-b42f-47a5-8f74-164a29b2d036","Type":"ContainerStarted","Data":"21f0b1b30d4edad79c6f86b577f3629d66b96708588a4ef8e1dc3c0aec43abb1"} Mar 20 13:45:05 crc kubenswrapper[4973]: I0320 13:45:05.372821 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" podStartSLOduration=3.372800535 podStartE2EDuration="3.372800535s" podCreationTimestamp="2026-03-20 13:45:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:05.36195544 +0000 UTC m=+1426.105625184" watchObservedRunningTime="2026-03-20 
13:45:05.372800535 +0000 UTC m=+1426.116470279" Mar 20 13:45:06 crc kubenswrapper[4973]: I0320 13:45:06.919339 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b51c1ea9-b42f-47a5-8f74-164a29b2d036","Type":"ContainerStarted","Data":"0683591f0711b5bc5011eaba5bed53fb59e7709c08322cc75649f2ccfe1e835f"} Mar 20 13:45:06 crc kubenswrapper[4973]: I0320 13:45:06.973211 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=17.973185487 podStartE2EDuration="17.973185487s" podCreationTimestamp="2026-03-20 13:44:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:06.966707531 +0000 UTC m=+1427.710377285" watchObservedRunningTime="2026-03-20 13:45:06.973185487 +0000 UTC m=+1427.716855231" Mar 20 13:45:07 crc kubenswrapper[4973]: I0320 13:45:07.405378 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-n2wvb" Mar 20 13:45:07 crc kubenswrapper[4973]: I0320 13:45:07.541764 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f86183c-8d71-4d52-860b-9579ba761393-combined-ca-bundle\") pod \"0f86183c-8d71-4d52-860b-9579ba761393\" (UID: \"0f86183c-8d71-4d52-860b-9579ba761393\") " Mar 20 13:45:07 crc kubenswrapper[4973]: I0320 13:45:07.541945 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qm2n4\" (UniqueName: \"kubernetes.io/projected/0f86183c-8d71-4d52-860b-9579ba761393-kube-api-access-qm2n4\") pod \"0f86183c-8d71-4d52-860b-9579ba761393\" (UID: \"0f86183c-8d71-4d52-860b-9579ba761393\") " Mar 20 13:45:07 crc kubenswrapper[4973]: I0320 13:45:07.541981 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f86183c-8d71-4d52-860b-9579ba761393-config-data\") pod \"0f86183c-8d71-4d52-860b-9579ba761393\" (UID: \"0f86183c-8d71-4d52-860b-9579ba761393\") " Mar 20 13:45:07 crc kubenswrapper[4973]: I0320 13:45:07.556507 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f86183c-8d71-4d52-860b-9579ba761393-kube-api-access-qm2n4" (OuterVolumeSpecName: "kube-api-access-qm2n4") pod "0f86183c-8d71-4d52-860b-9579ba761393" (UID: "0f86183c-8d71-4d52-860b-9579ba761393"). InnerVolumeSpecName "kube-api-access-qm2n4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:07 crc kubenswrapper[4973]: I0320 13:45:07.571108 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f86183c-8d71-4d52-860b-9579ba761393-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0f86183c-8d71-4d52-860b-9579ba761393" (UID: "0f86183c-8d71-4d52-860b-9579ba761393"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:07 crc kubenswrapper[4973]: I0320 13:45:07.604439 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0f86183c-8d71-4d52-860b-9579ba761393-config-data" (OuterVolumeSpecName: "config-data") pod "0f86183c-8d71-4d52-860b-9579ba761393" (UID: "0f86183c-8d71-4d52-860b-9579ba761393"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:07 crc kubenswrapper[4973]: I0320 13:45:07.644242 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0f86183c-8d71-4d52-860b-9579ba761393-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:07 crc kubenswrapper[4973]: I0320 13:45:07.644287 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qm2n4\" (UniqueName: \"kubernetes.io/projected/0f86183c-8d71-4d52-860b-9579ba761393-kube-api-access-qm2n4\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:07 crc kubenswrapper[4973]: I0320 13:45:07.644301 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0f86183c-8d71-4d52-860b-9579ba761393-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:07 crc kubenswrapper[4973]: I0320 13:45:07.994602 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-n2wvb" Mar 20 13:45:07 crc kubenswrapper[4973]: I0320 13:45:07.995521 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-n2wvb" event={"ID":"0f86183c-8d71-4d52-860b-9579ba761393","Type":"ContainerDied","Data":"98ab39e50c683760d613b3a87b957af9de07174e2e57de59e3ff265011ca8d3e"} Mar 20 13:45:07 crc kubenswrapper[4973]: I0320 13:45:07.995554 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="98ab39e50c683760d613b3a87b957af9de07174e2e57de59e3ff265011ca8d3e" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.092840 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-cnl48"] Mar 20 13:45:08 crc kubenswrapper[4973]: E0320 13:45:08.093323 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c64034f-6ded-417d-8b24-6e7f779cabca" containerName="collect-profiles" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.093358 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c64034f-6ded-417d-8b24-6e7f779cabca" containerName="collect-profiles" Mar 20 13:45:08 crc kubenswrapper[4973]: E0320 13:45:08.093387 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f86183c-8d71-4d52-860b-9579ba761393" containerName="keystone-db-sync" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.093393 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f86183c-8d71-4d52-860b-9579ba761393" containerName="keystone-db-sync" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.093602 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c64034f-6ded-417d-8b24-6e7f779cabca" containerName="collect-profiles" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.093626 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f86183c-8d71-4d52-860b-9579ba761393" containerName="keystone-db-sync" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 
13:45:08.099060 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.108567 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cnl48"] Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.109328 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nvzpw" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.109666 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.109798 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.110001 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.110226 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.188501 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-xc4vf"] Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.188986 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" podUID="4acc56cd-01f7-4a24-bfba-7d7a52396699" containerName="dnsmasq-dns" containerID="cri-o://73ca2c4107abeb6bb6a53097b736eca760f316eb29dac86c7688d3004dd1d9bd" gracePeriod=10 Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.199689 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-combined-ca-bundle\") pod \"keystone-bootstrap-cnl48\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " 
pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.199785 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-fernet-keys\") pod \"keystone-bootstrap-cnl48\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.199828 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6chs\" (UniqueName: \"kubernetes.io/projected/00a88a6b-3101-40ec-90ee-62a00b5559a1-kube-api-access-d6chs\") pod \"keystone-bootstrap-cnl48\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.199955 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-credential-keys\") pod \"keystone-bootstrap-cnl48\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.200046 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-scripts\") pod \"keystone-bootstrap-cnl48\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.200205 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-config-data\") pod \"keystone-bootstrap-cnl48\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " 
pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.304545 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-combined-ca-bundle\") pod \"keystone-bootstrap-cnl48\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.304927 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-fernet-keys\") pod \"keystone-bootstrap-cnl48\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.305048 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6chs\" (UniqueName: \"kubernetes.io/projected/00a88a6b-3101-40ec-90ee-62a00b5559a1-kube-api-access-d6chs\") pod \"keystone-bootstrap-cnl48\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.305193 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-credential-keys\") pod \"keystone-bootstrap-cnl48\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.305275 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-scripts\") pod \"keystone-bootstrap-cnl48\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 
13:45:08.305424 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-config-data\") pod \"keystone-bootstrap-cnl48\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.308218 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6"] Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.313694 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.343454 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-credential-keys\") pod \"keystone-bootstrap-cnl48\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.344786 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-config-data\") pod \"keystone-bootstrap-cnl48\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.346159 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-combined-ca-bundle\") pod \"keystone-bootstrap-cnl48\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.355311 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6chs\" (UniqueName: 
\"kubernetes.io/projected/00a88a6b-3101-40ec-90ee-62a00b5559a1-kube-api-access-d6chs\") pod \"keystone-bootstrap-cnl48\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.357918 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-scripts\") pod \"keystone-bootstrap-cnl48\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.358591 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-fernet-keys\") pod \"keystone-bootstrap-cnl48\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.372761 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6"] Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.407732 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-h8gg6\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.407784 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxdlt\" (UniqueName: \"kubernetes.io/projected/40e7d0d5-387a-4c17-9475-93998347341e-kube-api-access-xxdlt\") pod \"dnsmasq-dns-5c5cc7c5ff-h8gg6\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.407869 4973 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-config\") pod \"dnsmasq-dns-5c5cc7c5ff-h8gg6\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.407893 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-h8gg6\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.407927 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-h8gg6\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.407976 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-h8gg6\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.416952 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-znp6r"] Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.418421 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-znp6r" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.424815 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-wcbnn" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.425402 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.442701 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-znp6r"] Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.485185 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.512659 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-h8gg6\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.512706 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxdlt\" (UniqueName: \"kubernetes.io/projected/40e7d0d5-387a-4c17-9475-93998347341e-kube-api-access-xxdlt\") pod \"dnsmasq-dns-5c5cc7c5ff-h8gg6\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.512753 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d867119-66df-4aa7-a2dd-13d0d40ce2cc-config-data\") pod \"heat-db-sync-znp6r\" (UID: \"5d867119-66df-4aa7-a2dd-13d0d40ce2cc\") " pod="openstack/heat-db-sync-znp6r" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.512802 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsksr\" (UniqueName: \"kubernetes.io/projected/5d867119-66df-4aa7-a2dd-13d0d40ce2cc-kube-api-access-vsksr\") pod \"heat-db-sync-znp6r\" (UID: \"5d867119-66df-4aa7-a2dd-13d0d40ce2cc\") " pod="openstack/heat-db-sync-znp6r" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.512836 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-config\") pod \"dnsmasq-dns-5c5cc7c5ff-h8gg6\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.512860 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-h8gg6\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.512886 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d867119-66df-4aa7-a2dd-13d0d40ce2cc-combined-ca-bundle\") pod \"heat-db-sync-znp6r\" (UID: \"5d867119-66df-4aa7-a2dd-13d0d40ce2cc\") " pod="openstack/heat-db-sync-znp6r" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.512905 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-ovsdbserver-sb\") pod \"dnsmasq-dns-5c5cc7c5ff-h8gg6\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.512954 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-h8gg6\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.514057 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-dns-svc\") pod \"dnsmasq-dns-5c5cc7c5ff-h8gg6\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.525147 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-ovsdbserver-nb\") pod \"dnsmasq-dns-5c5cc7c5ff-h8gg6\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.525159 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-config\") pod \"dnsmasq-dns-5c5cc7c5ff-h8gg6\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.525736 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-dns-swift-storage-0\") pod \"dnsmasq-dns-5c5cc7c5ff-h8gg6\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.525933 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-ovsdbserver-sb\") pod 
\"dnsmasq-dns-5c5cc7c5ff-h8gg6\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.562625 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-zzsmb"] Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.564580 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.574555 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-c9vcl" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.574837 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.581468 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.585770 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxdlt\" (UniqueName: \"kubernetes.io/projected/40e7d0d5-387a-4c17-9475-93998347341e-kube-api-access-xxdlt\") pod \"dnsmasq-dns-5c5cc7c5ff-h8gg6\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.596404 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-zzsmb"] Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.608668 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-lbsl6"] Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.610161 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-lbsl6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.617680 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.617911 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8pfqw" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.627054 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-config-data\") pod \"cinder-db-sync-zzsmb\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.627122 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-db-sync-config-data\") pod \"cinder-db-sync-zzsmb\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.627225 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8dvh\" (UniqueName: \"kubernetes.io/projected/6fec0901-00c6-410f-986c-4dcac4fe1359-kube-api-access-q8dvh\") pod \"cinder-db-sync-zzsmb\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.627252 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d867119-66df-4aa7-a2dd-13d0d40ce2cc-config-data\") pod \"heat-db-sync-znp6r\" (UID: \"5d867119-66df-4aa7-a2dd-13d0d40ce2cc\") " pod="openstack/heat-db-sync-znp6r" Mar 20 13:45:08 crc kubenswrapper[4973]: 
I0320 13:45:08.627385 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsksr\" (UniqueName: \"kubernetes.io/projected/5d867119-66df-4aa7-a2dd-13d0d40ce2cc-kube-api-access-vsksr\") pod \"heat-db-sync-znp6r\" (UID: \"5d867119-66df-4aa7-a2dd-13d0d40ce2cc\") " pod="openstack/heat-db-sync-znp6r" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.627512 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d867119-66df-4aa7-a2dd-13d0d40ce2cc-combined-ca-bundle\") pod \"heat-db-sync-znp6r\" (UID: \"5d867119-66df-4aa7-a2dd-13d0d40ce2cc\") " pod="openstack/heat-db-sync-znp6r" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.627551 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fec0901-00c6-410f-986c-4dcac4fe1359-etc-machine-id\") pod \"cinder-db-sync-zzsmb\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.627586 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-combined-ca-bundle\") pod \"cinder-db-sync-zzsmb\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.627751 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-scripts\") pod \"cinder-db-sync-zzsmb\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.641777 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/barbican-db-sync-lbsl6"] Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.649113 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d867119-66df-4aa7-a2dd-13d0d40ce2cc-combined-ca-bundle\") pod \"heat-db-sync-znp6r\" (UID: \"5d867119-66df-4aa7-a2dd-13d0d40ce2cc\") " pod="openstack/heat-db-sync-znp6r" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.661280 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d867119-66df-4aa7-a2dd-13d0d40ce2cc-config-data\") pod \"heat-db-sync-znp6r\" (UID: \"5d867119-66df-4aa7-a2dd-13d0d40ce2cc\") " pod="openstack/heat-db-sync-znp6r" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.675141 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsksr\" (UniqueName: \"kubernetes.io/projected/5d867119-66df-4aa7-a2dd-13d0d40ce2cc-kube-api-access-vsksr\") pod \"heat-db-sync-znp6r\" (UID: \"5d867119-66df-4aa7-a2dd-13d0d40ce2cc\") " pod="openstack/heat-db-sync-znp6r" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.690594 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-vjtzq"] Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.692163 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-vjtzq" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.699041 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hmk7b" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.699317 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.699448 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.728460 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.729888 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92ef179e-0d37-4a3d-986f-5a4ea5bc5a23-combined-ca-bundle\") pod \"barbican-db-sync-lbsl6\" (UID: \"92ef179e-0d37-4a3d-986f-5a4ea5bc5a23\") " pod="openstack/barbican-db-sync-lbsl6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.729933 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fec0901-00c6-410f-986c-4dcac4fe1359-etc-machine-id\") pod \"cinder-db-sync-zzsmb\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.729955 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-combined-ca-bundle\") pod \"cinder-db-sync-zzsmb\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.729976 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2rb4\" (UniqueName: \"kubernetes.io/projected/92ef179e-0d37-4a3d-986f-5a4ea5bc5a23-kube-api-access-s2rb4\") pod \"barbican-db-sync-lbsl6\" (UID: \"92ef179e-0d37-4a3d-986f-5a4ea5bc5a23\") " pod="openstack/barbican-db-sync-lbsl6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.730009 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68-config\") pod \"neutron-db-sync-vjtzq\" (UID: \"0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68\") " pod="openstack/neutron-db-sync-vjtzq" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.730137 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/92ef179e-0d37-4a3d-986f-5a4ea5bc5a23-db-sync-config-data\") pod \"barbican-db-sync-lbsl6\" (UID: \"92ef179e-0d37-4a3d-986f-5a4ea5bc5a23\") " pod="openstack/barbican-db-sync-lbsl6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.730194 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-scripts\") pod \"cinder-db-sync-zzsmb\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.730257 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-config-data\") pod \"cinder-db-sync-zzsmb\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.730281 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-db-sync-config-data\") pod \"cinder-db-sync-zzsmb\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.730316 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5slqw\" (UniqueName: \"kubernetes.io/projected/0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68-kube-api-access-5slqw\") pod \"neutron-db-sync-vjtzq\" (UID: \"0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68\") " pod="openstack/neutron-db-sync-vjtzq" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.730356 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8dvh\" (UniqueName: \"kubernetes.io/projected/6fec0901-00c6-410f-986c-4dcac4fe1359-kube-api-access-q8dvh\") pod \"cinder-db-sync-zzsmb\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.730413 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68-combined-ca-bundle\") pod \"neutron-db-sync-vjtzq\" (UID: \"0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68\") " pod="openstack/neutron-db-sync-vjtzq" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.730503 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fec0901-00c6-410f-986c-4dcac4fe1359-etc-machine-id\") pod \"cinder-db-sync-zzsmb\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.740696 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vjtzq"] Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.750905 4973 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-scripts\") pod \"cinder-db-sync-zzsmb\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.751680 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-db-sync-config-data\") pod \"cinder-db-sync-zzsmb\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.752584 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-config-data\") pod \"cinder-db-sync-zzsmb\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.753832 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-combined-ca-bundle\") pod \"cinder-db-sync-zzsmb\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.762868 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-znp6r" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.763502 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8dvh\" (UniqueName: \"kubernetes.io/projected/6fec0901-00c6-410f-986c-4dcac4fe1359-kube-api-access-q8dvh\") pod \"cinder-db-sync-zzsmb\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.786798 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6"] Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.796210 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.834903 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5slqw\" (UniqueName: \"kubernetes.io/projected/0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68-kube-api-access-5slqw\") pod \"neutron-db-sync-vjtzq\" (UID: \"0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68\") " pod="openstack/neutron-db-sync-vjtzq" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.835112 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68-combined-ca-bundle\") pod \"neutron-db-sync-vjtzq\" (UID: \"0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68\") " pod="openstack/neutron-db-sync-vjtzq" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.835180 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92ef179e-0d37-4a3d-986f-5a4ea5bc5a23-combined-ca-bundle\") pod \"barbican-db-sync-lbsl6\" (UID: \"92ef179e-0d37-4a3d-986f-5a4ea5bc5a23\") " pod="openstack/barbican-db-sync-lbsl6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.835264 4973 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2rb4\" (UniqueName: \"kubernetes.io/projected/92ef179e-0d37-4a3d-986f-5a4ea5bc5a23-kube-api-access-s2rb4\") pod \"barbican-db-sync-lbsl6\" (UID: \"92ef179e-0d37-4a3d-986f-5a4ea5bc5a23\") " pod="openstack/barbican-db-sync-lbsl6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.835321 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68-config\") pod \"neutron-db-sync-vjtzq\" (UID: \"0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68\") " pod="openstack/neutron-db-sync-vjtzq" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.835411 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/92ef179e-0d37-4a3d-986f-5a4ea5bc5a23-db-sync-config-data\") pod \"barbican-db-sync-lbsl6\" (UID: \"92ef179e-0d37-4a3d-986f-5a4ea5bc5a23\") " pod="openstack/barbican-db-sync-lbsl6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.847104 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68-combined-ca-bundle\") pod \"neutron-db-sync-vjtzq\" (UID: \"0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68\") " pod="openstack/neutron-db-sync-vjtzq" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.851796 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92ef179e-0d37-4a3d-986f-5a4ea5bc5a23-combined-ca-bundle\") pod \"barbican-db-sync-lbsl6\" (UID: \"92ef179e-0d37-4a3d-986f-5a4ea5bc5a23\") " pod="openstack/barbican-db-sync-lbsl6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.855136 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/92ef179e-0d37-4a3d-986f-5a4ea5bc5a23-db-sync-config-data\") pod \"barbican-db-sync-lbsl6\" (UID: \"92ef179e-0d37-4a3d-986f-5a4ea5bc5a23\") " pod="openstack/barbican-db-sync-lbsl6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.872172 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5slqw\" (UniqueName: \"kubernetes.io/projected/0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68-kube-api-access-5slqw\") pod \"neutron-db-sync-vjtzq\" (UID: \"0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68\") " pod="openstack/neutron-db-sync-vjtzq" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.886721 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68-config\") pod \"neutron-db-sync-vjtzq\" (UID: \"0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68\") " pod="openstack/neutron-db-sync-vjtzq" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.891931 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2rb4\" (UniqueName: \"kubernetes.io/projected/92ef179e-0d37-4a3d-986f-5a4ea5bc5a23-kube-api-access-s2rb4\") pod \"barbican-db-sync-lbsl6\" (UID: \"92ef179e-0d37-4a3d-986f-5a4ea5bc5a23\") " pod="openstack/barbican-db-sync-lbsl6" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.944489 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-jwvg7"] Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.969942 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7" Mar 20 13:45:08 crc kubenswrapper[4973]: I0320 13:45:08.972292 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-jwvg7"] Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.016422 4973 generic.go:334] "Generic (PLEG): container finished" podID="4acc56cd-01f7-4a24-bfba-7d7a52396699" containerID="73ca2c4107abeb6bb6a53097b736eca760f316eb29dac86c7688d3004dd1d9bd" exitCode=0 Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.016481 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" event={"ID":"4acc56cd-01f7-4a24-bfba-7d7a52396699","Type":"ContainerDied","Data":"73ca2c4107abeb6bb6a53097b736eca760f316eb29dac86c7688d3004dd1d9bd"} Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.029290 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-2wgvh"] Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.031194 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-2wgvh" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.033751 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.035529 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.035671 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-n6z9b" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.068125 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2wgvh"] Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.095020 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.120310 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-lbsl6" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.123496 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.123791 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.124286 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vjtzq" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.126212 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.126400 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.128583 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.143654 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bcf98ea-17cb-432f-8d35-18cc016401ed-config-data\") pod \"placement-db-sync-2wgvh\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") " pod="openstack/placement-db-sync-2wgvh" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.143729 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bcf98ea-17cb-432f-8d35-18cc016401ed-logs\") pod \"placement-db-sync-2wgvh\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") " pod="openstack/placement-db-sync-2wgvh" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.144027 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-config\") pod \"dnsmasq-dns-8b5c85b87-jwvg7\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.144549 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-jwvg7\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.144589 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkpsf\" (UniqueName: \"kubernetes.io/projected/a0006a98-e902-4378-b895-cf31a555b3f6-kube-api-access-vkpsf\") pod \"dnsmasq-dns-8b5c85b87-jwvg7\" (UID: 
\"a0006a98-e902-4378-b895-cf31a555b3f6\") " pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.144611 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dll9m\" (UniqueName: \"kubernetes.io/projected/5bcf98ea-17cb-432f-8d35-18cc016401ed-kube-api-access-dll9m\") pod \"placement-db-sync-2wgvh\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") " pod="openstack/placement-db-sync-2wgvh" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.144629 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bcf98ea-17cb-432f-8d35-18cc016401ed-scripts\") pod \"placement-db-sync-2wgvh\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") " pod="openstack/placement-db-sync-2wgvh" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.144665 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-jwvg7\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.144684 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-jwvg7\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.144711 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bcf98ea-17cb-432f-8d35-18cc016401ed-combined-ca-bundle\") pod 
\"placement-db-sync-2wgvh\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") " pod="openstack/placement-db-sync-2wgvh"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.144730 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-jwvg7\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.249174 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-ovsdbserver-sb\") pod \"4acc56cd-01f7-4a24-bfba-7d7a52396699\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") "
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.249538 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-ovsdbserver-nb\") pod \"4acc56cd-01f7-4a24-bfba-7d7a52396699\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") "
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.249624 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-dns-swift-storage-0\") pod \"4acc56cd-01f7-4a24-bfba-7d7a52396699\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") "
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.249676 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-config\") pod \"4acc56cd-01f7-4a24-bfba-7d7a52396699\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") "
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.249725 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tp54h\" (UniqueName: \"kubernetes.io/projected/4acc56cd-01f7-4a24-bfba-7d7a52396699-kube-api-access-tp54h\") pod \"4acc56cd-01f7-4a24-bfba-7d7a52396699\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") "
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.249759 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-dns-svc\") pod \"4acc56cd-01f7-4a24-bfba-7d7a52396699\" (UID: \"4acc56cd-01f7-4a24-bfba-7d7a52396699\") "
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.251326 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-jwvg7\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.251445 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.251489 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkpsf\" (UniqueName: \"kubernetes.io/projected/a0006a98-e902-4378-b895-cf31a555b3f6-kube-api-access-vkpsf\") pod \"dnsmasq-dns-8b5c85b87-jwvg7\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.251526 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dll9m\" (UniqueName: \"kubernetes.io/projected/5bcf98ea-17cb-432f-8d35-18cc016401ed-kube-api-access-dll9m\") pod \"placement-db-sync-2wgvh\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") " pod="openstack/placement-db-sync-2wgvh"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.251559 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bcf98ea-17cb-432f-8d35-18cc016401ed-scripts\") pod \"placement-db-sync-2wgvh\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") " pod="openstack/placement-db-sync-2wgvh"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.251647 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-jwvg7\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.251679 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-jwvg7\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.251714 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt626\" (UniqueName: \"kubernetes.io/projected/198d4fb1-4784-4553-b617-a2c33cec6df6-kube-api-access-pt626\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.251742 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198d4fb1-4784-4553-b617-a2c33cec6df6-run-httpd\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.251773 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bcf98ea-17cb-432f-8d35-18cc016401ed-combined-ca-bundle\") pod \"placement-db-sync-2wgvh\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") " pod="openstack/placement-db-sync-2wgvh"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.252043 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-jwvg7\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.252390 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bcf98ea-17cb-432f-8d35-18cc016401ed-config-data\") pod \"placement-db-sync-2wgvh\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") " pod="openstack/placement-db-sync-2wgvh"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.252516 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bcf98ea-17cb-432f-8d35-18cc016401ed-logs\") pod \"placement-db-sync-2wgvh\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") " pod="openstack/placement-db-sync-2wgvh"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.252591 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-ovsdbserver-nb\") pod \"dnsmasq-dns-8b5c85b87-jwvg7\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.252617 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198d4fb1-4784-4553-b617-a2c33cec6df6-log-httpd\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.252771 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-config\") pod \"dnsmasq-dns-8b5c85b87-jwvg7\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.263283 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-dns-swift-storage-0\") pod \"dnsmasq-dns-8b5c85b87-jwvg7\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.263595 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-ovsdbserver-sb\") pod \"dnsmasq-dns-8b5c85b87-jwvg7\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.253254 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.263821 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-scripts\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.263904 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bcf98ea-17cb-432f-8d35-18cc016401ed-logs\") pod \"placement-db-sync-2wgvh\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") " pod="openstack/placement-db-sync-2wgvh"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.263913 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-config-data\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.265021 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-config\") pod \"dnsmasq-dns-8b5c85b87-jwvg7\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.265832 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-dns-svc\") pod \"dnsmasq-dns-8b5c85b87-jwvg7\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.266698 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bcf98ea-17cb-432f-8d35-18cc016401ed-config-data\") pod \"placement-db-sync-2wgvh\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") " pod="openstack/placement-db-sync-2wgvh"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.268789 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bcf98ea-17cb-432f-8d35-18cc016401ed-combined-ca-bundle\") pod \"placement-db-sync-2wgvh\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") " pod="openstack/placement-db-sync-2wgvh"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.271505 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4acc56cd-01f7-4a24-bfba-7d7a52396699-kube-api-access-tp54h" (OuterVolumeSpecName: "kube-api-access-tp54h") pod "4acc56cd-01f7-4a24-bfba-7d7a52396699" (UID: "4acc56cd-01f7-4a24-bfba-7d7a52396699"). InnerVolumeSpecName "kube-api-access-tp54h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.274122 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dll9m\" (UniqueName: \"kubernetes.io/projected/5bcf98ea-17cb-432f-8d35-18cc016401ed-kube-api-access-dll9m\") pod \"placement-db-sync-2wgvh\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") " pod="openstack/placement-db-sync-2wgvh"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.306452 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bcf98ea-17cb-432f-8d35-18cc016401ed-scripts\") pod \"placement-db-sync-2wgvh\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") " pod="openstack/placement-db-sync-2wgvh"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.346635 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4acc56cd-01f7-4a24-bfba-7d7a52396699" (UID: "4acc56cd-01f7-4a24-bfba-7d7a52396699"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.350938 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkpsf\" (UniqueName: \"kubernetes.io/projected/a0006a98-e902-4378-b895-cf31a555b3f6-kube-api-access-vkpsf\") pod \"dnsmasq-dns-8b5c85b87-jwvg7\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.351023 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 13:45:09 crc kubenswrapper[4973]: E0320 13:45:09.351589 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4acc56cd-01f7-4a24-bfba-7d7a52396699" containerName="dnsmasq-dns"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.351607 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4acc56cd-01f7-4a24-bfba-7d7a52396699" containerName="dnsmasq-dns"
Mar 20 13:45:09 crc kubenswrapper[4973]: E0320 13:45:09.351642 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4acc56cd-01f7-4a24-bfba-7d7a52396699" containerName="init"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.351651 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4acc56cd-01f7-4a24-bfba-7d7a52396699" containerName="init"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.352774 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="4acc56cd-01f7-4a24-bfba-7d7a52396699" containerName="dnsmasq-dns"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.353094 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.359290 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.402814 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.402991 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pt626\" (UniqueName: \"kubernetes.io/projected/198d4fb1-4784-4553-b617-a2c33cec6df6-kube-api-access-pt626\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.403034 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198d4fb1-4784-4553-b617-a2c33cec6df6-run-httpd\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.403317 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198d4fb1-4784-4553-b617-a2c33cec6df6-log-httpd\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.403477 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.403518 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-scripts\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.403548 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-config-data\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.403701 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tp54h\" (UniqueName: \"kubernetes.io/projected/4acc56cd-01f7-4a24-bfba-7d7a52396699-kube-api-access-tp54h\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.403725 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.407504 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.407632 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198d4fb1-4784-4553-b617-a2c33cec6df6-run-httpd\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.407878 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.408194 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tjqdw"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.425810 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2wgvh"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.430723 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.434390 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4acc56cd-01f7-4a24-bfba-7d7a52396699" (UID: "4acc56cd-01f7-4a24-bfba-7d7a52396699"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.435176 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.463370 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198d4fb1-4784-4553-b617-a2c33cec6df6-log-httpd\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.497844 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt626\" (UniqueName: \"kubernetes.io/projected/198d4fb1-4784-4553-b617-a2c33cec6df6-kube-api-access-pt626\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.521134 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-scripts\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.522187 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.523854 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.530648 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.550016 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.550259 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.551003 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.551367 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-config-data\") pod \"ceilometer-0\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.562531 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5k5d\" (UniqueName: \"kubernetes.io/projected/dc7dca93-568d-4f21-9162-21244ab6aa49-kube-api-access-j5k5d\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.562686 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc7dca93-568d-4f21-9162-21244ab6aa49-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.562730 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-scripts\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.562779 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.562858 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.562910 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7dca93-568d-4f21-9162-21244ab6aa49-logs\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.563127 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-config-data\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.563177 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.594472 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4acc56cd-01f7-4a24-bfba-7d7a52396699" (UID: "4acc56cd-01f7-4a24-bfba-7d7a52396699"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.598713 4973 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.622443 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4acc56cd-01f7-4a24-bfba-7d7a52396699" (UID: "4acc56cd-01f7-4a24-bfba-7d7a52396699"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.702031 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b455c66-699c-4a18-91cb-0e8fca97642e-logs\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.702081 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.702118 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5k5d\" (UniqueName: \"kubernetes.io/projected/dc7dca93-568d-4f21-9162-21244ab6aa49-kube-api-access-j5k5d\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.702153 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.702216 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc7dca93-568d-4f21-9162-21244ab6aa49-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.702241 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-scripts\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.702270 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.702308 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.702354 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7dca93-568d-4f21-9162-21244ab6aa49-logs\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.702378 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.702426 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b455c66-699c-4a18-91cb-0e8fca97642e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.702476 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4da075d1-9614-421d-9161-1d84cabf00c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.702503 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l6np\" (UniqueName: \"kubernetes.io/projected/7b455c66-699c-4a18-91cb-0e8fca97642e-kube-api-access-4l6np\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.702547 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-config-data\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.702570 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.702613 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.702690 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.702702 4973 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.704073 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc7dca93-568d-4f21-9162-21244ab6aa49-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.706826 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-scripts\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.707078 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7dca93-568d-4f21-9162-21244ab6aa49-logs\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.707987 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.721764 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-config-data\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.726158 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.728656 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-cnl48"]
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.744524 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.746178 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5k5d\" (UniqueName: \"kubernetes.io/projected/dc7dca93-568d-4f21-9162-21244ab6aa49-kube-api-access-j5k5d\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.749084 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.749118 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/da3d9df538268a2045279e6c35b80a9f9f98a48c1b51b6173b49a5c36304d8af/globalmount\"" pod="openstack/glance-default-external-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.756823 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.772957 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-config" (OuterVolumeSpecName: "config") pod "4acc56cd-01f7-4a24-bfba-7d7a52396699" (UID: "4acc56cd-01f7-4a24-bfba-7d7a52396699"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.805122 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.805182 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b455c66-699c-4a18-91cb-0e8fca97642e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.805271 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4da075d1-9614-421d-9161-1d84cabf00c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.805313 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l6np\" (UniqueName: \"kubernetes.io/projected/7b455c66-699c-4a18-91cb-0e8fca97642e-kube-api-access-4l6np\") pod \"glance-default-internal-api-0\" (UID:
\"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.805460 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.806028 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b455c66-699c-4a18-91cb-0e8fca97642e-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.809509 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b455c66-699c-4a18-91cb-0e8fca97642e-logs\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.809607 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.809682 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0" Mar 20 
13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.810063 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4acc56cd-01f7-4a24-bfba-7d7a52396699-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.811756 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b455c66-699c-4a18-91cb-0e8fca97642e-logs\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.812460 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.812503 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4da075d1-9614-421d-9161-1d84cabf00c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5c263bf6e6632ecbb6820e7ea731ec6b961228752a610e87fa7ed7c4de533cfd/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.817229 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.819588 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.823044 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.824560 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\") pod \"glance-default-external-api-0\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " pod="openstack/glance-default-external-api-0" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.832069 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.844168 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l6np\" (UniqueName: \"kubernetes.io/projected/7b455c66-699c-4a18-91cb-0e8fca97642e-kube-api-access-4l6np\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.882848 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4da075d1-9614-421d-9161-1d84cabf00c7\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7\") pod \"glance-default-internal-api-0\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:45:09 crc kubenswrapper[4973]: I0320 13:45:09.976025 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Mar 20 13:45:10 crc kubenswrapper[4973]: I0320 13:45:10.052989 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:45:10 crc kubenswrapper[4973]: I0320 13:45:10.062881 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-znp6r"] Mar 20 13:45:10 crc kubenswrapper[4973]: I0320 13:45:10.063150 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:10 crc kubenswrapper[4973]: I0320 13:45:10.065951 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cnl48" event={"ID":"00a88a6b-3101-40ec-90ee-62a00b5559a1","Type":"ContainerStarted","Data":"26a1303df44a856e46d238c49d8b690d011e33c34e097b56cb5d024f808b2d6c"} Mar 20 13:45:10 crc kubenswrapper[4973]: I0320 13:45:10.078246 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" event={"ID":"4acc56cd-01f7-4a24-bfba-7d7a52396699","Type":"ContainerDied","Data":"6ce2ac951e21f2b9c9985b23e16690f41b03decd9c42a194c80fa48b610e3007"} Mar 20 13:45:10 crc kubenswrapper[4973]: I0320 13:45:10.078312 4973 scope.go:117] "RemoveContainer" containerID="73ca2c4107abeb6bb6a53097b736eca760f316eb29dac86c7688d3004dd1d9bd" Mar 20 13:45:10 crc kubenswrapper[4973]: I0320 13:45:10.078588 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7ff5475cc9-xc4vf" Mar 20 13:45:10 crc kubenswrapper[4973]: W0320 13:45:10.080522 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d867119_66df_4aa7_a2dd_13d0d40ce2cc.slice/crio-1e7fce08aeb554166dadf690d74f9ad7a54e13ee2073601803431c8a10c95be1 WatchSource:0}: Error finding container 1e7fce08aeb554166dadf690d74f9ad7a54e13ee2073601803431c8a10c95be1: Status 404 returned error can't find the container with id 1e7fce08aeb554166dadf690d74f9ad7a54e13ee2073601803431c8a10c95be1 Mar 20 13:45:10 crc kubenswrapper[4973]: I0320 13:45:10.096819 4973 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:45:10 crc kubenswrapper[4973]: I0320 13:45:10.123528 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6"] Mar 20 13:45:10 crc kubenswrapper[4973]: I0320 13:45:10.141837 4973 scope.go:117] "RemoveContainer" containerID="604a9792b07ffe1149da89915adb5fa17ea784d431d826a0d28c8831564b2320" Mar 20 13:45:10 crc kubenswrapper[4973]: I0320 13:45:10.148761 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-xc4vf"] Mar 20 13:45:10 crc kubenswrapper[4973]: I0320 13:45:10.159326 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7ff5475cc9-xc4vf"] Mar 20 13:45:10 crc kubenswrapper[4973]: I0320 13:45:10.353140 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-zzsmb"] Mar 20 13:45:10 crc kubenswrapper[4973]: I0320 13:45:10.380643 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-vjtzq"] Mar 20 13:45:10 crc kubenswrapper[4973]: I0320 13:45:10.815312 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:45:10 crc kubenswrapper[4973]: I0320 13:45:10.850014 4973 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-jwvg7"] Mar 20 13:45:10 crc kubenswrapper[4973]: I0320 13:45:10.868315 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-lbsl6"] Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.037875 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.050000 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-2wgvh"] Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.070745 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:11 crc kubenswrapper[4973]: W0320 13:45:11.078034 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5bcf98ea_17cb_432f_8d35_18cc016401ed.slice/crio-a9b71dad3317467be88db6ab222a7c3e5a209d1e9298bfa1c99fa5f8807dbbb0 WatchSource:0}: Error finding container a9b71dad3317467be88db6ab222a7c3e5a209d1e9298bfa1c99fa5f8807dbbb0: Status 404 returned error can't find the container with id a9b71dad3317467be88db6ab222a7c3e5a209d1e9298bfa1c99fa5f8807dbbb0 Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.132947 4973 generic.go:334] "Generic (PLEG): container finished" podID="40e7d0d5-387a-4c17-9475-93998347341e" containerID="2c2cae1cda92d889478f2b07aba1cc111182411a95c96805d61b8db9b4330197" exitCode=0 Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.134539 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" event={"ID":"40e7d0d5-387a-4c17-9475-93998347341e","Type":"ContainerDied","Data":"2c2cae1cda92d889478f2b07aba1cc111182411a95c96805d61b8db9b4330197"} Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.134636 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" 
event={"ID":"40e7d0d5-387a-4c17-9475-93998347341e","Type":"ContainerStarted","Data":"6c88543647cfaaa7e6cb2056ef31c14c01fe998c50d42d8f8f3f82f8e10a5ac8"} Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.142921 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zzsmb" event={"ID":"6fec0901-00c6-410f-986c-4dcac4fe1359","Type":"ContainerStarted","Data":"b3f901d3fb7e6053f4d4efb8956efd3a57c5ed921eec8093a75bf61826095000"} Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.145032 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"198d4fb1-4784-4553-b617-a2c33cec6df6","Type":"ContainerStarted","Data":"dc6d86bbf08f4755e01861e808d829309fbeb4919bdb4b247a0a90a264372195"} Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.151141 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-znp6r" event={"ID":"5d867119-66df-4aa7-a2dd-13d0d40ce2cc","Type":"ContainerStarted","Data":"1e7fce08aeb554166dadf690d74f9ad7a54e13ee2073601803431c8a10c95be1"} Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.166587 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cnl48" event={"ID":"00a88a6b-3101-40ec-90ee-62a00b5559a1","Type":"ContainerStarted","Data":"1978ad4d84dbc6a8919d8ebe3feeb799305389dff7ee374866a8cb8995edfb71"} Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.177038 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2wgvh" event={"ID":"5bcf98ea-17cb-432f-8d35-18cc016401ed","Type":"ContainerStarted","Data":"a9b71dad3317467be88db6ab222a7c3e5a209d1e9298bfa1c99fa5f8807dbbb0"} Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.191253 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lbsl6" event={"ID":"92ef179e-0d37-4a3d-986f-5a4ea5bc5a23","Type":"ContainerStarted","Data":"19f924119305226e017b38a7d64bc5853ef6bad70341bbe6a04c074641712c78"} Mar 20 
13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.207190 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vjtzq" event={"ID":"0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68","Type":"ContainerStarted","Data":"ef78a08289134992f9c632dad673844b359d1a7ca4b4df90922ece411e9c3da1"} Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.207248 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vjtzq" event={"ID":"0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68","Type":"ContainerStarted","Data":"01b63697c8292f394e1cf265d3998f5967b9bc912cebe09217fd54bade0f98b7"} Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.219634 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7" event={"ID":"a0006a98-e902-4378-b895-cf31a555b3f6","Type":"ContainerStarted","Data":"f59ea8c1e64277ab1e80a4d15a9bfa66999488a923b8ae98fda1ed026aa1acf2"} Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.224955 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-cnl48" podStartSLOduration=3.224935507 podStartE2EDuration="3.224935507s" podCreationTimestamp="2026-03-20 13:45:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:11.191875875 +0000 UTC m=+1431.935545639" watchObservedRunningTime="2026-03-20 13:45:11.224935507 +0000 UTC m=+1431.968605251" Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.238118 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-vjtzq" podStartSLOduration=3.238093696 podStartE2EDuration="3.238093696s" podCreationTimestamp="2026-03-20 13:45:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:11.230508769 +0000 UTC m=+1431.974178523" 
watchObservedRunningTime="2026-03-20 13:45:11.238093696 +0000 UTC m=+1431.981763440" Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.409538 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.572957 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.939237 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:11 crc kubenswrapper[4973]: I0320 13:45:11.981467 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4acc56cd-01f7-4a24-bfba-7d7a52396699" path="/var/lib/kubelet/pods/4acc56cd-01f7-4a24-bfba-7d7a52396699/volumes" Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.048072 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxdlt\" (UniqueName: \"kubernetes.io/projected/40e7d0d5-387a-4c17-9475-93998347341e-kube-api-access-xxdlt\") pod \"40e7d0d5-387a-4c17-9475-93998347341e\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.048177 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-ovsdbserver-nb\") pod \"40e7d0d5-387a-4c17-9475-93998347341e\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.048546 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-dns-swift-storage-0\") pod \"40e7d0d5-387a-4c17-9475-93998347341e\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.048619 4973 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-dns-svc\") pod \"40e7d0d5-387a-4c17-9475-93998347341e\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.048644 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-ovsdbserver-sb\") pod \"40e7d0d5-387a-4c17-9475-93998347341e\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.048694 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-config\") pod \"40e7d0d5-387a-4c17-9475-93998347341e\" (UID: \"40e7d0d5-387a-4c17-9475-93998347341e\") " Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.055528 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40e7d0d5-387a-4c17-9475-93998347341e-kube-api-access-xxdlt" (OuterVolumeSpecName: "kube-api-access-xxdlt") pod "40e7d0d5-387a-4c17-9475-93998347341e" (UID: "40e7d0d5-387a-4c17-9475-93998347341e"). InnerVolumeSpecName "kube-api-access-xxdlt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.080855 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "40e7d0d5-387a-4c17-9475-93998347341e" (UID: "40e7d0d5-387a-4c17-9475-93998347341e"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.083750 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "40e7d0d5-387a-4c17-9475-93998347341e" (UID: "40e7d0d5-387a-4c17-9475-93998347341e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.086634 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-config" (OuterVolumeSpecName: "config") pod "40e7d0d5-387a-4c17-9475-93998347341e" (UID: "40e7d0d5-387a-4c17-9475-93998347341e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.091870 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "40e7d0d5-387a-4c17-9475-93998347341e" (UID: "40e7d0d5-387a-4c17-9475-93998347341e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.123003 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "40e7d0d5-387a-4c17-9475-93998347341e" (UID: "40e7d0d5-387a-4c17-9475-93998347341e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.151914 4973 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.151955 4973 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.151969 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.151980 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.151994 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxdlt\" (UniqueName: \"kubernetes.io/projected/40e7d0d5-387a-4c17-9475-93998347341e-kube-api-access-xxdlt\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.152006 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/40e7d0d5-387a-4c17-9475-93998347341e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.239742 4973 generic.go:334] "Generic (PLEG): container finished" podID="a0006a98-e902-4378-b895-cf31a555b3f6" containerID="934abc44721f6dc5fce46e9def2e043b28599651fd64efb74642f7c4bf719914" exitCode=0 Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.239799 4973 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7" event={"ID":"a0006a98-e902-4378-b895-cf31a555b3f6","Type":"ContainerDied","Data":"934abc44721f6dc5fce46e9def2e043b28599651fd64efb74642f7c4bf719914"} Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.245562 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.245554 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6" event={"ID":"40e7d0d5-387a-4c17-9475-93998347341e","Type":"ContainerDied","Data":"6c88543647cfaaa7e6cb2056ef31c14c01fe998c50d42d8f8f3f82f8e10a5ac8"} Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.245748 4973 scope.go:117] "RemoveContainer" containerID="2c2cae1cda92d889478f2b07aba1cc111182411a95c96805d61b8db9b4330197" Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.252910 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dc7dca93-568d-4f21-9162-21244ab6aa49","Type":"ContainerStarted","Data":"21a37fc255953b0b2257b0d4d5e72338ef76961cd7a3e792fae443c071948872"} Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.401955 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6"] Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.421496 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c5cc7c5ff-h8gg6"] Mar 20 13:45:12 crc kubenswrapper[4973]: I0320 13:45:12.440093 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:45:13 crc kubenswrapper[4973]: I0320 13:45:13.297014 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"dc7dca93-568d-4f21-9162-21244ab6aa49","Type":"ContainerStarted","Data":"fcdcfae7c0422c8b17b5fabfc8996b5c4cfc2a9a66dcaac963eb6a59457c2644"} Mar 20 13:45:13 crc kubenswrapper[4973]: I0320 13:45:13.299249 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b455c66-699c-4a18-91cb-0e8fca97642e","Type":"ContainerStarted","Data":"799a4416ab47103a1d3e494ba90fd3278d926263e5ea58badaefbe06ff6388b0"} Mar 20 13:45:13 crc kubenswrapper[4973]: I0320 13:45:13.304683 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7" event={"ID":"a0006a98-e902-4378-b895-cf31a555b3f6","Type":"ContainerStarted","Data":"50aaf9928844e9feee6b2c0d1b4bd40f41ffcc0751174b6b279ca3eadddbbde6"} Mar 20 13:45:13 crc kubenswrapper[4973]: I0320 13:45:13.305408 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7" Mar 20 13:45:13 crc kubenswrapper[4973]: I0320 13:45:13.346786 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7" podStartSLOduration=5.346761441 podStartE2EDuration="5.346761441s" podCreationTimestamp="2026-03-20 13:45:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:13.329571143 +0000 UTC m=+1434.073240887" watchObservedRunningTime="2026-03-20 13:45:13.346761441 +0000 UTC m=+1434.090431195" Mar 20 13:45:13 crc kubenswrapper[4973]: I0320 13:45:13.976509 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40e7d0d5-387a-4c17-9475-93998347341e" path="/var/lib/kubelet/pods/40e7d0d5-387a-4c17-9475-93998347341e/volumes" Mar 20 13:45:14 crc kubenswrapper[4973]: I0320 13:45:14.345817 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"dc7dca93-568d-4f21-9162-21244ab6aa49","Type":"ContainerStarted","Data":"3fe0918ca34bacc5b730a8d4ec4b9207ea03b617344a8587056e8d4af34e44cc"} Mar 20 13:45:14 crc kubenswrapper[4973]: I0320 13:45:14.348125 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b455c66-699c-4a18-91cb-0e8fca97642e","Type":"ContainerStarted","Data":"d7d30c2f5c6f9bbc0871452af8c9840f478efc3187eedcef4d9320834860fb19"} Mar 20 13:45:15 crc kubenswrapper[4973]: I0320 13:45:15.360981 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="dc7dca93-568d-4f21-9162-21244ab6aa49" containerName="glance-log" containerID="cri-o://fcdcfae7c0422c8b17b5fabfc8996b5c4cfc2a9a66dcaac963eb6a59457c2644" gracePeriod=30 Mar 20 13:45:15 crc kubenswrapper[4973]: I0320 13:45:15.361449 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7b455c66-699c-4a18-91cb-0e8fca97642e" containerName="glance-log" containerID="cri-o://d7d30c2f5c6f9bbc0871452af8c9840f478efc3187eedcef4d9320834860fb19" gracePeriod=30 Mar 20 13:45:15 crc kubenswrapper[4973]: I0320 13:45:15.361465 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="dc7dca93-568d-4f21-9162-21244ab6aa49" containerName="glance-httpd" containerID="cri-o://3fe0918ca34bacc5b730a8d4ec4b9207ea03b617344a8587056e8d4af34e44cc" gracePeriod=30 Mar 20 13:45:15 crc kubenswrapper[4973]: I0320 13:45:15.361514 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b455c66-699c-4a18-91cb-0e8fca97642e","Type":"ContainerStarted","Data":"239a16c7abc7e3d0d3b8d364c939b685e93e4e5d7fc693b726db6e98d85f1b44"} Mar 20 13:45:15 crc kubenswrapper[4973]: I0320 13:45:15.361715 4973 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openstack/glance-default-internal-api-0" podUID="7b455c66-699c-4a18-91cb-0e8fca97642e" containerName="glance-httpd" containerID="cri-o://239a16c7abc7e3d0d3b8d364c939b685e93e4e5d7fc693b726db6e98d85f1b44" gracePeriod=30 Mar 20 13:45:15 crc kubenswrapper[4973]: I0320 13:45:15.412385 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=7.412360032 podStartE2EDuration="7.412360032s" podCreationTimestamp="2026-03-20 13:45:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:15.396118138 +0000 UTC m=+1436.139787882" watchObservedRunningTime="2026-03-20 13:45:15.412360032 +0000 UTC m=+1436.156029796" Mar 20 13:45:15 crc kubenswrapper[4973]: I0320 13:45:15.433085 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=7.433063696 podStartE2EDuration="7.433063696s" podCreationTimestamp="2026-03-20 13:45:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:15.431399651 +0000 UTC m=+1436.175069395" watchObservedRunningTime="2026-03-20 13:45:15.433063696 +0000 UTC m=+1436.176733440" Mar 20 13:45:16 crc kubenswrapper[4973]: I0320 13:45:16.375460 4973 generic.go:334] "Generic (PLEG): container finished" podID="dc7dca93-568d-4f21-9162-21244ab6aa49" containerID="3fe0918ca34bacc5b730a8d4ec4b9207ea03b617344a8587056e8d4af34e44cc" exitCode=0 Mar 20 13:45:16 crc kubenswrapper[4973]: I0320 13:45:16.376168 4973 generic.go:334] "Generic (PLEG): container finished" podID="dc7dca93-568d-4f21-9162-21244ab6aa49" containerID="fcdcfae7c0422c8b17b5fabfc8996b5c4cfc2a9a66dcaac963eb6a59457c2644" exitCode=143 Mar 20 13:45:16 crc kubenswrapper[4973]: I0320 13:45:16.375547 4973 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/glance-default-external-api-0" event={"ID":"dc7dca93-568d-4f21-9162-21244ab6aa49","Type":"ContainerDied","Data":"3fe0918ca34bacc5b730a8d4ec4b9207ea03b617344a8587056e8d4af34e44cc"} Mar 20 13:45:16 crc kubenswrapper[4973]: I0320 13:45:16.376251 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dc7dca93-568d-4f21-9162-21244ab6aa49","Type":"ContainerDied","Data":"fcdcfae7c0422c8b17b5fabfc8996b5c4cfc2a9a66dcaac963eb6a59457c2644"} Mar 20 13:45:16 crc kubenswrapper[4973]: I0320 13:45:16.381043 4973 generic.go:334] "Generic (PLEG): container finished" podID="7b455c66-699c-4a18-91cb-0e8fca97642e" containerID="239a16c7abc7e3d0d3b8d364c939b685e93e4e5d7fc693b726db6e98d85f1b44" exitCode=0 Mar 20 13:45:16 crc kubenswrapper[4973]: I0320 13:45:16.381066 4973 generic.go:334] "Generic (PLEG): container finished" podID="7b455c66-699c-4a18-91cb-0e8fca97642e" containerID="d7d30c2f5c6f9bbc0871452af8c9840f478efc3187eedcef4d9320834860fb19" exitCode=143 Mar 20 13:45:16 crc kubenswrapper[4973]: I0320 13:45:16.381088 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b455c66-699c-4a18-91cb-0e8fca97642e","Type":"ContainerDied","Data":"239a16c7abc7e3d0d3b8d364c939b685e93e4e5d7fc693b726db6e98d85f1b44"} Mar 20 13:45:16 crc kubenswrapper[4973]: I0320 13:45:16.381122 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7b455c66-699c-4a18-91cb-0e8fca97642e","Type":"ContainerDied","Data":"d7d30c2f5c6f9bbc0871452af8c9840f478efc3187eedcef4d9320834860fb19"} Mar 20 13:45:17 crc kubenswrapper[4973]: I0320 13:45:17.395347 4973 generic.go:334] "Generic (PLEG): container finished" podID="00a88a6b-3101-40ec-90ee-62a00b5559a1" containerID="1978ad4d84dbc6a8919d8ebe3feeb799305389dff7ee374866a8cb8995edfb71" exitCode=0 Mar 20 13:45:17 crc kubenswrapper[4973]: I0320 13:45:17.395416 4973 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cnl48" event={"ID":"00a88a6b-3101-40ec-90ee-62a00b5559a1","Type":"ContainerDied","Data":"1978ad4d84dbc6a8919d8ebe3feeb799305389dff7ee374866a8cb8995edfb71"} Mar 20 13:45:19 crc kubenswrapper[4973]: I0320 13:45:19.354854 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7" Mar 20 13:45:19 crc kubenswrapper[4973]: I0320 13:45:19.430166 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-wgtqz"] Mar 20 13:45:19 crc kubenswrapper[4973]: I0320 13:45:19.440474 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz" podUID="86ba366f-247a-4630-8fa6-196198d8aec7" containerName="dnsmasq-dns" containerID="cri-o://ecf6f5bd6ab1ed88304d3614f92656f95c74d4951b1d66a11edcc5ff9ac2c49f" gracePeriod=10 Mar 20 13:45:19 crc kubenswrapper[4973]: I0320 13:45:19.992572 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Mar 20 13:45:20 crc kubenswrapper[4973]: I0320 13:45:20.005098 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Mar 20 13:45:20 crc kubenswrapper[4973]: I0320 13:45:20.441940 4973 generic.go:334] "Generic (PLEG): container finished" podID="86ba366f-247a-4630-8fa6-196198d8aec7" containerID="ecf6f5bd6ab1ed88304d3614f92656f95c74d4951b1d66a11edcc5ff9ac2c49f" exitCode=0 Mar 20 13:45:20 crc kubenswrapper[4973]: I0320 13:45:20.442501 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz" event={"ID":"86ba366f-247a-4630-8fa6-196198d8aec7","Type":"ContainerDied","Data":"ecf6f5bd6ab1ed88304d3614f92656f95c74d4951b1d66a11edcc5ff9ac2c49f"} Mar 20 13:45:20 crc kubenswrapper[4973]: I0320 13:45:20.447515 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/prometheus-metric-storage-0" Mar 20 13:45:23 crc kubenswrapper[4973]: I0320 13:45:23.558637 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz" podUID="86ba366f-247a-4630-8fa6-196198d8aec7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.171:5353: connect: connection refused" Mar 20 13:45:28 crc kubenswrapper[4973]: I0320 13:45:28.558866 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz" podUID="86ba366f-247a-4630-8fa6-196198d8aec7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.171:5353: connect: connection refused" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.558759 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz" podUID="86ba366f-247a-4630-8fa6-196198d8aec7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.171:5353: connect: connection refused" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.559460 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.570265 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-cnl48" event={"ID":"00a88a6b-3101-40ec-90ee-62a00b5559a1","Type":"ContainerDied","Data":"26a1303df44a856e46d238c49d8b690d011e33c34e097b56cb5d024f808b2d6c"} Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.570321 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26a1303df44a856e46d238c49d8b690d011e33c34e097b56cb5d024f808b2d6c" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.572802 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"7b455c66-699c-4a18-91cb-0e8fca97642e","Type":"ContainerDied","Data":"799a4416ab47103a1d3e494ba90fd3278d926263e5ea58badaefbe06ff6388b0"} Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.572834 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="799a4416ab47103a1d3e494ba90fd3278d926263e5ea58badaefbe06ff6388b0" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.696495 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.709478 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cnl48" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.795966 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-fernet-keys\") pod \"00a88a6b-3101-40ec-90ee-62a00b5559a1\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.796128 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-scripts\") pod \"7b455c66-699c-4a18-91cb-0e8fca97642e\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.796161 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-internal-tls-certs\") pod \"7b455c66-699c-4a18-91cb-0e8fca97642e\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.796227 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-combined-ca-bundle\") pod \"7b455c66-699c-4a18-91cb-0e8fca97642e\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.796281 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l6np\" (UniqueName: \"kubernetes.io/projected/7b455c66-699c-4a18-91cb-0e8fca97642e-kube-api-access-4l6np\") pod \"7b455c66-699c-4a18-91cb-0e8fca97642e\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.796307 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-scripts\") pod \"00a88a6b-3101-40ec-90ee-62a00b5559a1\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.796355 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-combined-ca-bundle\") pod \"00a88a6b-3101-40ec-90ee-62a00b5559a1\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.796383 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-credential-keys\") pod \"00a88a6b-3101-40ec-90ee-62a00b5559a1\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.796664 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-config-data\") pod \"7b455c66-699c-4a18-91cb-0e8fca97642e\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 
13:45:33.797034 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7\") pod \"7b455c66-699c-4a18-91cb-0e8fca97642e\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.797114 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-config-data\") pod \"00a88a6b-3101-40ec-90ee-62a00b5559a1\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.797180 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b455c66-699c-4a18-91cb-0e8fca97642e-logs\") pod \"7b455c66-699c-4a18-91cb-0e8fca97642e\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.797357 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6chs\" (UniqueName: \"kubernetes.io/projected/00a88a6b-3101-40ec-90ee-62a00b5559a1-kube-api-access-d6chs\") pod \"00a88a6b-3101-40ec-90ee-62a00b5559a1\" (UID: \"00a88a6b-3101-40ec-90ee-62a00b5559a1\") " Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.797700 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b455c66-699c-4a18-91cb-0e8fca97642e-httpd-run\") pod \"7b455c66-699c-4a18-91cb-0e8fca97642e\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") " Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.802057 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b455c66-699c-4a18-91cb-0e8fca97642e-logs" (OuterVolumeSpecName: "logs") pod "7b455c66-699c-4a18-91cb-0e8fca97642e" (UID: 
"7b455c66-699c-4a18-91cb-0e8fca97642e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.802700 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b455c66-699c-4a18-91cb-0e8fca97642e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7b455c66-699c-4a18-91cb-0e8fca97642e" (UID: "7b455c66-699c-4a18-91cb-0e8fca97642e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.804894 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-scripts" (OuterVolumeSpecName: "scripts") pod "7b455c66-699c-4a18-91cb-0e8fca97642e" (UID: "7b455c66-699c-4a18-91cb-0e8fca97642e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.806580 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "00a88a6b-3101-40ec-90ee-62a00b5559a1" (UID: "00a88a6b-3101-40ec-90ee-62a00b5559a1"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.807294 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00a88a6b-3101-40ec-90ee-62a00b5559a1-kube-api-access-d6chs" (OuterVolumeSpecName: "kube-api-access-d6chs") pod "00a88a6b-3101-40ec-90ee-62a00b5559a1" (UID: "00a88a6b-3101-40ec-90ee-62a00b5559a1"). InnerVolumeSpecName "kube-api-access-d6chs". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.810562 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-scripts" (OuterVolumeSpecName: "scripts") pod "00a88a6b-3101-40ec-90ee-62a00b5559a1" (UID: "00a88a6b-3101-40ec-90ee-62a00b5559a1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.810661 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "00a88a6b-3101-40ec-90ee-62a00b5559a1" (UID: "00a88a6b-3101-40ec-90ee-62a00b5559a1"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.832003 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b455c66-699c-4a18-91cb-0e8fca97642e-kube-api-access-4l6np" (OuterVolumeSpecName: "kube-api-access-4l6np") pod "7b455c66-699c-4a18-91cb-0e8fca97642e" (UID: "7b455c66-699c-4a18-91cb-0e8fca97642e"). InnerVolumeSpecName "kube-api-access-4l6np". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.839712 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-config-data" (OuterVolumeSpecName: "config-data") pod "00a88a6b-3101-40ec-90ee-62a00b5559a1" (UID: "00a88a6b-3101-40ec-90ee-62a00b5559a1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.840135 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "00a88a6b-3101-40ec-90ee-62a00b5559a1" (UID: "00a88a6b-3101-40ec-90ee-62a00b5559a1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:33 crc kubenswrapper[4973]: E0320 13:45:33.843461 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7 podName:7b455c66-699c-4a18-91cb-0e8fca97642e nodeName:}" failed. No retries permitted until 2026-03-20 13:45:34.343436371 +0000 UTC m=+1455.087106115 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "glance" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7") pod "7b455c66-699c-4a18-91cb-0e8fca97642e" (UID: "7b455c66-699c-4a18-91cb-0e8fca97642e") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.853726 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b455c66-699c-4a18-91cb-0e8fca97642e" (UID: "7b455c66-699c-4a18-91cb-0e8fca97642e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.881036 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7b455c66-699c-4a18-91cb-0e8fca97642e" (UID: "7b455c66-699c-4a18-91cb-0e8fca97642e"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.886834 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-config-data" (OuterVolumeSpecName: "config-data") pod "7b455c66-699c-4a18-91cb-0e8fca97642e" (UID: "7b455c66-699c-4a18-91cb-0e8fca97642e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.902647 4973 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.902679 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.902688 4973 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.902699 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:33 
crc kubenswrapper[4973]: I0320 13:45:33.902707 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l6np\" (UniqueName: \"kubernetes.io/projected/7b455c66-699c-4a18-91cb-0e8fca97642e-kube-api-access-4l6np\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.902715 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.902723 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.902732 4973 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.902739 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b455c66-699c-4a18-91cb-0e8fca97642e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.902747 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00a88a6b-3101-40ec-90ee-62a00b5559a1-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.902755 4973 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7b455c66-699c-4a18-91cb-0e8fca97642e-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.902762 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6chs\" (UniqueName: 
\"kubernetes.io/projected/00a88a6b-3101-40ec-90ee-62a00b5559a1-kube-api-access-d6chs\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:33 crc kubenswrapper[4973]: I0320 13:45:33.902770 4973 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7b455c66-699c-4a18-91cb-0e8fca97642e-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.080836 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.106199 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7dca93-568d-4f21-9162-21244ab6aa49-logs\") pod \"dc7dca93-568d-4f21-9162-21244ab6aa49\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.106488 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5k5d\" (UniqueName: \"kubernetes.io/projected/dc7dca93-568d-4f21-9162-21244ab6aa49-kube-api-access-j5k5d\") pod \"dc7dca93-568d-4f21-9162-21244ab6aa49\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.106533 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-scripts\") pod \"dc7dca93-568d-4f21-9162-21244ab6aa49\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.106638 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\") pod \"dc7dca93-568d-4f21-9162-21244ab6aa49\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " Mar 20 13:45:34 crc 
kubenswrapper[4973]: I0320 13:45:34.106800 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc7dca93-568d-4f21-9162-21244ab6aa49-httpd-run\") pod \"dc7dca93-568d-4f21-9162-21244ab6aa49\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.106846 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc7dca93-568d-4f21-9162-21244ab6aa49-logs" (OuterVolumeSpecName: "logs") pod "dc7dca93-568d-4f21-9162-21244ab6aa49" (UID: "dc7dca93-568d-4f21-9162-21244ab6aa49"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.106889 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-public-tls-certs\") pod \"dc7dca93-568d-4f21-9162-21244ab6aa49\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.106927 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-config-data\") pod \"dc7dca93-568d-4f21-9162-21244ab6aa49\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.107003 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-combined-ca-bundle\") pod \"dc7dca93-568d-4f21-9162-21244ab6aa49\" (UID: \"dc7dca93-568d-4f21-9162-21244ab6aa49\") " Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.107203 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/dc7dca93-568d-4f21-9162-21244ab6aa49-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "dc7dca93-568d-4f21-9162-21244ab6aa49" (UID: "dc7dca93-568d-4f21-9162-21244ab6aa49"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:45:34 crc kubenswrapper[4973]: E0320 13:45:34.107303 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified" Mar 20 13:45:34 crc kubenswrapper[4973]: E0320 13:45:34.107466 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vsksr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropa
gation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-znp6r_openstack(5d867119-66df-4aa7-a2dd-13d0d40ce2cc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.107526 4973 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/dc7dca93-568d-4f21-9162-21244ab6aa49-httpd-run\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.107541 4973 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/dc7dca93-568d-4f21-9162-21244ab6aa49-logs\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.110890 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-scripts" (OuterVolumeSpecName: "scripts") pod "dc7dca93-568d-4f21-9162-21244ab6aa49" (UID: "dc7dca93-568d-4f21-9162-21244ab6aa49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:34 crc kubenswrapper[4973]: E0320 13:45:34.112440 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-znp6r" podUID="5d867119-66df-4aa7-a2dd-13d0d40ce2cc"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.112642 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc7dca93-568d-4f21-9162-21244ab6aa49-kube-api-access-j5k5d" (OuterVolumeSpecName: "kube-api-access-j5k5d") pod "dc7dca93-568d-4f21-9162-21244ab6aa49" (UID: "dc7dca93-568d-4f21-9162-21244ab6aa49"). InnerVolumeSpecName "kube-api-access-j5k5d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.155790 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564" (OuterVolumeSpecName: "glance") pod "dc7dca93-568d-4f21-9162-21244ab6aa49" (UID: "dc7dca93-568d-4f21-9162-21244ab6aa49"). InnerVolumeSpecName "pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.177581 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dc7dca93-568d-4f21-9162-21244ab6aa49" (UID: "dc7dca93-568d-4f21-9162-21244ab6aa49"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.209262 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-config-data" (OuterVolumeSpecName: "config-data") pod "dc7dca93-568d-4f21-9162-21244ab6aa49" (UID: "dc7dca93-568d-4f21-9162-21244ab6aa49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.209699 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "dc7dca93-568d-4f21-9162-21244ab6aa49" (UID: "dc7dca93-568d-4f21-9162-21244ab6aa49"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.210188 4973 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-public-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.210232 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.210241 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.210250 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5k5d\" (UniqueName: \"kubernetes.io/projected/dc7dca93-568d-4f21-9162-21244ab6aa49-kube-api-access-j5k5d\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.210259 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dc7dca93-568d-4f21-9162-21244ab6aa49-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.210297 4973 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\") on node \"crc\" "
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.273920 4973 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.274159 4973 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564") on node "crc"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.312544 4973 reconciler_common.go:293] "Volume detached for volume \"pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.413933 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7\") pod \"7b455c66-699c-4a18-91cb-0e8fca97642e\" (UID: \"7b455c66-699c-4a18-91cb-0e8fca97642e\") "
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.439284 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7" (OuterVolumeSpecName: "glance") pod "7b455c66-699c-4a18-91cb-0e8fca97642e" (UID: "7b455c66-699c-4a18-91cb-0e8fca97642e"). InnerVolumeSpecName "pvc-4da075d1-9614-421d-9161-1d84cabf00c7". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.519667 4973 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4da075d1-9614-421d-9161-1d84cabf00c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7\") on node \"crc\" "
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.548590 4973 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.548733 4973 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4da075d1-9614-421d-9161-1d84cabf00c7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7") on node "crc"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.587464 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"dc7dca93-568d-4f21-9162-21244ab6aa49","Type":"ContainerDied","Data":"21a37fc255953b0b2257b0d4d5e72338ef76961cd7a3e792fae443c071948872"}
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.587529 4973 scope.go:117] "RemoveContainer" containerID="3fe0918ca34bacc5b730a8d4ec4b9207ea03b617344a8587056e8d4af34e44cc"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.587618 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-cnl48"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.587636 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.588239 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: E0320 13:45:34.589819 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-heat-engine:current-podified\\\"\"" pod="openstack/heat-db-sync-znp6r" podUID="5d867119-66df-4aa7-a2dd-13d0d40ce2cc"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.622193 4973 reconciler_common.go:293] "Volume detached for volume \"pvc-4da075d1-9614-421d-9161-1d84cabf00c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.674233 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.684756 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.713446 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.734667 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.751759 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 13:45:34 crc kubenswrapper[4973]: E0320 13:45:34.752427 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7dca93-568d-4f21-9162-21244ab6aa49" containerName="glance-log"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.752451 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7dca93-568d-4f21-9162-21244ab6aa49" containerName="glance-log"
Mar 20 13:45:34 crc kubenswrapper[4973]: E0320 13:45:34.752490 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00a88a6b-3101-40ec-90ee-62a00b5559a1" containerName="keystone-bootstrap"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.752499 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="00a88a6b-3101-40ec-90ee-62a00b5559a1" containerName="keystone-bootstrap"
Mar 20 13:45:34 crc kubenswrapper[4973]: E0320 13:45:34.752518 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b455c66-699c-4a18-91cb-0e8fca97642e" containerName="glance-log"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.752526 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b455c66-699c-4a18-91cb-0e8fca97642e" containerName="glance-log"
Mar 20 13:45:34 crc kubenswrapper[4973]: E0320 13:45:34.752549 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40e7d0d5-387a-4c17-9475-93998347341e" containerName="init"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.752557 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="40e7d0d5-387a-4c17-9475-93998347341e" containerName="init"
Mar 20 13:45:34 crc kubenswrapper[4973]: E0320 13:45:34.752592 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc7dca93-568d-4f21-9162-21244ab6aa49" containerName="glance-httpd"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.752600 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc7dca93-568d-4f21-9162-21244ab6aa49" containerName="glance-httpd"
Mar 20 13:45:34 crc kubenswrapper[4973]: E0320 13:45:34.752623 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b455c66-699c-4a18-91cb-0e8fca97642e" containerName="glance-httpd"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.752630 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b455c66-699c-4a18-91cb-0e8fca97642e" containerName="glance-httpd"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.752877 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="00a88a6b-3101-40ec-90ee-62a00b5559a1" containerName="keystone-bootstrap"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.752894 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b455c66-699c-4a18-91cb-0e8fca97642e" containerName="glance-httpd"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.752919 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b455c66-699c-4a18-91cb-0e8fca97642e" containerName="glance-log"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.752932 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="40e7d0d5-387a-4c17-9475-93998347341e" containerName="init"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.752947 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7dca93-568d-4f21-9162-21244ab6aa49" containerName="glance-httpd"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.752968 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc7dca93-568d-4f21-9162-21244ab6aa49" containerName="glance-log"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.755052 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.758068 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.758312 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.760422 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.760741 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-tjqdw"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.774473 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"]
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.792493 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.794569 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.797149 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.797412 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.806494 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"]
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.921142 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-cnl48"]
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.929781 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-logs\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.929839 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-scripts\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.929867 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/446ab543-b039-4065-964f-945824ddec63-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.929905 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmdlq\" (UniqueName: \"kubernetes.io/projected/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-kube-api-access-xmdlq\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.930008 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.930043 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/446ab543-b039-4065-964f-945824ddec63-logs\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.930139 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.930166 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.930189 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-config-data\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.930214 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.930233 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.930264 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.930295 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssctt\" (UniqueName: \"kubernetes.io/projected/446ab543-b039-4065-964f-945824ddec63-kube-api-access-ssctt\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.930378 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4da075d1-9614-421d-9161-1d84cabf00c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.930417 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.930468 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:34 crc kubenswrapper[4973]: I0320 13:45:34.935565 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-cnl48"]
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.028573 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-dr5z9"]
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.031936 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dr5z9"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.032660 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/446ab543-b039-4065-964f-945824ddec63-logs\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.032784 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.032817 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.032846 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-config-data\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.032874 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.032898 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.032932 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.032964 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssctt\" (UniqueName: \"kubernetes.io/projected/446ab543-b039-4065-964f-945824ddec63-kube-api-access-ssctt\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.033566 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/446ab543-b039-4065-964f-945824ddec63-logs\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.034071 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4da075d1-9614-421d-9161-1d84cabf00c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.034117 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.034152 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.034202 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-logs\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.034225 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-scripts\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.034240 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/446ab543-b039-4065-964f-945824ddec63-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.034270 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmdlq\" (UniqueName: \"kubernetes.io/projected/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-kube-api-access-xmdlq\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.034291 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.034899 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/446ab543-b039-4065-964f-945824ddec63-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.034950 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-logs\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.035234 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.039739 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.040956 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.041523 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.041868 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.041901 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.042077 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nvzpw"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.042237 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.042265 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4da075d1-9614-421d-9161-1d84cabf00c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5c263bf6e6632ecbb6820e7ea731ec6b961228752a610e87fa7ed7c4de533cfd/globalmount\"" pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.042303 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.042328 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/da3d9df538268a2045279e6c35b80a9f9f98a48c1b51b6173b49a5c36304d8af/globalmount\"" pod="openstack/glance-default-external-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.042243 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.045744 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-config-data\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.049358 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-scripts\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.050133 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.050635 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-scripts\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.052677 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.053617 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dr5z9"]
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.068144 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssctt\" (UniqueName: \"kubernetes.io/projected/446ab543-b039-4065-964f-945824ddec63-kube-api-access-ssctt\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.070879 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-config-data\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.072290 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmdlq\" (UniqueName: \"kubernetes.io/projected/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-kube-api-access-xmdlq\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.119766 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\") pod \"glance-default-external-api-0\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " pod="openstack/glance-default-external-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.128114 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4da075d1-9614-421d-9161-1d84cabf00c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7\") pod \"glance-default-internal-api-0\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.137355 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-scripts\") pod \"keystone-bootstrap-dr5z9\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") " pod="openstack/keystone-bootstrap-dr5z9"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.137907 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-credential-keys\") pod \"keystone-bootstrap-dr5z9\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") " pod="openstack/keystone-bootstrap-dr5z9"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.137974 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcn6j\" (UniqueName: \"kubernetes.io/projected/9bc32f3f-2c79-4331-aa41-47d648fc6499-kube-api-access-mcn6j\") pod \"keystone-bootstrap-dr5z9\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") " pod="openstack/keystone-bootstrap-dr5z9"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.138173 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-config-data\") pod \"keystone-bootstrap-dr5z9\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") " pod="openstack/keystone-bootstrap-dr5z9"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.138243 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-combined-ca-bundle\") pod \"keystone-bootstrap-dr5z9\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") " pod="openstack/keystone-bootstrap-dr5z9"
Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.138286 4973 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-fernet-keys\") pod \"keystone-bootstrap-dr5z9\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") " pod="openstack/keystone-bootstrap-dr5z9" Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.240676 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-config-data\") pod \"keystone-bootstrap-dr5z9\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") " pod="openstack/keystone-bootstrap-dr5z9" Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.240741 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-combined-ca-bundle\") pod \"keystone-bootstrap-dr5z9\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") " pod="openstack/keystone-bootstrap-dr5z9" Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.240770 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-fernet-keys\") pod \"keystone-bootstrap-dr5z9\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") " pod="openstack/keystone-bootstrap-dr5z9" Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.240851 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-scripts\") pod \"keystone-bootstrap-dr5z9\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") " pod="openstack/keystone-bootstrap-dr5z9" Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.240925 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-credential-keys\") pod \"keystone-bootstrap-dr5z9\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") " pod="openstack/keystone-bootstrap-dr5z9" Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.240946 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcn6j\" (UniqueName: \"kubernetes.io/projected/9bc32f3f-2c79-4331-aa41-47d648fc6499-kube-api-access-mcn6j\") pod \"keystone-bootstrap-dr5z9\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") " pod="openstack/keystone-bootstrap-dr5z9" Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.245462 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-scripts\") pod \"keystone-bootstrap-dr5z9\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") " pod="openstack/keystone-bootstrap-dr5z9" Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.246872 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-config-data\") pod \"keystone-bootstrap-dr5z9\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") " pod="openstack/keystone-bootstrap-dr5z9" Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.246910 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-combined-ca-bundle\") pod \"keystone-bootstrap-dr5z9\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") " pod="openstack/keystone-bootstrap-dr5z9" Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.247400 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-fernet-keys\") pod \"keystone-bootstrap-dr5z9\" (UID: 
\"9bc32f3f-2c79-4331-aa41-47d648fc6499\") " pod="openstack/keystone-bootstrap-dr5z9" Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.256939 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-credential-keys\") pod \"keystone-bootstrap-dr5z9\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") " pod="openstack/keystone-bootstrap-dr5z9" Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.265537 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcn6j\" (UniqueName: \"kubernetes.io/projected/9bc32f3f-2c79-4331-aa41-47d648fc6499-kube-api-access-mcn6j\") pod \"keystone-bootstrap-dr5z9\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") " pod="openstack/keystone-bootstrap-dr5z9" Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.384383 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.422703 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.472578 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-dr5z9" Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.965633 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00a88a6b-3101-40ec-90ee-62a00b5559a1" path="/var/lib/kubelet/pods/00a88a6b-3101-40ec-90ee-62a00b5559a1/volumes" Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.981831 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b455c66-699c-4a18-91cb-0e8fca97642e" path="/var/lib/kubelet/pods/7b455c66-699c-4a18-91cb-0e8fca97642e/volumes" Mar 20 13:45:35 crc kubenswrapper[4973]: I0320 13:45:35.983069 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc7dca93-568d-4f21-9162-21244ab6aa49" path="/var/lib/kubelet/pods/dc7dca93-568d-4f21-9162-21244ab6aa49/volumes" Mar 20 13:45:38 crc kubenswrapper[4973]: I0320 13:45:38.221422 4973 scope.go:117] "RemoveContainer" containerID="0c11382906a33a4d1a09b8ce97b60d980b2fb98cdca388f8ba9c13611675de20" Mar 20 13:45:43 crc kubenswrapper[4973]: I0320 13:45:43.558983 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz" podUID="86ba366f-247a-4630-8fa6-196198d8aec7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.171:5353: i/o timeout" Mar 20 13:45:43 crc kubenswrapper[4973]: E0320 13:45:43.732940 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Mar 20 13:45:43 crc kubenswrapper[4973]: E0320 13:45:43.733122 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n64ch68dh549h56bh5fh5fh5cdh64ch5c4h57fh5bh676h597h67fh66fh67h84h59dh4h587hc5h5ddh4h587h58dh59h5d9hc5h5cdh579h674h687q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-pt626,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(198d4fb1-4784-4553-b617-a2c33cec6df6): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:45:44 crc kubenswrapper[4973]: E0320 13:45:44.280134 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 20 13:45:44 crc kubenswrapper[4973]: E0320 13:45:44.280684 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2rb4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-lbsl6_openstack(92ef179e-0d37-4a3d-986f-5a4ea5bc5a23): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:45:44 crc kubenswrapper[4973]: E0320 13:45:44.282187 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-lbsl6" 
podUID="92ef179e-0d37-4a3d-986f-5a4ea5bc5a23" Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.395834 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz" Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.573643 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc7rh\" (UniqueName: \"kubernetes.io/projected/86ba366f-247a-4630-8fa6-196198d8aec7-kube-api-access-xc7rh\") pod \"86ba366f-247a-4630-8fa6-196198d8aec7\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.573868 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-dns-swift-storage-0\") pod \"86ba366f-247a-4630-8fa6-196198d8aec7\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.573934 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-dns-svc\") pod \"86ba366f-247a-4630-8fa6-196198d8aec7\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.574056 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-ovsdbserver-sb\") pod \"86ba366f-247a-4630-8fa6-196198d8aec7\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.574385 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-ovsdbserver-nb\") pod \"86ba366f-247a-4630-8fa6-196198d8aec7\" (UID: 
\"86ba366f-247a-4630-8fa6-196198d8aec7\") " Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.574422 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-config\") pod \"86ba366f-247a-4630-8fa6-196198d8aec7\" (UID: \"86ba366f-247a-4630-8fa6-196198d8aec7\") " Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.579067 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86ba366f-247a-4630-8fa6-196198d8aec7-kube-api-access-xc7rh" (OuterVolumeSpecName: "kube-api-access-xc7rh") pod "86ba366f-247a-4630-8fa6-196198d8aec7" (UID: "86ba366f-247a-4630-8fa6-196198d8aec7"). InnerVolumeSpecName "kube-api-access-xc7rh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.627684 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "86ba366f-247a-4630-8fa6-196198d8aec7" (UID: "86ba366f-247a-4630-8fa6-196198d8aec7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.631000 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "86ba366f-247a-4630-8fa6-196198d8aec7" (UID: "86ba366f-247a-4630-8fa6-196198d8aec7"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.640010 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "86ba366f-247a-4630-8fa6-196198d8aec7" (UID: "86ba366f-247a-4630-8fa6-196198d8aec7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.644211 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "86ba366f-247a-4630-8fa6-196198d8aec7" (UID: "86ba366f-247a-4630-8fa6-196198d8aec7"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.654749 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-config" (OuterVolumeSpecName: "config") pod "86ba366f-247a-4630-8fa6-196198d8aec7" (UID: "86ba366f-247a-4630-8fa6-196198d8aec7"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.677204 4973 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.677250 4973 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.677261 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.677269 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.677280 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86ba366f-247a-4630-8fa6-196198d8aec7-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.677289 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc7rh\" (UniqueName: \"kubernetes.io/projected/86ba366f-247a-4630-8fa6-196198d8aec7-kube-api-access-xc7rh\") on node \"crc\" DevicePath \"\"" Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.721551 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz" event={"ID":"86ba366f-247a-4630-8fa6-196198d8aec7","Type":"ContainerDied","Data":"ce4cc5705c246c0732cd3085297b51ce303e226385309cd32693c47f580867ad"} Mar 20 13:45:44 crc 
kubenswrapper[4973]: I0320 13:45:44.721581 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz" Mar 20 13:45:44 crc kubenswrapper[4973]: E0320 13:45:44.725033 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-lbsl6" podUID="92ef179e-0d37-4a3d-986f-5a4ea5bc5a23" Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.820486 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-wgtqz"] Mar 20 13:45:44 crc kubenswrapper[4973]: I0320 13:45:44.833302 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77585f5f8c-wgtqz"] Mar 20 13:45:45 crc kubenswrapper[4973]: E0320 13:45:45.924924 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 20 13:45:45 crc kubenswrapper[4973]: E0320 13:45:45.925575 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q8dvh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-zzsmb_openstack(6fec0901-00c6-410f-986c-4dcac4fe1359): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:45:45 crc kubenswrapper[4973]: E0320 13:45:45.926726 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-zzsmb" podUID="6fec0901-00c6-410f-986c-4dcac4fe1359" Mar 20 13:45:45 crc kubenswrapper[4973]: I0320 13:45:45.966808 4973 scope.go:117] "RemoveContainer" containerID="fcdcfae7c0422c8b17b5fabfc8996b5c4cfc2a9a66dcaac963eb6a59457c2644" Mar 20 13:45:45 crc kubenswrapper[4973]: I0320 13:45:45.970362 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86ba366f-247a-4630-8fa6-196198d8aec7" path="/var/lib/kubelet/pods/86ba366f-247a-4630-8fa6-196198d8aec7/volumes" Mar 20 13:45:46 crc kubenswrapper[4973]: I0320 13:45:46.539995 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:45:46 crc kubenswrapper[4973]: I0320 13:45:46.655773 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:45:46 crc kubenswrapper[4973]: W0320 13:45:46.658546 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod446ab543_b039_4065_964f_945824ddec63.slice/crio-087378d9e75af5932d6ffb6f2b09ab926ea359a7002e69608ba724c3dbcc2e78 WatchSource:0}: Error finding container 087378d9e75af5932d6ffb6f2b09ab926ea359a7002e69608ba724c3dbcc2e78: Status 404 returned error can't find the container with id 
087378d9e75af5932d6ffb6f2b09ab926ea359a7002e69608ba724c3dbcc2e78
Mar 20 13:45:46 crc kubenswrapper[4973]: I0320 13:45:46.740408 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"446ab543-b039-4065-964f-945824ddec63","Type":"ContainerStarted","Data":"087378d9e75af5932d6ffb6f2b09ab926ea359a7002e69608ba724c3dbcc2e78"}
Mar 20 13:45:46 crc kubenswrapper[4973]: I0320 13:45:46.808727 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-dr5z9"]
Mar 20 13:45:47 crc kubenswrapper[4973]: E0320 13:45:47.026146 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-zzsmb" podUID="6fec0901-00c6-410f-986c-4dcac4fe1359"
Mar 20 13:45:47 crc kubenswrapper[4973]: W0320 13:45:47.087191 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bc32f3f_2c79_4331_aa41_47d648fc6499.slice/crio-3c2dd293ee66a674468bf910da05e2d3a261736b884ea1038e447724aa8d86d3 WatchSource:0}: Error finding container 3c2dd293ee66a674468bf910da05e2d3a261736b884ea1038e447724aa8d86d3: Status 404 returned error can't find the container with id 3c2dd293ee66a674468bf910da05e2d3a261736b884ea1038e447724aa8d86d3
Mar 20 13:45:47 crc kubenswrapper[4973]: I0320 13:45:47.135161 4973 scope.go:117] "RemoveContainer" containerID="ecf6f5bd6ab1ed88304d3614f92656f95c74d4951b1d66a11edcc5ff9ac2c49f"
Mar 20 13:45:47 crc kubenswrapper[4973]: I0320 13:45:47.217574 4973 scope.go:117] "RemoveContainer" containerID="5b15ddb77213cb46099b1880baa1f7699211e31f7d30a7d59159593f8d8ffc93"
Mar 20 13:45:47 crc kubenswrapper[4973]: I0320 13:45:47.764893 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dr5z9" event={"ID":"9bc32f3f-2c79-4331-aa41-47d648fc6499","Type":"ContainerStarted","Data":"c74802542357b1e2931636b4d8c55aaf2be5116e981fc6e9cacf286b79a634fd"}
Mar 20 13:45:47 crc kubenswrapper[4973]: I0320 13:45:47.765361 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dr5z9" event={"ID":"9bc32f3f-2c79-4331-aa41-47d648fc6499","Type":"ContainerStarted","Data":"3c2dd293ee66a674468bf910da05e2d3a261736b884ea1038e447724aa8d86d3"}
Mar 20 13:45:47 crc kubenswrapper[4973]: I0320 13:45:47.767245 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2wgvh" event={"ID":"5bcf98ea-17cb-432f-8d35-18cc016401ed","Type":"ContainerStarted","Data":"da7adef8f2c02307e433d570bf8a3a9ccef94d506092b871aba2526cf05588b9"}
Mar 20 13:45:47 crc kubenswrapper[4973]: I0320 13:45:47.769873 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9befea21-7c31-4cd9-b5a2-2f86a1d32b28","Type":"ContainerStarted","Data":"437e3854f9943662a434b350517f6830327bba98666c5a03a7d1b3db2ecd96b7"}
Mar 20 13:45:47 crc kubenswrapper[4973]: I0320 13:45:47.780706 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"198d4fb1-4784-4553-b617-a2c33cec6df6","Type":"ContainerStarted","Data":"186ab78de0056ee24a059d1993ee768dc788875d32e78405f88b053e204b2bd0"}
Mar 20 13:45:47 crc kubenswrapper[4973]: I0320 13:45:47.792232 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-znp6r" event={"ID":"5d867119-66df-4aa7-a2dd-13d0d40ce2cc","Type":"ContainerStarted","Data":"17c6198576f37401cc560eb9a955f8ab065ebd950c3c257644e6a555cefe1935"}
Mar 20 13:45:47 crc kubenswrapper[4973]: I0320 13:45:47.798068 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-dr5z9" podStartSLOduration=12.798040408 podStartE2EDuration="12.798040408s" podCreationTimestamp="2026-03-20 13:45:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:47.790743769 +0000 UTC m=+1468.534413513" watchObservedRunningTime="2026-03-20 13:45:47.798040408 +0000 UTC m=+1468.541710152"
Mar 20 13:45:47 crc kubenswrapper[4973]: I0320 13:45:47.811886 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-2wgvh" podStartSLOduration=7.218903082 podStartE2EDuration="39.811863306s" podCreationTimestamp="2026-03-20 13:45:08 +0000 UTC" firstStartedPulling="2026-03-20 13:45:11.118678137 +0000 UTC m=+1431.862347871" lastFinishedPulling="2026-03-20 13:45:43.711638351 +0000 UTC m=+1464.455308095" observedRunningTime="2026-03-20 13:45:47.806357325 +0000 UTC m=+1468.550027069" watchObservedRunningTime="2026-03-20 13:45:47.811863306 +0000 UTC m=+1468.555533050"
Mar 20 13:45:47 crc kubenswrapper[4973]: I0320 13:45:47.840824 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-znp6r" podStartSLOduration=2.735802236 podStartE2EDuration="39.840805365s" podCreationTimestamp="2026-03-20 13:45:08 +0000 UTC" firstStartedPulling="2026-03-20 13:45:10.096521758 +0000 UTC m=+1430.840191502" lastFinishedPulling="2026-03-20 13:45:47.201524887 +0000 UTC m=+1467.945194631" observedRunningTime="2026-03-20 13:45:47.83544728 +0000 UTC m=+1468.579117024" watchObservedRunningTime="2026-03-20 13:45:47.840805365 +0000 UTC m=+1468.584475109"
Mar 20 13:45:48 crc kubenswrapper[4973]: I0320 13:45:48.560302 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-77585f5f8c-wgtqz" podUID="86ba366f-247a-4630-8fa6-196198d8aec7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.171:5353: i/o timeout"
Mar 20 13:45:48 crc kubenswrapper[4973]: I0320 13:45:48.833682 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9befea21-7c31-4cd9-b5a2-2f86a1d32b28","Type":"ContainerStarted","Data":"80e73f3dbaee92c1970189eab012b3d029931f9afa64eb568818b4b37843f08c"}
Mar 20 13:45:48 crc kubenswrapper[4973]: I0320 13:45:48.833994 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9befea21-7c31-4cd9-b5a2-2f86a1d32b28","Type":"ContainerStarted","Data":"29f1058286c4135a0040c93a2b493cec6fb2e8530a48378dd6d0e669e6e7f8c2"}
Mar 20 13:45:48 crc kubenswrapper[4973]: I0320 13:45:48.844030 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"446ab543-b039-4065-964f-945824ddec63","Type":"ContainerStarted","Data":"d089e46bbfe0c52652e56f493ae3b60a798e861828b80405b16d6c6a7ae423ae"}
Mar 20 13:45:48 crc kubenswrapper[4973]: I0320 13:45:48.844076 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"446ab543-b039-4065-964f-945824ddec63","Type":"ContainerStarted","Data":"250bc14e923b459bc4688d4c37932bb175e6fd2bd1c85de12c3664484335bf8e"}
Mar 20 13:45:48 crc kubenswrapper[4973]: I0320 13:45:48.858266 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=14.858246876 podStartE2EDuration="14.858246876s" podCreationTimestamp="2026-03-20 13:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:48.852493919 +0000 UTC m=+1469.596163663" watchObservedRunningTime="2026-03-20 13:45:48.858246876 +0000 UTC m=+1469.601916620"
Mar 20 13:45:53 crc kubenswrapper[4973]: I0320 13:45:53.919393 4973 generic.go:334] "Generic (PLEG): container finished" podID="9bc32f3f-2c79-4331-aa41-47d648fc6499" containerID="c74802542357b1e2931636b4d8c55aaf2be5116e981fc6e9cacf286b79a634fd" exitCode=0
Mar 20 13:45:53 crc kubenswrapper[4973]: I0320 13:45:53.920001 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dr5z9" event={"ID":"9bc32f3f-2c79-4331-aa41-47d648fc6499","Type":"ContainerDied","Data":"c74802542357b1e2931636b4d8c55aaf2be5116e981fc6e9cacf286b79a634fd"}
Mar 20 13:45:53 crc kubenswrapper[4973]: I0320 13:45:53.943382 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=19.943330312 podStartE2EDuration="19.943330312s" podCreationTimestamp="2026-03-20 13:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:45:48.892018978 +0000 UTC m=+1469.635688732" watchObservedRunningTime="2026-03-20 13:45:53.943330312 +0000 UTC m=+1474.687000056"
Mar 20 13:45:54 crc kubenswrapper[4973]: I0320 13:45:54.932250 4973 generic.go:334] "Generic (PLEG): container finished" podID="5bcf98ea-17cb-432f-8d35-18cc016401ed" containerID="da7adef8f2c02307e433d570bf8a3a9ccef94d506092b871aba2526cf05588b9" exitCode=0
Mar 20 13:45:54 crc kubenswrapper[4973]: I0320 13:45:54.932387 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2wgvh" event={"ID":"5bcf98ea-17cb-432f-8d35-18cc016401ed","Type":"ContainerDied","Data":"da7adef8f2c02307e433d570bf8a3a9ccef94d506092b871aba2526cf05588b9"}
Mar 20 13:45:55 crc kubenswrapper[4973]: I0320 13:45:55.385704 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:55 crc kubenswrapper[4973]: I0320 13:45:55.386074 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:55 crc kubenswrapper[4973]: I0320 13:45:55.423196 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 20 13:45:55 crc kubenswrapper[4973]: I0320 13:45:55.423241 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Mar 20 13:45:55 crc kubenswrapper[4973]: I0320 13:45:55.442262 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:55 crc kubenswrapper[4973]: I0320 13:45:55.444922 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:55 crc kubenswrapper[4973]: I0320 13:45:55.458114 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 20 13:45:55 crc kubenswrapper[4973]: I0320 13:45:55.498071 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Mar 20 13:45:55 crc kubenswrapper[4973]: I0320 13:45:55.949117 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 20 13:45:55 crc kubenswrapper[4973]: I0320 13:45:55.950144 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Mar 20 13:45:55 crc kubenswrapper[4973]: I0320 13:45:55.950414 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:55 crc kubenswrapper[4973]: I0320 13:45:55.969524 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Mar 20 13:45:57 crc kubenswrapper[4973]: I0320 13:45:57.973649 4973 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 13:45:57 crc kubenswrapper[4973]: I0320 13:45:57.974011 4973 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 13:45:57 crc kubenswrapper[4973]: I0320 13:45:57.975811 4973 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 13:45:57 crc kubenswrapper[4973]: I0320 13:45:57.975826 4973 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.756471 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dr5z9"
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.767885 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2wgvh"
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.833292 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-fernet-keys\") pod \"9bc32f3f-2c79-4331-aa41-47d648fc6499\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") "
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.833424 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-scripts\") pod \"9bc32f3f-2c79-4331-aa41-47d648fc6499\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") "
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.833505 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-credential-keys\") pod \"9bc32f3f-2c79-4331-aa41-47d648fc6499\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") "
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.833630 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-config-data\") pod \"9bc32f3f-2c79-4331-aa41-47d648fc6499\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") "
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.833714 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-combined-ca-bundle\") pod \"9bc32f3f-2c79-4331-aa41-47d648fc6499\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") "
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.833803 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcn6j\" (UniqueName: \"kubernetes.io/projected/9bc32f3f-2c79-4331-aa41-47d648fc6499-kube-api-access-mcn6j\") pod \"9bc32f3f-2c79-4331-aa41-47d648fc6499\" (UID: \"9bc32f3f-2c79-4331-aa41-47d648fc6499\") "
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.839729 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "9bc32f3f-2c79-4331-aa41-47d648fc6499" (UID: "9bc32f3f-2c79-4331-aa41-47d648fc6499"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.840148 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-scripts" (OuterVolumeSpecName: "scripts") pod "9bc32f3f-2c79-4331-aa41-47d648fc6499" (UID: "9bc32f3f-2c79-4331-aa41-47d648fc6499"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.842620 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bc32f3f-2c79-4331-aa41-47d648fc6499-kube-api-access-mcn6j" (OuterVolumeSpecName: "kube-api-access-mcn6j") pod "9bc32f3f-2c79-4331-aa41-47d648fc6499" (UID: "9bc32f3f-2c79-4331-aa41-47d648fc6499"). InnerVolumeSpecName "kube-api-access-mcn6j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.849462 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "9bc32f3f-2c79-4331-aa41-47d648fc6499" (UID: "9bc32f3f-2c79-4331-aa41-47d648fc6499"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.868519 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-config-data" (OuterVolumeSpecName: "config-data") pod "9bc32f3f-2c79-4331-aa41-47d648fc6499" (UID: "9bc32f3f-2c79-4331-aa41-47d648fc6499"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.869833 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bc32f3f-2c79-4331-aa41-47d648fc6499" (UID: "9bc32f3f-2c79-4331-aa41-47d648fc6499"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.935422 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bcf98ea-17cb-432f-8d35-18cc016401ed-scripts\") pod \"5bcf98ea-17cb-432f-8d35-18cc016401ed\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") "
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.936217 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dll9m\" (UniqueName: \"kubernetes.io/projected/5bcf98ea-17cb-432f-8d35-18cc016401ed-kube-api-access-dll9m\") pod \"5bcf98ea-17cb-432f-8d35-18cc016401ed\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") "
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.936373 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bcf98ea-17cb-432f-8d35-18cc016401ed-config-data\") pod \"5bcf98ea-17cb-432f-8d35-18cc016401ed\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") "
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.936608 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bcf98ea-17cb-432f-8d35-18cc016401ed-logs\") pod \"5bcf98ea-17cb-432f-8d35-18cc016401ed\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") "
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.936717 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bcf98ea-17cb-432f-8d35-18cc016401ed-combined-ca-bundle\") pod \"5bcf98ea-17cb-432f-8d35-18cc016401ed\" (UID: \"5bcf98ea-17cb-432f-8d35-18cc016401ed\") "
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.937310 4973 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-credential-keys\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.937418 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.937485 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.937547 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcn6j\" (UniqueName: \"kubernetes.io/projected/9bc32f3f-2c79-4331-aa41-47d648fc6499-kube-api-access-mcn6j\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.937603 4973 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-fernet-keys\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.937668 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bc32f3f-2c79-4331-aa41-47d648fc6499-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.938808 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bcf98ea-17cb-432f-8d35-18cc016401ed-logs" (OuterVolumeSpecName: "logs") pod "5bcf98ea-17cb-432f-8d35-18cc016401ed" (UID: "5bcf98ea-17cb-432f-8d35-18cc016401ed"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.942000 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bcf98ea-17cb-432f-8d35-18cc016401ed-scripts" (OuterVolumeSpecName: "scripts") pod "5bcf98ea-17cb-432f-8d35-18cc016401ed" (UID: "5bcf98ea-17cb-432f-8d35-18cc016401ed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.942682 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bcf98ea-17cb-432f-8d35-18cc016401ed-kube-api-access-dll9m" (OuterVolumeSpecName: "kube-api-access-dll9m") pod "5bcf98ea-17cb-432f-8d35-18cc016401ed" (UID: "5bcf98ea-17cb-432f-8d35-18cc016401ed"). InnerVolumeSpecName "kube-api-access-dll9m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.979379 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bcf98ea-17cb-432f-8d35-18cc016401ed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bcf98ea-17cb-432f-8d35-18cc016401ed" (UID: "5bcf98ea-17cb-432f-8d35-18cc016401ed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.987867 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bcf98ea-17cb-432f-8d35-18cc016401ed-config-data" (OuterVolumeSpecName: "config-data") pod "5bcf98ea-17cb-432f-8d35-18cc016401ed" (UID: "5bcf98ea-17cb-432f-8d35-18cc016401ed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.991309 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-dr5z9" event={"ID":"9bc32f3f-2c79-4331-aa41-47d648fc6499","Type":"ContainerDied","Data":"3c2dd293ee66a674468bf910da05e2d3a261736b884ea1038e447724aa8d86d3"}
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.991379 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c2dd293ee66a674468bf910da05e2d3a261736b884ea1038e447724aa8d86d3"
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.991443 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-dr5z9"
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.998512 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-2wgvh" event={"ID":"5bcf98ea-17cb-432f-8d35-18cc016401ed","Type":"ContainerDied","Data":"a9b71dad3317467be88db6ab222a7c3e5a209d1e9298bfa1c99fa5f8807dbbb0"}
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.998558 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9b71dad3317467be88db6ab222a7c3e5a209d1e9298bfa1c99fa5f8807dbbb0"
Mar 20 13:45:58 crc kubenswrapper[4973]: I0320 13:45:58.998615 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-2wgvh"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.047906 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5bcf98ea-17cb-432f-8d35-18cc016401ed-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.047945 4973 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5bcf98ea-17cb-432f-8d35-18cc016401ed-logs\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.047958 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bcf98ea-17cb-432f-8d35-18cc016401ed-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.047970 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5bcf98ea-17cb-432f-8d35-18cc016401ed-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.047982 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dll9m\" (UniqueName: \"kubernetes.io/projected/5bcf98ea-17cb-432f-8d35-18cc016401ed-kube-api-access-dll9m\") on node \"crc\" DevicePath \"\""
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.883478 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7b85d6bdf6-j78f6"]
Mar 20 13:45:59 crc kubenswrapper[4973]: E0320 13:45:59.884484 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ba366f-247a-4630-8fa6-196198d8aec7" containerName="init"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.884508 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ba366f-247a-4630-8fa6-196198d8aec7" containerName="init"
Mar 20 13:45:59 crc kubenswrapper[4973]: E0320 13:45:59.884551 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bc32f3f-2c79-4331-aa41-47d648fc6499" containerName="keystone-bootstrap"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.884561 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bc32f3f-2c79-4331-aa41-47d648fc6499" containerName="keystone-bootstrap"
Mar 20 13:45:59 crc kubenswrapper[4973]: E0320 13:45:59.884585 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bcf98ea-17cb-432f-8d35-18cc016401ed" containerName="placement-db-sync"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.884594 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcf98ea-17cb-432f-8d35-18cc016401ed" containerName="placement-db-sync"
Mar 20 13:45:59 crc kubenswrapper[4973]: E0320 13:45:59.884607 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ba366f-247a-4630-8fa6-196198d8aec7" containerName="dnsmasq-dns"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.884614 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ba366f-247a-4630-8fa6-196198d8aec7" containerName="dnsmasq-dns"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.884899 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bcf98ea-17cb-432f-8d35-18cc016401ed" containerName="placement-db-sync"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.884924 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bc32f3f-2c79-4331-aa41-47d648fc6499" containerName="keystone-bootstrap"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.884945 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ba366f-247a-4630-8fa6-196198d8aec7" containerName="dnsmasq-dns"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.885984 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.891756 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.892424 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.892676 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.892832 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.893535 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.897108 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-nvzpw"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.905555 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7b85d6bdf6-j78f6"]
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.967925 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-combined-ca-bundle\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.968064 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-fernet-keys\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.968106 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-internal-tls-certs\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.968162 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-credential-keys\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.968243 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-public-tls-certs\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.968301 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dh2lw\" (UniqueName: \"kubernetes.io/projected/47ee1828-62b2-46d7-9225-61a4725bd6a6-kube-api-access-dh2lw\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.968362 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-scripts\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:45:59 crc kubenswrapper[4973]: I0320 13:45:59.969541 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-config-data\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.014017 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d746875b8-pt6tm"]
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.015998 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d746875b8-pt6tm"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.018704 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-n6z9b"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.018903 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.019066 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.019216 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.019417 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.025633 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"198d4fb1-4784-4553-b617-a2c33cec6df6","Type":"ContainerStarted","Data":"b790e51150b683894674eb14c8dce56ef6218db9eed72c18e694cb0087239c6c"}
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.071661 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-fernet-keys\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.071751 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-internal-tls-certs\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.071788 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-credential-keys\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.071934 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-public-tls-certs\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.072011 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dh2lw\" (UniqueName: \"kubernetes.io/projected/47ee1828-62b2-46d7-9225-61a4725bd6a6-kube-api-access-dh2lw\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.072082 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-scripts\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.072168 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-config-data\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.072232 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-combined-ca-bundle\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.081820 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-combined-ca-bundle\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.083955 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-scripts\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.088208 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-public-tls-certs\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.092383 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-fernet-keys\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.098228 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-internal-tls-certs\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.099140 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-config-data\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.116405 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/47ee1828-62b2-46d7-9225-61a4725bd6a6-credential-keys\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.123987 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d746875b8-pt6tm"]
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.124962 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dh2lw\" (UniqueName: \"kubernetes.io/projected/47ee1828-62b2-46d7-9225-61a4725bd6a6-kube-api-access-dh2lw\") pod \"keystone-7b85d6bdf6-j78f6\" (UID: \"47ee1828-62b2-46d7-9225-61a4725bd6a6\") " pod="openstack/keystone-7b85d6bdf6-j78f6"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.181237 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llnq7\" (UniqueName: \"kubernetes.io/projected/260048b8-b24c-48f1-bfb4-12b8936a1249-kube-api-access-llnq7\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.181639 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-internal-tls-certs\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.181807 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-combined-ca-bundle\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.181927 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-public-tls-certs\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm"
Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.182133 4973 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/260048b8-b24c-48f1-bfb4-12b8936a1249-logs\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.182278 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-scripts\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.182445 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-config-data\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.209948 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7b85d6bdf6-j78f6" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.225645 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566906-smfqr"] Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.227613 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-smfqr" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.236731 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.237288 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.237459 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.260303 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-smfqr"] Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.273598 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-85f659f75b-sh7fp"] Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.275888 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.289107 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/260048b8-b24c-48f1-bfb4-12b8936a1249-logs\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.289158 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-scripts\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.289198 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-config-data\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.289272 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bj96b\" (UniqueName: \"kubernetes.io/projected/1712bc13-36c4-4c56-b652-e0a0bd194179-kube-api-access-bj96b\") pod \"auto-csr-approver-29566906-smfqr\" (UID: \"1712bc13-36c4-4c56-b652-e0a0bd194179\") " pod="openshift-infra/auto-csr-approver-29566906-smfqr" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.289328 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llnq7\" (UniqueName: \"kubernetes.io/projected/260048b8-b24c-48f1-bfb4-12b8936a1249-kube-api-access-llnq7\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " 
pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.289459 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-internal-tls-certs\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.289492 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-combined-ca-bundle\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.289521 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-public-tls-certs\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.295157 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/260048b8-b24c-48f1-bfb4-12b8936a1249-logs\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.297236 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-public-tls-certs\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.299515 
4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-scripts\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.301477 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-internal-tls-certs\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.301492 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-config-data\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.304286 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-combined-ca-bundle\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.319425 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-85f659f75b-sh7fp"] Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.330264 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llnq7\" (UniqueName: \"kubernetes.io/projected/260048b8-b24c-48f1-bfb4-12b8936a1249-kube-api-access-llnq7\") pod \"placement-d746875b8-pt6tm\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:00 crc 
kubenswrapper[4973]: I0320 13:46:00.379104 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.391512 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fff088b0-c9e8-47de-a698-80bcebcfd1a4-scripts\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.391601 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fff088b0-c9e8-47de-a698-80bcebcfd1a4-logs\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.391733 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fff088b0-c9e8-47de-a698-80bcebcfd1a4-public-tls-certs\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.391966 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bj96b\" (UniqueName: \"kubernetes.io/projected/1712bc13-36c4-4c56-b652-e0a0bd194179-kube-api-access-bj96b\") pod \"auto-csr-approver-29566906-smfqr\" (UID: \"1712bc13-36c4-4c56-b652-e0a0bd194179\") " pod="openshift-infra/auto-csr-approver-29566906-smfqr" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.392007 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l77x\" (UniqueName: 
\"kubernetes.io/projected/fff088b0-c9e8-47de-a698-80bcebcfd1a4-kube-api-access-2l77x\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.392068 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fff088b0-c9e8-47de-a698-80bcebcfd1a4-config-data\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.392102 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fff088b0-c9e8-47de-a698-80bcebcfd1a4-combined-ca-bundle\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.392139 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fff088b0-c9e8-47de-a698-80bcebcfd1a4-internal-tls-certs\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.418102 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bj96b\" (UniqueName: \"kubernetes.io/projected/1712bc13-36c4-4c56-b652-e0a0bd194179-kube-api-access-bj96b\") pod \"auto-csr-approver-29566906-smfqr\" (UID: \"1712bc13-36c4-4c56-b652-e0a0bd194179\") " pod="openshift-infra/auto-csr-approver-29566906-smfqr" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.421404 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-smfqr" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.505555 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l77x\" (UniqueName: \"kubernetes.io/projected/fff088b0-c9e8-47de-a698-80bcebcfd1a4-kube-api-access-2l77x\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.505947 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fff088b0-c9e8-47de-a698-80bcebcfd1a4-config-data\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.505981 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fff088b0-c9e8-47de-a698-80bcebcfd1a4-combined-ca-bundle\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.506020 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fff088b0-c9e8-47de-a698-80bcebcfd1a4-internal-tls-certs\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.506640 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fff088b0-c9e8-47de-a698-80bcebcfd1a4-scripts\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " 
pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.506921 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fff088b0-c9e8-47de-a698-80bcebcfd1a4-logs\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.507113 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fff088b0-c9e8-47de-a698-80bcebcfd1a4-public-tls-certs\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.507977 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fff088b0-c9e8-47de-a698-80bcebcfd1a4-logs\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.516527 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fff088b0-c9e8-47de-a698-80bcebcfd1a4-config-data\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.524014 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fff088b0-c9e8-47de-a698-80bcebcfd1a4-scripts\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.524588 4973 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fff088b0-c9e8-47de-a698-80bcebcfd1a4-combined-ca-bundle\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.528802 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fff088b0-c9e8-47de-a698-80bcebcfd1a4-internal-tls-certs\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.529362 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fff088b0-c9e8-47de-a698-80bcebcfd1a4-public-tls-certs\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.535075 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l77x\" (UniqueName: \"kubernetes.io/projected/fff088b0-c9e8-47de-a698-80bcebcfd1a4-kube-api-access-2l77x\") pod \"placement-85f659f75b-sh7fp\" (UID: \"fff088b0-c9e8-47de-a698-80bcebcfd1a4\") " pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.752554 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:00 crc kubenswrapper[4973]: I0320 13:46:00.863053 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7b85d6bdf6-j78f6"] Mar 20 13:46:01 crc kubenswrapper[4973]: I0320 13:46:01.048350 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b85d6bdf6-j78f6" event={"ID":"47ee1828-62b2-46d7-9225-61a4725bd6a6","Type":"ContainerStarted","Data":"8ce6aa69c803d01e797ee3e2479d464cff420f7ebcfe581e0508bd7aaa787c57"} Mar 20 13:46:01 crc kubenswrapper[4973]: I0320 13:46:01.147082 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d746875b8-pt6tm"] Mar 20 13:46:01 crc kubenswrapper[4973]: I0320 13:46:01.219141 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-smfqr"] Mar 20 13:46:01 crc kubenswrapper[4973]: W0320 13:46:01.358623 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1712bc13_36c4_4c56_b652_e0a0bd194179.slice/crio-f34f8a99aab39622e212d6716a445e5965895c17d3213a11ab4e044a7deede43 WatchSource:0}: Error finding container f34f8a99aab39622e212d6716a445e5965895c17d3213a11ab4e044a7deede43: Status 404 returned error can't find the container with id f34f8a99aab39622e212d6716a445e5965895c17d3213a11ab4e044a7deede43 Mar 20 13:46:01 crc kubenswrapper[4973]: W0320 13:46:01.486033 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfff088b0_c9e8_47de_a698_80bcebcfd1a4.slice/crio-b9309724b7a6bd66c969550ef86a9ef6192a555197dd18a3e5f3d8f59ded496a WatchSource:0}: Error finding container b9309724b7a6bd66c969550ef86a9ef6192a555197dd18a3e5f3d8f59ded496a: Status 404 returned error can't find the container with id b9309724b7a6bd66c969550ef86a9ef6192a555197dd18a3e5f3d8f59ded496a Mar 20 13:46:01 crc kubenswrapper[4973]: 
I0320 13:46:01.488926 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-85f659f75b-sh7fp"] Mar 20 13:46:02 crc kubenswrapper[4973]: I0320 13:46:02.086298 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-85f659f75b-sh7fp" event={"ID":"fff088b0-c9e8-47de-a698-80bcebcfd1a4","Type":"ContainerStarted","Data":"a1704542bb3256ed059315caaccae4fa11a64730584a78ca7b5a03d764df2a9f"} Mar 20 13:46:02 crc kubenswrapper[4973]: I0320 13:46:02.086658 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-85f659f75b-sh7fp" event={"ID":"fff088b0-c9e8-47de-a698-80bcebcfd1a4","Type":"ContainerStarted","Data":"b9309724b7a6bd66c969550ef86a9ef6192a555197dd18a3e5f3d8f59ded496a"} Mar 20 13:46:02 crc kubenswrapper[4973]: I0320 13:46:02.099267 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7b85d6bdf6-j78f6" event={"ID":"47ee1828-62b2-46d7-9225-61a4725bd6a6","Type":"ContainerStarted","Data":"f64c033e7ccdc48298b9cbbf80837b57960b24f0e36016ff72a3cf999017fbaf"} Mar 20 13:46:02 crc kubenswrapper[4973]: I0320 13:46:02.099948 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-7b85d6bdf6-j78f6" Mar 20 13:46:02 crc kubenswrapper[4973]: I0320 13:46:02.106477 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566906-smfqr" event={"ID":"1712bc13-36c4-4c56-b652-e0a0bd194179","Type":"ContainerStarted","Data":"f34f8a99aab39622e212d6716a445e5965895c17d3213a11ab4e044a7deede43"} Mar 20 13:46:02 crc kubenswrapper[4973]: I0320 13:46:02.110496 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lbsl6" event={"ID":"92ef179e-0d37-4a3d-986f-5a4ea5bc5a23","Type":"ContainerStarted","Data":"43d8c42eff4680a96454959aeea1088a03cbfbd6f3c3956b87bf4b94480a0378"} Mar 20 13:46:02 crc kubenswrapper[4973]: I0320 13:46:02.127142 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-d746875b8-pt6tm" event={"ID":"260048b8-b24c-48f1-bfb4-12b8936a1249","Type":"ContainerStarted","Data":"4d3077f09f571751f3f39f9330b353cc1cbc495f8ed9f9f2608e1a2965abc600"} Mar 20 13:46:02 crc kubenswrapper[4973]: I0320 13:46:02.127189 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d746875b8-pt6tm" event={"ID":"260048b8-b24c-48f1-bfb4-12b8936a1249","Type":"ContainerStarted","Data":"cae84c01af21e313b77c50636744c9b797e8d4a2c0a8f4f0ff62120017397ed7"} Mar 20 13:46:02 crc kubenswrapper[4973]: I0320 13:46:02.143472 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-lbsl6" podStartSLOduration=4.235313676 podStartE2EDuration="54.143456032s" podCreationTimestamp="2026-03-20 13:45:08 +0000 UTC" firstStartedPulling="2026-03-20 13:45:10.970550584 +0000 UTC m=+1431.714220328" lastFinishedPulling="2026-03-20 13:46:00.87869294 +0000 UTC m=+1481.622362684" observedRunningTime="2026-03-20 13:46:02.141785305 +0000 UTC m=+1482.885455049" watchObservedRunningTime="2026-03-20 13:46:02.143456032 +0000 UTC m=+1482.887125766" Mar 20 13:46:02 crc kubenswrapper[4973]: I0320 13:46:02.145675 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7b85d6bdf6-j78f6" podStartSLOduration=3.145662572 podStartE2EDuration="3.145662572s" podCreationTimestamp="2026-03-20 13:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:02.120598037 +0000 UTC m=+1482.864267781" watchObservedRunningTime="2026-03-20 13:46:02.145662572 +0000 UTC m=+1482.889332316" Mar 20 13:46:02 crc kubenswrapper[4973]: I0320 13:46:02.557477 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:46:02 crc kubenswrapper[4973]: I0320 13:46:02.557852 4973 prober_manager.go:312] "Failed to trigger a manual run" 
probe="Readiness" Mar 20 13:46:02 crc kubenswrapper[4973]: I0320 13:46:02.562447 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:02 crc kubenswrapper[4973]: I0320 13:46:02.562579 4973 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:46:02 crc kubenswrapper[4973]: I0320 13:46:02.605953 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:46:02 crc kubenswrapper[4973]: I0320 13:46:02.615165 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:46:03 crc kubenswrapper[4973]: I0320 13:46:03.145624 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d746875b8-pt6tm" event={"ID":"260048b8-b24c-48f1-bfb4-12b8936a1249","Type":"ContainerStarted","Data":"510b8b9ee83268c66c5d91a566cdb00d5c413d265f5a1a78c47471f704b4d089"} Mar 20 13:46:03 crc kubenswrapper[4973]: I0320 13:46:03.146827 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:03 crc kubenswrapper[4973]: I0320 13:46:03.146874 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:03 crc kubenswrapper[4973]: I0320 13:46:03.164327 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-85f659f75b-sh7fp" event={"ID":"fff088b0-c9e8-47de-a698-80bcebcfd1a4","Type":"ContainerStarted","Data":"e066561d634c78022909294fe7329e5fe76b788f465e58320b31ced8612f146a"} Mar 20 13:46:03 crc kubenswrapper[4973]: I0320 13:46:03.164800 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:03 crc kubenswrapper[4973]: I0320 13:46:03.164818 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:03 crc kubenswrapper[4973]: I0320 13:46:03.267419 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-85f659f75b-sh7fp" podStartSLOduration=3.267390269 podStartE2EDuration="3.267390269s" podCreationTimestamp="2026-03-20 13:46:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:03.235453007 +0000 UTC m=+1483.979122771" watchObservedRunningTime="2026-03-20 13:46:03.267390269 +0000 UTC m=+1484.011060043" Mar 20 13:46:03 crc kubenswrapper[4973]: I0320 13:46:03.313570 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-d746875b8-pt6tm" podStartSLOduration=4.313539888 podStartE2EDuration="4.313539888s" podCreationTimestamp="2026-03-20 13:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:03.206291422 +0000 UTC m=+1483.949961166" watchObservedRunningTime="2026-03-20 13:46:03.313539888 +0000 UTC m=+1484.057209642" Mar 20 13:46:05 crc kubenswrapper[4973]: I0320 13:46:05.236039 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zzsmb" event={"ID":"6fec0901-00c6-410f-986c-4dcac4fe1359","Type":"ContainerStarted","Data":"cbaf7d0cf8de8e4a6cd3237e1a32f4af31c3690957694832e4bc0a512ce3bd31"} Mar 20 13:46:05 crc kubenswrapper[4973]: I0320 13:46:05.239309 4973 generic.go:334] "Generic (PLEG): container finished" podID="1712bc13-36c4-4c56-b652-e0a0bd194179" containerID="27abd357ed00f4bfa2fb6b2df72ead92b99d556f8566162efdcac6b4a3e05553" exitCode=0 Mar 20 13:46:05 crc kubenswrapper[4973]: I0320 13:46:05.239387 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566906-smfqr" 
event={"ID":"1712bc13-36c4-4c56-b652-e0a0bd194179","Type":"ContainerDied","Data":"27abd357ed00f4bfa2fb6b2df72ead92b99d556f8566162efdcac6b4a3e05553"} Mar 20 13:46:05 crc kubenswrapper[4973]: I0320 13:46:05.242148 4973 generic.go:334] "Generic (PLEG): container finished" podID="5d867119-66df-4aa7-a2dd-13d0d40ce2cc" containerID="17c6198576f37401cc560eb9a955f8ab065ebd950c3c257644e6a555cefe1935" exitCode=0 Mar 20 13:46:05 crc kubenswrapper[4973]: I0320 13:46:05.242209 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-znp6r" event={"ID":"5d867119-66df-4aa7-a2dd-13d0d40ce2cc","Type":"ContainerDied","Data":"17c6198576f37401cc560eb9a955f8ab065ebd950c3c257644e6a555cefe1935"} Mar 20 13:46:05 crc kubenswrapper[4973]: I0320 13:46:05.267268 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-zzsmb" podStartSLOduration=3.954914272 podStartE2EDuration="57.267244195s" podCreationTimestamp="2026-03-20 13:45:08 +0000 UTC" firstStartedPulling="2026-03-20 13:45:10.373862917 +0000 UTC m=+1431.117532661" lastFinishedPulling="2026-03-20 13:46:03.68619283 +0000 UTC m=+1484.429862584" observedRunningTime="2026-03-20 13:46:05.260669495 +0000 UTC m=+1486.004339239" watchObservedRunningTime="2026-03-20 13:46:05.267244195 +0000 UTC m=+1486.010913939" Mar 20 13:46:11 crc kubenswrapper[4973]: I0320 13:46:11.158948 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-znp6r" Mar 20 13:46:11 crc kubenswrapper[4973]: I0320 13:46:11.258052 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d867119-66df-4aa7-a2dd-13d0d40ce2cc-config-data\") pod \"5d867119-66df-4aa7-a2dd-13d0d40ce2cc\" (UID: \"5d867119-66df-4aa7-a2dd-13d0d40ce2cc\") " Mar 20 13:46:11 crc kubenswrapper[4973]: I0320 13:46:11.258185 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d867119-66df-4aa7-a2dd-13d0d40ce2cc-combined-ca-bundle\") pod \"5d867119-66df-4aa7-a2dd-13d0d40ce2cc\" (UID: \"5d867119-66df-4aa7-a2dd-13d0d40ce2cc\") " Mar 20 13:46:11 crc kubenswrapper[4973]: I0320 13:46:11.258219 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsksr\" (UniqueName: \"kubernetes.io/projected/5d867119-66df-4aa7-a2dd-13d0d40ce2cc-kube-api-access-vsksr\") pod \"5d867119-66df-4aa7-a2dd-13d0d40ce2cc\" (UID: \"5d867119-66df-4aa7-a2dd-13d0d40ce2cc\") " Mar 20 13:46:11 crc kubenswrapper[4973]: I0320 13:46:11.263745 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d867119-66df-4aa7-a2dd-13d0d40ce2cc-kube-api-access-vsksr" (OuterVolumeSpecName: "kube-api-access-vsksr") pod "5d867119-66df-4aa7-a2dd-13d0d40ce2cc" (UID: "5d867119-66df-4aa7-a2dd-13d0d40ce2cc"). InnerVolumeSpecName "kube-api-access-vsksr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:11 crc kubenswrapper[4973]: I0320 13:46:11.291395 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d867119-66df-4aa7-a2dd-13d0d40ce2cc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d867119-66df-4aa7-a2dd-13d0d40ce2cc" (UID: "5d867119-66df-4aa7-a2dd-13d0d40ce2cc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:11 crc kubenswrapper[4973]: I0320 13:46:11.323715 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-znp6r" event={"ID":"5d867119-66df-4aa7-a2dd-13d0d40ce2cc","Type":"ContainerDied","Data":"1e7fce08aeb554166dadf690d74f9ad7a54e13ee2073601803431c8a10c95be1"} Mar 20 13:46:11 crc kubenswrapper[4973]: I0320 13:46:11.323762 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e7fce08aeb554166dadf690d74f9ad7a54e13ee2073601803431c8a10c95be1" Mar 20 13:46:11 crc kubenswrapper[4973]: I0320 13:46:11.323808 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-znp6r" Mar 20 13:46:11 crc kubenswrapper[4973]: I0320 13:46:11.343444 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d867119-66df-4aa7-a2dd-13d0d40ce2cc-config-data" (OuterVolumeSpecName: "config-data") pod "5d867119-66df-4aa7-a2dd-13d0d40ce2cc" (UID: "5d867119-66df-4aa7-a2dd-13d0d40ce2cc"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:11 crc kubenswrapper[4973]: I0320 13:46:11.360558 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d867119-66df-4aa7-a2dd-13d0d40ce2cc-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:11 crc kubenswrapper[4973]: I0320 13:46:11.360603 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d867119-66df-4aa7-a2dd-13d0d40ce2cc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:11 crc kubenswrapper[4973]: I0320 13:46:11.360623 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsksr\" (UniqueName: \"kubernetes.io/projected/5d867119-66df-4aa7-a2dd-13d0d40ce2cc-kube-api-access-vsksr\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:11 crc kubenswrapper[4973]: I0320 13:46:11.618153 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-smfqr" Mar 20 13:46:11 crc kubenswrapper[4973]: I0320 13:46:11.666798 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bj96b\" (UniqueName: \"kubernetes.io/projected/1712bc13-36c4-4c56-b652-e0a0bd194179-kube-api-access-bj96b\") pod \"1712bc13-36c4-4c56-b652-e0a0bd194179\" (UID: \"1712bc13-36c4-4c56-b652-e0a0bd194179\") " Mar 20 13:46:11 crc kubenswrapper[4973]: I0320 13:46:11.670125 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1712bc13-36c4-4c56-b652-e0a0bd194179-kube-api-access-bj96b" (OuterVolumeSpecName: "kube-api-access-bj96b") pod "1712bc13-36c4-4c56-b652-e0a0bd194179" (UID: "1712bc13-36c4-4c56-b652-e0a0bd194179"). InnerVolumeSpecName "kube-api-access-bj96b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:11 crc kubenswrapper[4973]: I0320 13:46:11.770188 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bj96b\" (UniqueName: \"kubernetes.io/projected/1712bc13-36c4-4c56-b652-e0a0bd194179-kube-api-access-bj96b\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:12 crc kubenswrapper[4973]: E0320 13:46:12.018700 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="198d4fb1-4784-4553-b617-a2c33cec6df6" Mar 20 13:46:12 crc kubenswrapper[4973]: I0320 13:46:12.339615 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"198d4fb1-4784-4553-b617-a2c33cec6df6","Type":"ContainerStarted","Data":"8ee7685b7b7a2a052e9c22d063d9f4ce40c031303874ebc785da73b46697b36d"} Mar 20 13:46:12 crc kubenswrapper[4973]: I0320 13:46:12.339700 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="198d4fb1-4784-4553-b617-a2c33cec6df6" containerName="ceilometer-notification-agent" containerID="cri-o://186ab78de0056ee24a059d1993ee768dc788875d32e78405f88b053e204b2bd0" gracePeriod=30 Mar 20 13:46:12 crc kubenswrapper[4973]: I0320 13:46:12.339753 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:46:12 crc kubenswrapper[4973]: I0320 13:46:12.339787 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="198d4fb1-4784-4553-b617-a2c33cec6df6" containerName="proxy-httpd" containerID="cri-o://8ee7685b7b7a2a052e9c22d063d9f4ce40c031303874ebc785da73b46697b36d" gracePeriod=30 Mar 20 13:46:12 crc kubenswrapper[4973]: I0320 13:46:12.339823 4973 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="198d4fb1-4784-4553-b617-a2c33cec6df6" containerName="sg-core" containerID="cri-o://b790e51150b683894674eb14c8dce56ef6218db9eed72c18e694cb0087239c6c" gracePeriod=30 Mar 20 13:46:12 crc kubenswrapper[4973]: I0320 13:46:12.343648 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566906-smfqr" event={"ID":"1712bc13-36c4-4c56-b652-e0a0bd194179","Type":"ContainerDied","Data":"f34f8a99aab39622e212d6716a445e5965895c17d3213a11ab4e044a7deede43"} Mar 20 13:46:12 crc kubenswrapper[4973]: I0320 13:46:12.343691 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f34f8a99aab39622e212d6716a445e5965895c17d3213a11ab4e044a7deede43" Mar 20 13:46:12 crc kubenswrapper[4973]: I0320 13:46:12.343759 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566906-smfqr" Mar 20 13:46:12 crc kubenswrapper[4973]: I0320 13:46:12.722074 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-mxl6j"] Mar 20 13:46:12 crc kubenswrapper[4973]: I0320 13:46:12.737125 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566900-mxl6j"] Mar 20 13:46:13 crc kubenswrapper[4973]: I0320 13:46:13.355571 4973 generic.go:334] "Generic (PLEG): container finished" podID="92ef179e-0d37-4a3d-986f-5a4ea5bc5a23" containerID="43d8c42eff4680a96454959aeea1088a03cbfbd6f3c3956b87bf4b94480a0378" exitCode=0 Mar 20 13:46:13 crc kubenswrapper[4973]: I0320 13:46:13.355654 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lbsl6" event={"ID":"92ef179e-0d37-4a3d-986f-5a4ea5bc5a23","Type":"ContainerDied","Data":"43d8c42eff4680a96454959aeea1088a03cbfbd6f3c3956b87bf4b94480a0378"} Mar 20 13:46:13 crc kubenswrapper[4973]: I0320 13:46:13.358114 4973 generic.go:334] "Generic (PLEG): container finished" 
podID="198d4fb1-4784-4553-b617-a2c33cec6df6" containerID="8ee7685b7b7a2a052e9c22d063d9f4ce40c031303874ebc785da73b46697b36d" exitCode=0 Mar 20 13:46:13 crc kubenswrapper[4973]: I0320 13:46:13.358141 4973 generic.go:334] "Generic (PLEG): container finished" podID="198d4fb1-4784-4553-b617-a2c33cec6df6" containerID="b790e51150b683894674eb14c8dce56ef6218db9eed72c18e694cb0087239c6c" exitCode=2 Mar 20 13:46:13 crc kubenswrapper[4973]: I0320 13:46:13.358161 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"198d4fb1-4784-4553-b617-a2c33cec6df6","Type":"ContainerDied","Data":"8ee7685b7b7a2a052e9c22d063d9f4ce40c031303874ebc785da73b46697b36d"} Mar 20 13:46:13 crc kubenswrapper[4973]: I0320 13:46:13.358184 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"198d4fb1-4784-4553-b617-a2c33cec6df6","Type":"ContainerDied","Data":"b790e51150b683894674eb14c8dce56ef6218db9eed72c18e694cb0087239c6c"} Mar 20 13:46:13 crc kubenswrapper[4973]: I0320 13:46:13.962909 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="803bba01-75c9-4c14-80c0-0da407ad672d" path="/var/lib/kubelet/pods/803bba01-75c9-4c14-80c0-0da407ad672d/volumes" Mar 20 13:46:14 crc kubenswrapper[4973]: I0320 13:46:14.765166 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-lbsl6" Mar 20 13:46:14 crc kubenswrapper[4973]: I0320 13:46:14.846917 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92ef179e-0d37-4a3d-986f-5a4ea5bc5a23-combined-ca-bundle\") pod \"92ef179e-0d37-4a3d-986f-5a4ea5bc5a23\" (UID: \"92ef179e-0d37-4a3d-986f-5a4ea5bc5a23\") " Mar 20 13:46:14 crc kubenswrapper[4973]: I0320 13:46:14.846976 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/92ef179e-0d37-4a3d-986f-5a4ea5bc5a23-db-sync-config-data\") pod \"92ef179e-0d37-4a3d-986f-5a4ea5bc5a23\" (UID: \"92ef179e-0d37-4a3d-986f-5a4ea5bc5a23\") " Mar 20 13:46:14 crc kubenswrapper[4973]: I0320 13:46:14.847000 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2rb4\" (UniqueName: \"kubernetes.io/projected/92ef179e-0d37-4a3d-986f-5a4ea5bc5a23-kube-api-access-s2rb4\") pod \"92ef179e-0d37-4a3d-986f-5a4ea5bc5a23\" (UID: \"92ef179e-0d37-4a3d-986f-5a4ea5bc5a23\") " Mar 20 13:46:14 crc kubenswrapper[4973]: I0320 13:46:14.853222 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92ef179e-0d37-4a3d-986f-5a4ea5bc5a23-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "92ef179e-0d37-4a3d-986f-5a4ea5bc5a23" (UID: "92ef179e-0d37-4a3d-986f-5a4ea5bc5a23"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:14 crc kubenswrapper[4973]: I0320 13:46:14.859655 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92ef179e-0d37-4a3d-986f-5a4ea5bc5a23-kube-api-access-s2rb4" (OuterVolumeSpecName: "kube-api-access-s2rb4") pod "92ef179e-0d37-4a3d-986f-5a4ea5bc5a23" (UID: "92ef179e-0d37-4a3d-986f-5a4ea5bc5a23"). 
InnerVolumeSpecName "kube-api-access-s2rb4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:14 crc kubenswrapper[4973]: I0320 13:46:14.885210 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92ef179e-0d37-4a3d-986f-5a4ea5bc5a23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "92ef179e-0d37-4a3d-986f-5a4ea5bc5a23" (UID: "92ef179e-0d37-4a3d-986f-5a4ea5bc5a23"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:14 crc kubenswrapper[4973]: I0320 13:46:14.949834 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/92ef179e-0d37-4a3d-986f-5a4ea5bc5a23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:14 crc kubenswrapper[4973]: I0320 13:46:14.949875 4973 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/92ef179e-0d37-4a3d-986f-5a4ea5bc5a23-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:14 crc kubenswrapper[4973]: I0320 13:46:14.949887 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2rb4\" (UniqueName: \"kubernetes.io/projected/92ef179e-0d37-4a3d-986f-5a4ea5bc5a23-kube-api-access-s2rb4\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.379833 4973 generic.go:334] "Generic (PLEG): container finished" podID="0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68" containerID="ef78a08289134992f9c632dad673844b359d1a7ca4b4df90922ece411e9c3da1" exitCode=0 Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.379911 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vjtzq" event={"ID":"0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68","Type":"ContainerDied","Data":"ef78a08289134992f9c632dad673844b359d1a7ca4b4df90922ece411e9c3da1"} Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 
13:46:15.382399 4973 generic.go:334] "Generic (PLEG): container finished" podID="6fec0901-00c6-410f-986c-4dcac4fe1359" containerID="cbaf7d0cf8de8e4a6cd3237e1a32f4af31c3690957694832e4bc0a512ce3bd31" exitCode=0 Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.382472 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zzsmb" event={"ID":"6fec0901-00c6-410f-986c-4dcac4fe1359","Type":"ContainerDied","Data":"cbaf7d0cf8de8e4a6cd3237e1a32f4af31c3690957694832e4bc0a512ce3bd31"} Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.384489 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-lbsl6" event={"ID":"92ef179e-0d37-4a3d-986f-5a4ea5bc5a23","Type":"ContainerDied","Data":"19f924119305226e017b38a7d64bc5853ef6bad70341bbe6a04c074641712c78"} Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.384550 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="19f924119305226e017b38a7d64bc5853ef6bad70341bbe6a04c074641712c78" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.384679 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-lbsl6" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.665759 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-65b846dd97-gcnmc"] Mar 20 13:46:15 crc kubenswrapper[4973]: E0320 13:46:15.667112 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d867119-66df-4aa7-a2dd-13d0d40ce2cc" containerName="heat-db-sync" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.667206 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d867119-66df-4aa7-a2dd-13d0d40ce2cc" containerName="heat-db-sync" Mar 20 13:46:15 crc kubenswrapper[4973]: E0320 13:46:15.667295 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1712bc13-36c4-4c56-b652-e0a0bd194179" containerName="oc" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.667384 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="1712bc13-36c4-4c56-b652-e0a0bd194179" containerName="oc" Mar 20 13:46:15 crc kubenswrapper[4973]: E0320 13:46:15.667463 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="92ef179e-0d37-4a3d-986f-5a4ea5bc5a23" containerName="barbican-db-sync" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.667515 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="92ef179e-0d37-4a3d-986f-5a4ea5bc5a23" containerName="barbican-db-sync" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.667828 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="1712bc13-36c4-4c56-b652-e0a0bd194179" containerName="oc" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.667929 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="92ef179e-0d37-4a3d-986f-5a4ea5bc5a23" containerName="barbican-db-sync" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.668010 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d867119-66df-4aa7-a2dd-13d0d40ce2cc" containerName="heat-db-sync" Mar 20 13:46:15 
crc kubenswrapper[4973]: I0320 13:46:15.669311 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-65b846dd97-gcnmc" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.674282 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.674693 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-8pfqw" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.676118 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8df11824-a676-4f6a-8f12-ccff6ce1bdc6-logs\") pod \"barbican-worker-65b846dd97-gcnmc\" (UID: \"8df11824-a676-4f6a-8f12-ccff6ce1bdc6\") " pod="openstack/barbican-worker-65b846dd97-gcnmc" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.676226 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wqhh\" (UniqueName: \"kubernetes.io/projected/8df11824-a676-4f6a-8f12-ccff6ce1bdc6-kube-api-access-8wqhh\") pod \"barbican-worker-65b846dd97-gcnmc\" (UID: \"8df11824-a676-4f6a-8f12-ccff6ce1bdc6\") " pod="openstack/barbican-worker-65b846dd97-gcnmc" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.676488 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df11824-a676-4f6a-8f12-ccff6ce1bdc6-config-data\") pod \"barbican-worker-65b846dd97-gcnmc\" (UID: \"8df11824-a676-4f6a-8f12-ccff6ce1bdc6\") " pod="openstack/barbican-worker-65b846dd97-gcnmc" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.676767 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8df11824-a676-4f6a-8f12-ccff6ce1bdc6-combined-ca-bundle\") pod \"barbican-worker-65b846dd97-gcnmc\" (UID: \"8df11824-a676-4f6a-8f12-ccff6ce1bdc6\") " pod="openstack/barbican-worker-65b846dd97-gcnmc" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.676926 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8df11824-a676-4f6a-8f12-ccff6ce1bdc6-config-data-custom\") pod \"barbican-worker-65b846dd97-gcnmc\" (UID: \"8df11824-a676-4f6a-8f12-ccff6ce1bdc6\") " pod="openstack/barbican-worker-65b846dd97-gcnmc" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.678695 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.688961 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65b846dd97-gcnmc"] Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.762427 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-68cf78bb54-pdcln"] Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.765318 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.771747 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.783647 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df11824-a676-4f6a-8f12-ccff6ce1bdc6-combined-ca-bundle\") pod \"barbican-worker-65b846dd97-gcnmc\" (UID: \"8df11824-a676-4f6a-8f12-ccff6ce1bdc6\") " pod="openstack/barbican-worker-65b846dd97-gcnmc" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.788648 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-68cf78bb54-pdcln"] Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.792132 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8df11824-a676-4f6a-8f12-ccff6ce1bdc6-config-data-custom\") pod \"barbican-worker-65b846dd97-gcnmc\" (UID: \"8df11824-a676-4f6a-8f12-ccff6ce1bdc6\") " pod="openstack/barbican-worker-65b846dd97-gcnmc" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.792644 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8df11824-a676-4f6a-8f12-ccff6ce1bdc6-logs\") pod \"barbican-worker-65b846dd97-gcnmc\" (UID: \"8df11824-a676-4f6a-8f12-ccff6ce1bdc6\") " pod="openstack/barbican-worker-65b846dd97-gcnmc" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.792756 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wqhh\" (UniqueName: \"kubernetes.io/projected/8df11824-a676-4f6a-8f12-ccff6ce1bdc6-kube-api-access-8wqhh\") pod \"barbican-worker-65b846dd97-gcnmc\" (UID: \"8df11824-a676-4f6a-8f12-ccff6ce1bdc6\") " 
pod="openstack/barbican-worker-65b846dd97-gcnmc" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.792905 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df11824-a676-4f6a-8f12-ccff6ce1bdc6-config-data\") pod \"barbican-worker-65b846dd97-gcnmc\" (UID: \"8df11824-a676-4f6a-8f12-ccff6ce1bdc6\") " pod="openstack/barbican-worker-65b846dd97-gcnmc" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.794156 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8df11824-a676-4f6a-8f12-ccff6ce1bdc6-logs\") pod \"barbican-worker-65b846dd97-gcnmc\" (UID: \"8df11824-a676-4f6a-8f12-ccff6ce1bdc6\") " pod="openstack/barbican-worker-65b846dd97-gcnmc" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.809731 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8df11824-a676-4f6a-8f12-ccff6ce1bdc6-combined-ca-bundle\") pod \"barbican-worker-65b846dd97-gcnmc\" (UID: \"8df11824-a676-4f6a-8f12-ccff6ce1bdc6\") " pod="openstack/barbican-worker-65b846dd97-gcnmc" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.815283 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8df11824-a676-4f6a-8f12-ccff6ce1bdc6-config-data-custom\") pod \"barbican-worker-65b846dd97-gcnmc\" (UID: \"8df11824-a676-4f6a-8f12-ccff6ce1bdc6\") " pod="openstack/barbican-worker-65b846dd97-gcnmc" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.815377 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-t8fv8"] Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.816663 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8df11824-a676-4f6a-8f12-ccff6ce1bdc6-config-data\") pod 
\"barbican-worker-65b846dd97-gcnmc\" (UID: \"8df11824-a676-4f6a-8f12-ccff6ce1bdc6\") " pod="openstack/barbican-worker-65b846dd97-gcnmc" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.817533 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.858465 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-t8fv8"] Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.874544 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wqhh\" (UniqueName: \"kubernetes.io/projected/8df11824-a676-4f6a-8f12-ccff6ce1bdc6-kube-api-access-8wqhh\") pod \"barbican-worker-65b846dd97-gcnmc\" (UID: \"8df11824-a676-4f6a-8f12-ccff6ce1bdc6\") " pod="openstack/barbican-worker-65b846dd97-gcnmc" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.895681 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-t8fv8\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.895766 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9a97269-d458-4405-988c-b32339897e4f-config-data-custom\") pod \"barbican-keystone-listener-68cf78bb54-pdcln\" (UID: \"e9a97269-d458-4405-988c-b32339897e4f\") " pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.895791 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdccf\" (UniqueName: 
\"kubernetes.io/projected/49e5f7d7-87ea-4274-8d70-57783772d3a1-kube-api-access-xdccf\") pod \"dnsmasq-dns-59d5ff467f-t8fv8\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.895817 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-config\") pod \"dnsmasq-dns-59d5ff467f-t8fv8\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.895861 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9a97269-d458-4405-988c-b32339897e4f-logs\") pod \"barbican-keystone-listener-68cf78bb54-pdcln\" (UID: \"e9a97269-d458-4405-988c-b32339897e4f\") " pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.895922 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-t8fv8\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.895953 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9a97269-d458-4405-988c-b32339897e4f-combined-ca-bundle\") pod \"barbican-keystone-listener-68cf78bb54-pdcln\" (UID: \"e9a97269-d458-4405-988c-b32339897e4f\") " pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.895975 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-t8fv8\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.895998 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-t8fv8\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.896020 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsjgx\" (UniqueName: \"kubernetes.io/projected/e9a97269-d458-4405-988c-b32339897e4f-kube-api-access-lsjgx\") pod \"barbican-keystone-listener-68cf78bb54-pdcln\" (UID: \"e9a97269-d458-4405-988c-b32339897e4f\") " pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" Mar 20 13:46:15 crc kubenswrapper[4973]: I0320 13:46:15.896088 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9a97269-d458-4405-988c-b32339897e4f-config-data\") pod \"barbican-keystone-listener-68cf78bb54-pdcln\" (UID: \"e9a97269-d458-4405-988c-b32339897e4f\") " pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.003704 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-65b846dd97-gcnmc" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.043199 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-t8fv8\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.043281 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-t8fv8\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.043311 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsjgx\" (UniqueName: \"kubernetes.io/projected/e9a97269-d458-4405-988c-b32339897e4f-kube-api-access-lsjgx\") pod \"barbican-keystone-listener-68cf78bb54-pdcln\" (UID: \"e9a97269-d458-4405-988c-b32339897e4f\") " pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.043407 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9a97269-d458-4405-988c-b32339897e4f-config-data\") pod \"barbican-keystone-listener-68cf78bb54-pdcln\" (UID: \"e9a97269-d458-4405-988c-b32339897e4f\") " pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.043573 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-t8fv8\" (UID: 
\"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.043616 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9a97269-d458-4405-988c-b32339897e4f-config-data-custom\") pod \"barbican-keystone-listener-68cf78bb54-pdcln\" (UID: \"e9a97269-d458-4405-988c-b32339897e4f\") " pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.043650 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdccf\" (UniqueName: \"kubernetes.io/projected/49e5f7d7-87ea-4274-8d70-57783772d3a1-kube-api-access-xdccf\") pod \"dnsmasq-dns-59d5ff467f-t8fv8\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.043681 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-config\") pod \"dnsmasq-dns-59d5ff467f-t8fv8\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.043713 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9a97269-d458-4405-988c-b32339897e4f-logs\") pod \"barbican-keystone-listener-68cf78bb54-pdcln\" (UID: \"e9a97269-d458-4405-988c-b32339897e4f\") " pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.043800 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-t8fv8\" (UID: 
\"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.043837 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9a97269-d458-4405-988c-b32339897e4f-combined-ca-bundle\") pod \"barbican-keystone-listener-68cf78bb54-pdcln\" (UID: \"e9a97269-d458-4405-988c-b32339897e4f\") " pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.045193 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-ovsdbserver-nb\") pod \"dnsmasq-dns-59d5ff467f-t8fv8\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.045451 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9a97269-d458-4405-988c-b32339897e4f-logs\") pod \"barbican-keystone-listener-68cf78bb54-pdcln\" (UID: \"e9a97269-d458-4405-988c-b32339897e4f\") " pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.049753 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-config\") pod \"dnsmasq-dns-59d5ff467f-t8fv8\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.050818 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-dns-svc\") pod \"dnsmasq-dns-59d5ff467f-t8fv8\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " 
pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.056952 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-dns-swift-storage-0\") pod \"dnsmasq-dns-59d5ff467f-t8fv8\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.067457 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9a97269-d458-4405-988c-b32339897e4f-config-data\") pod \"barbican-keystone-listener-68cf78bb54-pdcln\" (UID: \"e9a97269-d458-4405-988c-b32339897e4f\") " pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.071193 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9a97269-d458-4405-988c-b32339897e4f-combined-ca-bundle\") pod \"barbican-keystone-listener-68cf78bb54-pdcln\" (UID: \"e9a97269-d458-4405-988c-b32339897e4f\") " pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.079302 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-ovsdbserver-sb\") pod \"dnsmasq-dns-59d5ff467f-t8fv8\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.086061 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsjgx\" (UniqueName: \"kubernetes.io/projected/e9a97269-d458-4405-988c-b32339897e4f-kube-api-access-lsjgx\") pod \"barbican-keystone-listener-68cf78bb54-pdcln\" (UID: \"e9a97269-d458-4405-988c-b32339897e4f\") " 
pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.096423 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-56d7b66f8b-74fsz"] Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.101098 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.109785 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.164499 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdccf\" (UniqueName: \"kubernetes.io/projected/49e5f7d7-87ea-4274-8d70-57783772d3a1-kube-api-access-xdccf\") pod \"dnsmasq-dns-59d5ff467f-t8fv8\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.165280 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9a97269-d458-4405-988c-b32339897e4f-config-data-custom\") pod \"barbican-keystone-listener-68cf78bb54-pdcln\" (UID: \"e9a97269-d458-4405-988c-b32339897e4f\") " pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.166909 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9f37ccd-dcfc-472e-be04-7273945fa9ef-config-data\") pod \"barbican-api-56d7b66f8b-74fsz\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.166956 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e9f37ccd-dcfc-472e-be04-7273945fa9ef-logs\") pod \"barbican-api-56d7b66f8b-74fsz\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.167071 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v7q7\" (UniqueName: \"kubernetes.io/projected/e9f37ccd-dcfc-472e-be04-7273945fa9ef-kube-api-access-5v7q7\") pod \"barbican-api-56d7b66f8b-74fsz\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.167101 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9f37ccd-dcfc-472e-be04-7273945fa9ef-config-data-custom\") pod \"barbican-api-56d7b66f8b-74fsz\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.167127 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9f37ccd-dcfc-472e-be04-7273945fa9ef-combined-ca-bundle\") pod \"barbican-api-56d7b66f8b-74fsz\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.173380 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56d7b66f8b-74fsz"] Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.270005 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9f37ccd-dcfc-472e-be04-7273945fa9ef-config-data-custom\") pod \"barbican-api-56d7b66f8b-74fsz\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " 
pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.270060 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9f37ccd-dcfc-472e-be04-7273945fa9ef-combined-ca-bundle\") pod \"barbican-api-56d7b66f8b-74fsz\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.270547 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9f37ccd-dcfc-472e-be04-7273945fa9ef-config-data\") pod \"barbican-api-56d7b66f8b-74fsz\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.270636 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9f37ccd-dcfc-472e-be04-7273945fa9ef-logs\") pod \"barbican-api-56d7b66f8b-74fsz\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.270841 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5v7q7\" (UniqueName: \"kubernetes.io/projected/e9f37ccd-dcfc-472e-be04-7273945fa9ef-kube-api-access-5v7q7\") pod \"barbican-api-56d7b66f8b-74fsz\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.277648 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9f37ccd-dcfc-472e-be04-7273945fa9ef-logs\") pod \"barbican-api-56d7b66f8b-74fsz\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:16 crc 
kubenswrapper[4973]: I0320 13:46:16.286049 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9f37ccd-dcfc-472e-be04-7273945fa9ef-config-data-custom\") pod \"barbican-api-56d7b66f8b-74fsz\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.291518 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9f37ccd-dcfc-472e-be04-7273945fa9ef-combined-ca-bundle\") pod \"barbican-api-56d7b66f8b-74fsz\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.292409 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9f37ccd-dcfc-472e-be04-7273945fa9ef-config-data\") pod \"barbican-api-56d7b66f8b-74fsz\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.298882 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.302571 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5v7q7\" (UniqueName: \"kubernetes.io/projected/e9f37ccd-dcfc-472e-be04-7273945fa9ef-kube-api-access-5v7q7\") pod \"barbican-api-56d7b66f8b-74fsz\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.340308 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.566608 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:16 crc kubenswrapper[4973]: I0320 13:46:16.942441 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-68cf78bb54-pdcln"] Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.044762 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-65b846dd97-gcnmc"] Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.460587 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.491095 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-56d7b66f8b-74fsz"] Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.508265 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" event={"ID":"e9a97269-d458-4405-988c-b32339897e4f","Type":"ContainerStarted","Data":"c47902b445f6f5de72d64ae9a72a957ba09758c159247b06c2992a2dddcb6bf0"} Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.517707 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65b846dd97-gcnmc" event={"ID":"8df11824-a676-4f6a-8f12-ccff6ce1bdc6","Type":"ContainerStarted","Data":"fcaad814481571b2d58d967a7d8e05e1b075df7f50a7172234aabcd7ce0547ba"} Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.533138 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-zzsmb" event={"ID":"6fec0901-00c6-410f-986c-4dcac4fe1359","Type":"ContainerDied","Data":"b3f901d3fb7e6053f4d4efb8956efd3a57c5ed921eec8093a75bf61826095000"} Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.533168 4973 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zzsmb" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.533192 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3f901d3fb7e6053f4d4efb8956efd3a57c5ed921eec8093a75bf61826095000" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.553206 4973 generic.go:334] "Generic (PLEG): container finished" podID="198d4fb1-4784-4553-b617-a2c33cec6df6" containerID="186ab78de0056ee24a059d1993ee768dc788875d32e78405f88b053e204b2bd0" exitCode=0 Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.553252 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"198d4fb1-4784-4553-b617-a2c33cec6df6","Type":"ContainerDied","Data":"186ab78de0056ee24a059d1993ee768dc788875d32e78405f88b053e204b2bd0"} Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.585524 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vjtzq" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.614860 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-config-data\") pod \"6fec0901-00c6-410f-986c-4dcac4fe1359\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.615109 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-combined-ca-bundle\") pod \"6fec0901-00c6-410f-986c-4dcac4fe1359\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.615206 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fec0901-00c6-410f-986c-4dcac4fe1359-etc-machine-id\") 
pod \"6fec0901-00c6-410f-986c-4dcac4fe1359\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.615455 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8dvh\" (UniqueName: \"kubernetes.io/projected/6fec0901-00c6-410f-986c-4dcac4fe1359-kube-api-access-q8dvh\") pod \"6fec0901-00c6-410f-986c-4dcac4fe1359\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.615529 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-db-sync-config-data\") pod \"6fec0901-00c6-410f-986c-4dcac4fe1359\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.615562 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-scripts\") pod \"6fec0901-00c6-410f-986c-4dcac4fe1359\" (UID: \"6fec0901-00c6-410f-986c-4dcac4fe1359\") " Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.616027 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6fec0901-00c6-410f-986c-4dcac4fe1359-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6fec0901-00c6-410f-986c-4dcac4fe1359" (UID: "6fec0901-00c6-410f-986c-4dcac4fe1359"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.616579 4973 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6fec0901-00c6-410f-986c-4dcac4fe1359-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.622816 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-scripts" (OuterVolumeSpecName: "scripts") pod "6fec0901-00c6-410f-986c-4dcac4fe1359" (UID: "6fec0901-00c6-410f-986c-4dcac4fe1359"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.629949 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fec0901-00c6-410f-986c-4dcac4fe1359-kube-api-access-q8dvh" (OuterVolumeSpecName: "kube-api-access-q8dvh") pod "6fec0901-00c6-410f-986c-4dcac4fe1359" (UID: "6fec0901-00c6-410f-986c-4dcac4fe1359"). InnerVolumeSpecName "kube-api-access-q8dvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.630772 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6fec0901-00c6-410f-986c-4dcac4fe1359" (UID: "6fec0901-00c6-410f-986c-4dcac4fe1359"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.685301 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6fec0901-00c6-410f-986c-4dcac4fe1359" (UID: "6fec0901-00c6-410f-986c-4dcac4fe1359"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.717617 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68-combined-ca-bundle\") pod \"0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68\" (UID: \"0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68\") " Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.717707 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68-config\") pod \"0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68\" (UID: \"0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68\") " Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.717739 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5slqw\" (UniqueName: \"kubernetes.io/projected/0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68-kube-api-access-5slqw\") pod \"0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68\" (UID: \"0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68\") " Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.719020 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.719048 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8dvh\" (UniqueName: 
\"kubernetes.io/projected/6fec0901-00c6-410f-986c-4dcac4fe1359-kube-api-access-q8dvh\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.719062 4973 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.719076 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.728385 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68-kube-api-access-5slqw" (OuterVolumeSpecName: "kube-api-access-5slqw") pod "0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68" (UID: "0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68"). InnerVolumeSpecName "kube-api-access-5slqw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.751319 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-config-data" (OuterVolumeSpecName: "config-data") pod "6fec0901-00c6-410f-986c-4dcac4fe1359" (UID: "6fec0901-00c6-410f-986c-4dcac4fe1359"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.803819 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68" (UID: "0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.822183 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6fec0901-00c6-410f-986c-4dcac4fe1359-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.822220 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.822232 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5slqw\" (UniqueName: \"kubernetes.io/projected/0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68-kube-api-access-5slqw\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.837653 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68-config" (OuterVolumeSpecName: "config") pod "0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68" (UID: "0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.884837 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-t8fv8"] Mar 20 13:46:17 crc kubenswrapper[4973]: I0320 13:46:17.928399 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.233894 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.340883 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-config-data\") pod \"198d4fb1-4784-4553-b617-a2c33cec6df6\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.341423 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198d4fb1-4784-4553-b617-a2c33cec6df6-run-httpd\") pod \"198d4fb1-4784-4553-b617-a2c33cec6df6\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.341479 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-scripts\") pod \"198d4fb1-4784-4553-b617-a2c33cec6df6\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.341508 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt626\" (UniqueName: \"kubernetes.io/projected/198d4fb1-4784-4553-b617-a2c33cec6df6-kube-api-access-pt626\") pod \"198d4fb1-4784-4553-b617-a2c33cec6df6\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.341533 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-combined-ca-bundle\") pod \"198d4fb1-4784-4553-b617-a2c33cec6df6\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.341716 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-sg-core-conf-yaml\") pod \"198d4fb1-4784-4553-b617-a2c33cec6df6\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.341788 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198d4fb1-4784-4553-b617-a2c33cec6df6-log-httpd\") pod \"198d4fb1-4784-4553-b617-a2c33cec6df6\" (UID: \"198d4fb1-4784-4553-b617-a2c33cec6df6\") " Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.343568 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/198d4fb1-4784-4553-b617-a2c33cec6df6-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "198d4fb1-4784-4553-b617-a2c33cec6df6" (UID: "198d4fb1-4784-4553-b617-a2c33cec6df6"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.343578 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/198d4fb1-4784-4553-b617-a2c33cec6df6-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "198d4fb1-4784-4553-b617-a2c33cec6df6" (UID: "198d4fb1-4784-4553-b617-a2c33cec6df6"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.352543 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/198d4fb1-4784-4553-b617-a2c33cec6df6-kube-api-access-pt626" (OuterVolumeSpecName: "kube-api-access-pt626") pod "198d4fb1-4784-4553-b617-a2c33cec6df6" (UID: "198d4fb1-4784-4553-b617-a2c33cec6df6"). InnerVolumeSpecName "kube-api-access-pt626". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.352553 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-scripts" (OuterVolumeSpecName: "scripts") pod "198d4fb1-4784-4553-b617-a2c33cec6df6" (UID: "198d4fb1-4784-4553-b617-a2c33cec6df6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.392864 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "198d4fb1-4784-4553-b617-a2c33cec6df6" (UID: "198d4fb1-4784-4553-b617-a2c33cec6df6"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.427138 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "198d4fb1-4784-4553-b617-a2c33cec6df6" (UID: "198d4fb1-4784-4553-b617-a2c33cec6df6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.445390 4973 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198d4fb1-4784-4553-b617-a2c33cec6df6-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.445426 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.445438 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt626\" (UniqueName: \"kubernetes.io/projected/198d4fb1-4784-4553-b617-a2c33cec6df6-kube-api-access-pt626\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.445453 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.445464 4973 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.445483 4973 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/198d4fb1-4784-4553-b617-a2c33cec6df6-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.456734 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-config-data" (OuterVolumeSpecName: "config-data") pod "198d4fb1-4784-4553-b617-a2c33cec6df6" (UID: "198d4fb1-4784-4553-b617-a2c33cec6df6"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.548421 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/198d4fb1-4784-4553-b617-a2c33cec6df6-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.571371 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-vjtzq" event={"ID":"0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68","Type":"ContainerDied","Data":"01b63697c8292f394e1cf265d3998f5967b9bc912cebe09217fd54bade0f98b7"} Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.571424 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="01b63697c8292f394e1cf265d3998f5967b9bc912cebe09217fd54bade0f98b7" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.571389 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-vjtzq" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.574484 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"198d4fb1-4784-4553-b617-a2c33cec6df6","Type":"ContainerDied","Data":"dc6d86bbf08f4755e01861e808d829309fbeb4919bdb4b247a0a90a264372195"} Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.574525 4973 scope.go:117] "RemoveContainer" containerID="8ee7685b7b7a2a052e9c22d063d9f4ce40c031303874ebc785da73b46697b36d" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.574643 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.583888 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56d7b66f8b-74fsz" event={"ID":"e9f37ccd-dcfc-472e-be04-7273945fa9ef","Type":"ContainerStarted","Data":"b80e949024f1f3d04fb9e44ff5c878f8da2d5b4a8903ea06ca8e8c960cd03a87"} Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.583937 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56d7b66f8b-74fsz" event={"ID":"e9f37ccd-dcfc-472e-be04-7273945fa9ef","Type":"ContainerStarted","Data":"b82fd517e27a0b42cb6ca56a801db45b8058a59727b63c73a5d6114c35bc3a57"} Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.583955 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56d7b66f8b-74fsz" event={"ID":"e9f37ccd-dcfc-472e-be04-7273945fa9ef","Type":"ContainerStarted","Data":"3c06468fc5fa0f28653431f4d14d917d94365041c3b599139e9a95c48dd9e313"} Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.584239 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.584271 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.586579 4973 generic.go:334] "Generic (PLEG): container finished" podID="49e5f7d7-87ea-4274-8d70-57783772d3a1" containerID="dd73c2f459eafebe224a7bf3f433ca1c042e3feb0c487e91811e559f070dd70d" exitCode=0 Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.586615 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" event={"ID":"49e5f7d7-87ea-4274-8d70-57783772d3a1","Type":"ContainerDied","Data":"dd73c2f459eafebe224a7bf3f433ca1c042e3feb0c487e91811e559f070dd70d"} Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.586635 4973 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" event={"ID":"49e5f7d7-87ea-4274-8d70-57783772d3a1","Type":"ContainerStarted","Data":"127bf346870d7444f33af9799cc82a67a6da11efa17a2c3b004e2be0beffa323"} Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.606334 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-56d7b66f8b-74fsz" podStartSLOduration=3.606311019 podStartE2EDuration="3.606311019s" podCreationTimestamp="2026-03-20 13:46:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:18.604198582 +0000 UTC m=+1499.347868346" watchObservedRunningTime="2026-03-20 13:46:18.606311019 +0000 UTC m=+1499.349980763" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.687461 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.733822 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.783846 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:46:18 crc kubenswrapper[4973]: E0320 13:46:18.784530 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="198d4fb1-4784-4553-b617-a2c33cec6df6" containerName="ceilometer-notification-agent" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.784547 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="198d4fb1-4784-4553-b617-a2c33cec6df6" containerName="ceilometer-notification-agent" Mar 20 13:46:18 crc kubenswrapper[4973]: E0320 13:46:18.784574 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fec0901-00c6-410f-986c-4dcac4fe1359" containerName="cinder-db-sync" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.784582 4973 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="6fec0901-00c6-410f-986c-4dcac4fe1359" containerName="cinder-db-sync" Mar 20 13:46:18 crc kubenswrapper[4973]: E0320 13:46:18.784600 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68" containerName="neutron-db-sync" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.784607 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68" containerName="neutron-db-sync" Mar 20 13:46:18 crc kubenswrapper[4973]: E0320 13:46:18.784631 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="198d4fb1-4784-4553-b617-a2c33cec6df6" containerName="sg-core" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.784638 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="198d4fb1-4784-4553-b617-a2c33cec6df6" containerName="sg-core" Mar 20 13:46:18 crc kubenswrapper[4973]: E0320 13:46:18.784663 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="198d4fb1-4784-4553-b617-a2c33cec6df6" containerName="proxy-httpd" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.784670 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="198d4fb1-4784-4553-b617-a2c33cec6df6" containerName="proxy-httpd" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.784935 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68" containerName="neutron-db-sync" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.784950 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="198d4fb1-4784-4553-b617-a2c33cec6df6" containerName="ceilometer-notification-agent" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.784967 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="198d4fb1-4784-4553-b617-a2c33cec6df6" containerName="proxy-httpd" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.784984 4973 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6fec0901-00c6-410f-986c-4dcac4fe1359" containerName="cinder-db-sync" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.784997 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="198d4fb1-4784-4553-b617-a2c33cec6df6" containerName="sg-core" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.792520 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.827288 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.827530 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-c9vcl" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.827639 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.827787 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.861441 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-scripts\") pod \"cinder-scheduler-0\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.861536 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.861583 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-config-data\") pod \"cinder-scheduler-0\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.861714 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75ccb\" (UniqueName: \"kubernetes.io/projected/49ac7635-4e3b-47fb-a585-c07724eff983-kube-api-access-75ccb\") pod \"cinder-scheduler-0\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.861748 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.861787 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49ac7635-4e3b-47fb-a585-c07724eff983-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.876577 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.892452 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.895724 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.908455 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.911571 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.964153 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75ccb\" (UniqueName: \"kubernetes.io/projected/49ac7635-4e3b-47fb-a585-c07724eff983-kube-api-access-75ccb\") pod \"cinder-scheduler-0\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.964210 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d355b77-7cc9-4184-a223-2bc448d9ca1b-run-httpd\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.964243 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.964274 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.964306 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxf7c\" (UniqueName: \"kubernetes.io/projected/0d355b77-7cc9-4184-a223-2bc448d9ca1b-kube-api-access-wxf7c\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.964401 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49ac7635-4e3b-47fb-a585-c07724eff983-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.964527 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-scripts\") pod \"cinder-scheduler-0\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.964580 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-scripts\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.964602 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.964637 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.964680 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-config-data\") pod \"cinder-scheduler-0\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.964783 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-config-data\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.964824 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d355b77-7cc9-4184-a223-2bc448d9ca1b-log-httpd\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.965004 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49ac7635-4e3b-47fb-a585-c07724eff983-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:18 crc kubenswrapper[4973]: I0320 13:46:18.973851 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-scripts\") pod \"cinder-scheduler-0\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:18 crc 
kubenswrapper[4973]: I0320 13:46:18.978369 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-config-data\") pod \"cinder-scheduler-0\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.011962 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.012208 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.038079 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75ccb\" (UniqueName: \"kubernetes.io/projected/49ac7635-4e3b-47fb-a585-c07724eff983-kube-api-access-75ccb\") pod \"cinder-scheduler-0\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.050936 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.071081 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 
13:46:19.085740 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-config-data\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.086614 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d355b77-7cc9-4184-a223-2bc448d9ca1b-log-httpd\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.086879 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d355b77-7cc9-4184-a223-2bc448d9ca1b-run-httpd\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.086979 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.087104 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wxf7c\" (UniqueName: \"kubernetes.io/projected/0d355b77-7cc9-4184-a223-2bc448d9ca1b-kube-api-access-wxf7c\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.087381 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-scripts\") pod \"ceilometer-0\" (UID: 
\"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.088176 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d355b77-7cc9-4184-a223-2bc448d9ca1b-log-httpd\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.089665 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d355b77-7cc9-4184-a223-2bc448d9ca1b-run-httpd\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.109507 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.110927 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-config-data\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.114574 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-t8fv8"] Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.116365 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:19 crc 
kubenswrapper[4973]: I0320 13:46:19.144426 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-scripts\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.153786 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxf7c\" (UniqueName: \"kubernetes.io/projected/0d355b77-7cc9-4184-a223-2bc448d9ca1b-kube-api-access-wxf7c\") pod \"ceilometer-0\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") " pod="openstack/ceilometer-0" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.246915 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.270485 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.307600 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-jq7fk"] Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.310091 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.399811 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-jq7fk"] Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.401940 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-jq7fk\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.402083 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-dns-svc\") pod \"dnsmasq-dns-5784cf869f-jq7fk\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.402105 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-jq7fk\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.402137 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqvgc\" (UniqueName: \"kubernetes.io/projected/9321b726-0c2c-4c9c-a40a-73a387dfb215-kube-api-access-sqvgc\") pod \"dnsmasq-dns-5784cf869f-jq7fk\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.402205 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-config\") pod \"dnsmasq-dns-5784cf869f-jq7fk\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.402299 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-jq7fk\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.509890 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-jq7fk\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.509932 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-dns-svc\") pod \"dnsmasq-dns-5784cf869f-jq7fk\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.509968 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqvgc\" (UniqueName: \"kubernetes.io/projected/9321b726-0c2c-4c9c-a40a-73a387dfb215-kube-api-access-sqvgc\") pod \"dnsmasq-dns-5784cf869f-jq7fk\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.510024 4973 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-config\") pod \"dnsmasq-dns-5784cf869f-jq7fk\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.510116 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-jq7fk\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.510157 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-jq7fk\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.511157 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-dns-swift-storage-0\") pod \"dnsmasq-dns-5784cf869f-jq7fk\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.511276 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-dns-svc\") pod \"dnsmasq-dns-5784cf869f-jq7fk\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.511884 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-config\") pod \"dnsmasq-dns-5784cf869f-jq7fk\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.512563 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-ovsdbserver-nb\") pod \"dnsmasq-dns-5784cf869f-jq7fk\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.512619 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-ovsdbserver-sb\") pod \"dnsmasq-dns-5784cf869f-jq7fk\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.562889 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqvgc\" (UniqueName: \"kubernetes.io/projected/9321b726-0c2c-4c9c-a40a-73a387dfb215-kube-api-access-sqvgc\") pod \"dnsmasq-dns-5784cf869f-jq7fk\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.639052 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.688331 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-77cf5858cd-9xwk6"] Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.694657 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-77cf5858cd-9xwk6" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.699703 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.700009 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.700079 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.700439 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hmk7b" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.816973 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77cf5858cd-9xwk6"] Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.837815 4973 scope.go:117] "RemoveContainer" containerID="b790e51150b683894674eb14c8dce56ef6218db9eed72c18e694cb0087239c6c" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.945476 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.947196 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-ovndb-tls-certs\") pod \"neutron-77cf5858cd-9xwk6\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " pod="openstack/neutron-77cf5858cd-9xwk6" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.947289 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-combined-ca-bundle\") pod \"neutron-77cf5858cd-9xwk6\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " 
pod="openstack/neutron-77cf5858cd-9xwk6" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.947461 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-config\") pod \"neutron-77cf5858cd-9xwk6\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " pod="openstack/neutron-77cf5858cd-9xwk6" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.947592 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-httpd-config\") pod \"neutron-77cf5858cd-9xwk6\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " pod="openstack/neutron-77cf5858cd-9xwk6" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.947678 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrzqz\" (UniqueName: \"kubernetes.io/projected/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-kube-api-access-wrzqz\") pod \"neutron-77cf5858cd-9xwk6\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " pod="openstack/neutron-77cf5858cd-9xwk6" Mar 20 13:46:19 crc kubenswrapper[4973]: I0320 13:46:19.990674 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.068555 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="198d4fb1-4784-4553-b617-a2c33cec6df6" path="/var/lib/kubelet/pods/198d4fb1-4784-4553-b617-a2c33cec6df6/volumes" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.069885 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.083114 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.094599 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-scripts\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.094662 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.095687 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07dbf52e-f333-4466-8e56-b44815529a05-logs\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.095810 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-ovndb-tls-certs\") pod 
\"neutron-77cf5858cd-9xwk6\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " pod="openstack/neutron-77cf5858cd-9xwk6" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.095861 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9qqg\" (UniqueName: \"kubernetes.io/projected/07dbf52e-f333-4466-8e56-b44815529a05-kube-api-access-l9qqg\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.095923 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-combined-ca-bundle\") pod \"neutron-77cf5858cd-9xwk6\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " pod="openstack/neutron-77cf5858cd-9xwk6" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.095943 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-config-data-custom\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.096043 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-config\") pod \"neutron-77cf5858cd-9xwk6\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " pod="openstack/neutron-77cf5858cd-9xwk6" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.096087 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07dbf52e-f333-4466-8e56-b44815529a05-etc-machine-id\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " 
pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.096278 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-httpd-config\") pod \"neutron-77cf5858cd-9xwk6\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " pod="openstack/neutron-77cf5858cd-9xwk6" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.096417 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrzqz\" (UniqueName: \"kubernetes.io/projected/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-kube-api-access-wrzqz\") pod \"neutron-77cf5858cd-9xwk6\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " pod="openstack/neutron-77cf5858cd-9xwk6" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.096476 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-config-data\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.100805 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.101058 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.107804 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.112380 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-ovndb-tls-certs\") pod \"neutron-77cf5858cd-9xwk6\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " 
pod="openstack/neutron-77cf5858cd-9xwk6" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.127206 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-httpd-config\") pod \"neutron-77cf5858cd-9xwk6\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " pod="openstack/neutron-77cf5858cd-9xwk6" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.139252 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-combined-ca-bundle\") pod \"neutron-77cf5858cd-9xwk6\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " pod="openstack/neutron-77cf5858cd-9xwk6" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.144547 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-config\") pod \"neutron-77cf5858cd-9xwk6\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " pod="openstack/neutron-77cf5858cd-9xwk6" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.159128 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrzqz\" (UniqueName: \"kubernetes.io/projected/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-kube-api-access-wrzqz\") pod \"neutron-77cf5858cd-9xwk6\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " pod="openstack/neutron-77cf5858cd-9xwk6" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.204273 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-config-data-custom\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.204427 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07dbf52e-f333-4466-8e56-b44815529a05-etc-machine-id\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.204653 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-config-data\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.204714 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-scripts\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.204749 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.204817 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07dbf52e-f333-4466-8e56-b44815529a05-logs\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.204895 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9qqg\" (UniqueName: \"kubernetes.io/projected/07dbf52e-f333-4466-8e56-b44815529a05-kube-api-access-l9qqg\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " 
pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.207502 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07dbf52e-f333-4466-8e56-b44815529a05-etc-machine-id\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.209313 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07dbf52e-f333-4466-8e56-b44815529a05-logs\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.227324 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.231026 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-config-data-custom\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.232879 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-scripts\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.240874 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-config-data\") pod 
\"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.247138 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9qqg\" (UniqueName: \"kubernetes.io/projected/07dbf52e-f333-4466-8e56-b44815529a05-kube-api-access-l9qqg\") pod \"cinder-api-0\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.422468 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-hmk7b" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.430359 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77cf5858cd-9xwk6" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.476413 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:46:20 crc kubenswrapper[4973]: I0320 13:46:20.624356 4973 scope.go:117] "RemoveContainer" containerID="186ab78de0056ee24a059d1993ee768dc788875d32e78405f88b053e204b2bd0" Mar 20 13:46:21 crc kubenswrapper[4973]: I0320 13:46:21.197066 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:46:21 crc kubenswrapper[4973]: W0320 13:46:21.282152 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49ac7635_4e3b_47fb_a585_c07724eff983.slice/crio-fb95917f926dabf345eb17033cf0cb02e36daa4f6748c05bde032e273fa010b9 WatchSource:0}: Error finding container fb95917f926dabf345eb17033cf0cb02e36daa4f6748c05bde032e273fa010b9: Status 404 returned error can't find the container with id fb95917f926dabf345eb17033cf0cb02e36daa4f6748c05bde032e273fa010b9 Mar 20 13:46:21 crc kubenswrapper[4973]: I0320 13:46:21.608409 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-5784cf869f-jq7fk"] Mar 20 13:46:21 crc kubenswrapper[4973]: W0320 13:46:21.630801 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9321b726_0c2c_4c9c_a40a_73a387dfb215.slice/crio-6df5d421a858e6250e1988e8e3f9ed867042f5344c40fd5e20efe26bb22f1f1d WatchSource:0}: Error finding container 6df5d421a858e6250e1988e8e3f9ed867042f5344c40fd5e20efe26bb22f1f1d: Status 404 returned error can't find the container with id 6df5d421a858e6250e1988e8e3f9ed867042f5344c40fd5e20efe26bb22f1f1d Mar 20 13:46:21 crc kubenswrapper[4973]: I0320 13:46:21.649051 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" event={"ID":"49e5f7d7-87ea-4274-8d70-57783772d3a1","Type":"ContainerStarted","Data":"ce64a707f9adc6866b4bbac61765c12e02f5d4e21b96a9faf7c864b3b5abfa0e"} Mar 20 13:46:21 crc kubenswrapper[4973]: I0320 13:46:21.649206 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" podUID="49e5f7d7-87ea-4274-8d70-57783772d3a1" containerName="dnsmasq-dns" containerID="cri-o://ce64a707f9adc6866b4bbac61765c12e02f5d4e21b96a9faf7c864b3b5abfa0e" gracePeriod=10 Mar 20 13:46:21 crc kubenswrapper[4973]: I0320 13:46:21.649476 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:21 crc kubenswrapper[4973]: I0320 13:46:21.657154 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" event={"ID":"e9a97269-d458-4405-988c-b32339897e4f","Type":"ContainerStarted","Data":"5fa58172fc88c35172031f0a02c5cbc734efc4013a90f11b9255fd9dfa9e9ee2"} Mar 20 13:46:21 crc kubenswrapper[4973]: I0320 13:46:21.659285 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65b846dd97-gcnmc" 
event={"ID":"8df11824-a676-4f6a-8f12-ccff6ce1bdc6","Type":"ContainerStarted","Data":"fb035be11926fe4366bdab49fc27f90ac1fcb29ebbb3d34c3eec592ef9d4736b"} Mar 20 13:46:21 crc kubenswrapper[4973]: I0320 13:46:21.660853 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"49ac7635-4e3b-47fb-a585-c07724eff983","Type":"ContainerStarted","Data":"fb95917f926dabf345eb17033cf0cb02e36daa4f6748c05bde032e273fa010b9"} Mar 20 13:46:21 crc kubenswrapper[4973]: I0320 13:46:21.704210 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" podStartSLOduration=6.7041893550000005 podStartE2EDuration="6.704189355s" podCreationTimestamp="2026-03-20 13:46:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:21.672551141 +0000 UTC m=+1502.416220895" watchObservedRunningTime="2026-03-20 13:46:21.704189355 +0000 UTC m=+1502.447859099" Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.037547 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.054980 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.281830 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-77cf5858cd-9xwk6"] Mar 20 13:46:22 crc kubenswrapper[4973]: W0320 13:46:22.362839 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod58e69a08_7d61_4b8a_8ff1_6a9bb59c360d.slice/crio-c04c9ea43ebaf35ec4ce831abb2aa5c50e7456dfb24830774b761f1bfce62d71 WatchSource:0}: Error finding container c04c9ea43ebaf35ec4ce831abb2aa5c50e7456dfb24830774b761f1bfce62d71: Status 404 returned error can't find the container with id 
c04c9ea43ebaf35ec4ce831abb2aa5c50e7456dfb24830774b761f1bfce62d71 Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.496662 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.640727 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-dns-swift-storage-0\") pod \"49e5f7d7-87ea-4274-8d70-57783772d3a1\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.640851 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-ovsdbserver-nb\") pod \"49e5f7d7-87ea-4274-8d70-57783772d3a1\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.641424 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-dns-svc\") pod \"49e5f7d7-87ea-4274-8d70-57783772d3a1\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.641517 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdccf\" (UniqueName: \"kubernetes.io/projected/49e5f7d7-87ea-4274-8d70-57783772d3a1-kube-api-access-xdccf\") pod \"49e5f7d7-87ea-4274-8d70-57783772d3a1\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.641655 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-config\") pod \"49e5f7d7-87ea-4274-8d70-57783772d3a1\" (UID: 
\"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.641733 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-ovsdbserver-sb\") pod \"49e5f7d7-87ea-4274-8d70-57783772d3a1\" (UID: \"49e5f7d7-87ea-4274-8d70-57783772d3a1\") " Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.650525 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49e5f7d7-87ea-4274-8d70-57783772d3a1-kube-api-access-xdccf" (OuterVolumeSpecName: "kube-api-access-xdccf") pod "49e5f7d7-87ea-4274-8d70-57783772d3a1" (UID: "49e5f7d7-87ea-4274-8d70-57783772d3a1"). InnerVolumeSpecName "kube-api-access-xdccf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.701949 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-65b846dd97-gcnmc" event={"ID":"8df11824-a676-4f6a-8f12-ccff6ce1bdc6","Type":"ContainerStarted","Data":"c764eb7820ce41810ce1cf233f35923eb77a4b47b5dd0faf5b75b2691e6fc01a"} Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.710731 4973 generic.go:334] "Generic (PLEG): container finished" podID="9321b726-0c2c-4c9c-a40a-73a387dfb215" containerID="dbc121b549781d39bb93faa395feca0058a205d4f236dd0ec771e4838ca1e94a" exitCode=0 Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.710816 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" event={"ID":"9321b726-0c2c-4c9c-a40a-73a387dfb215","Type":"ContainerDied","Data":"dbc121b549781d39bb93faa395feca0058a205d4f236dd0ec771e4838ca1e94a"} Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.710852 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" 
event={"ID":"9321b726-0c2c-4c9c-a40a-73a387dfb215","Type":"ContainerStarted","Data":"6df5d421a858e6250e1988e8e3f9ed867042f5344c40fd5e20efe26bb22f1f1d"} Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.718062 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"07dbf52e-f333-4466-8e56-b44815529a05","Type":"ContainerStarted","Data":"a0a534c8ec433ab6392bfb7c4233a9b9a17705c83f443d38481620a980aec38a"} Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.741762 4973 generic.go:334] "Generic (PLEG): container finished" podID="49e5f7d7-87ea-4274-8d70-57783772d3a1" containerID="ce64a707f9adc6866b4bbac61765c12e02f5d4e21b96a9faf7c864b3b5abfa0e" exitCode=0 Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.742200 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" event={"ID":"49e5f7d7-87ea-4274-8d70-57783772d3a1","Type":"ContainerDied","Data":"ce64a707f9adc6866b4bbac61765c12e02f5d4e21b96a9faf7c864b3b5abfa0e"} Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.742233 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" event={"ID":"49e5f7d7-87ea-4274-8d70-57783772d3a1","Type":"ContainerDied","Data":"127bf346870d7444f33af9799cc82a67a6da11efa17a2c3b004e2be0beffa323"} Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.742250 4973 scope.go:117] "RemoveContainer" containerID="ce64a707f9adc6866b4bbac61765c12e02f5d4e21b96a9faf7c864b3b5abfa0e" Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.742463 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-59d5ff467f-t8fv8" Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.752681 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77cf5858cd-9xwk6" event={"ID":"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d","Type":"ContainerStarted","Data":"c04c9ea43ebaf35ec4ce831abb2aa5c50e7456dfb24830774b761f1bfce62d71"} Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.758003 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-65b846dd97-gcnmc" podStartSLOduration=3.870292385 podStartE2EDuration="7.757977928s" podCreationTimestamp="2026-03-20 13:46:15 +0000 UTC" firstStartedPulling="2026-03-20 13:46:17.053177467 +0000 UTC m=+1497.796847211" lastFinishedPulling="2026-03-20 13:46:20.94086301 +0000 UTC m=+1501.684532754" observedRunningTime="2026-03-20 13:46:22.723953799 +0000 UTC m=+1503.467623553" watchObservedRunningTime="2026-03-20 13:46:22.757977928 +0000 UTC m=+1503.501647672" Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.762570 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdccf\" (UniqueName: \"kubernetes.io/projected/49e5f7d7-87ea-4274-8d70-57783772d3a1-kube-api-access-xdccf\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.810020 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" event={"ID":"e9a97269-d458-4405-988c-b32339897e4f","Type":"ContainerStarted","Data":"71bca5fc8955002d4a60b1c49595ddd96e90516308699f26c6126a18a1270f89"} Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.839223 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d355b77-7cc9-4184-a223-2bc448d9ca1b","Type":"ContainerStarted","Data":"c15cdb219151792214f1399a96a67096cb49c4ad006083863cd2c26f27d3ad39"} Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.911382 4973 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "49e5f7d7-87ea-4274-8d70-57783772d3a1" (UID: "49e5f7d7-87ea-4274-8d70-57783772d3a1"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.945805 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "49e5f7d7-87ea-4274-8d70-57783772d3a1" (UID: "49e5f7d7-87ea-4274-8d70-57783772d3a1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.959982 4973 scope.go:117] "RemoveContainer" containerID="dd73c2f459eafebe224a7bf3f433ca1c042e3feb0c487e91811e559f070dd70d" Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.973167 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "49e5f7d7-87ea-4274-8d70-57783772d3a1" (UID: "49e5f7d7-87ea-4274-8d70-57783772d3a1"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.975446 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.975472 4973 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-dns-swift-storage-0\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:22 crc kubenswrapper[4973]: I0320 13:46:22.975486 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.015471 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "49e5f7d7-87ea-4274-8d70-57783772d3a1" (UID: "49e5f7d7-87ea-4274-8d70-57783772d3a1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.018473 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-config" (OuterVolumeSpecName: "config") pod "49e5f7d7-87ea-4274-8d70-57783772d3a1" (UID: "49e5f7d7-87ea-4274-8d70-57783772d3a1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.030449 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-68cf78bb54-pdcln" podStartSLOduration=4.18228879 podStartE2EDuration="8.030428604s" podCreationTimestamp="2026-03-20 13:46:15 +0000 UTC" firstStartedPulling="2026-03-20 13:46:16.960920179 +0000 UTC m=+1497.704589923" lastFinishedPulling="2026-03-20 13:46:20.809059993 +0000 UTC m=+1501.552729737" observedRunningTime="2026-03-20 13:46:22.865163253 +0000 UTC m=+1503.608832997" watchObservedRunningTime="2026-03-20 13:46:23.030428604 +0000 UTC m=+1503.774098348"
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.073859 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.127006 4973 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-dns-svc\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.127062 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49e5f7d7-87ea-4274-8d70-57783772d3a1-config\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.290678 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-t8fv8"]
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.315311 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-59d5ff467f-t8fv8"]
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.338196 4973 scope.go:117] "RemoveContainer" containerID="ce64a707f9adc6866b4bbac61765c12e02f5d4e21b96a9faf7c864b3b5abfa0e"
Mar 20 13:46:23 crc kubenswrapper[4973]: E0320 13:46:23.341110 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce64a707f9adc6866b4bbac61765c12e02f5d4e21b96a9faf7c864b3b5abfa0e\": container with ID starting with ce64a707f9adc6866b4bbac61765c12e02f5d4e21b96a9faf7c864b3b5abfa0e not found: ID does not exist" containerID="ce64a707f9adc6866b4bbac61765c12e02f5d4e21b96a9faf7c864b3b5abfa0e"
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.341161 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce64a707f9adc6866b4bbac61765c12e02f5d4e21b96a9faf7c864b3b5abfa0e"} err="failed to get container status \"ce64a707f9adc6866b4bbac61765c12e02f5d4e21b96a9faf7c864b3b5abfa0e\": rpc error: code = NotFound desc = could not find container \"ce64a707f9adc6866b4bbac61765c12e02f5d4e21b96a9faf7c864b3b5abfa0e\": container with ID starting with ce64a707f9adc6866b4bbac61765c12e02f5d4e21b96a9faf7c864b3b5abfa0e not found: ID does not exist"
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.341189 4973 scope.go:117] "RemoveContainer" containerID="dd73c2f459eafebe224a7bf3f433ca1c042e3feb0c487e91811e559f070dd70d"
Mar 20 13:46:23 crc kubenswrapper[4973]: E0320 13:46:23.341518 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd73c2f459eafebe224a7bf3f433ca1c042e3feb0c487e91811e559f070dd70d\": container with ID starting with dd73c2f459eafebe224a7bf3f433ca1c042e3feb0c487e91811e559f070dd70d not found: ID does not exist" containerID="dd73c2f459eafebe224a7bf3f433ca1c042e3feb0c487e91811e559f070dd70d"
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.341561 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd73c2f459eafebe224a7bf3f433ca1c042e3feb0c487e91811e559f070dd70d"} err="failed to get container status \"dd73c2f459eafebe224a7bf3f433ca1c042e3feb0c487e91811e559f070dd70d\": rpc error: code = NotFound desc = could not find container \"dd73c2f459eafebe224a7bf3f433ca1c042e3feb0c487e91811e559f070dd70d\": container with ID starting with dd73c2f459eafebe224a7bf3f433ca1c042e3feb0c487e91811e559f070dd70d not found: ID does not exist"
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.867040 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" event={"ID":"9321b726-0c2c-4c9c-a40a-73a387dfb215","Type":"ContainerStarted","Data":"1ed5313e377ebf010922dc90d59e59d3a8d03018a545e32201c9e941fbefd522"}
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.867450 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5784cf869f-jq7fk"
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.877832 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"07dbf52e-f333-4466-8e56-b44815529a05","Type":"ContainerStarted","Data":"5a0b537c60fe6d546b322aaa5804a3835204b8259c94156fe92808b7b16075fe"}
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.903427 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77cf5858cd-9xwk6" event={"ID":"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d","Type":"ContainerStarted","Data":"8185079e7c55881e7a789b73d0c2cde66bafaaa4cb18f01382bb731f9cef3f0a"}
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.903478 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77cf5858cd-9xwk6" event={"ID":"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d","Type":"ContainerStarted","Data":"3ccab10a39d8b83dbf6322bdc5ebddcba8df48beb7400dbf5d483a59cf18bca1"}
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.904896 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-77cf5858cd-9xwk6"
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.913211 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" podStartSLOduration=4.913188789 podStartE2EDuration="4.913188789s" podCreationTimestamp="2026-03-20 13:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:23.896202506 +0000 UTC m=+1504.639872270" watchObservedRunningTime="2026-03-20 13:46:23.913188789 +0000 UTC m=+1504.656858533"
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.969789 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49e5f7d7-87ea-4274-8d70-57783772d3a1" path="/var/lib/kubelet/pods/49e5f7d7-87ea-4274-8d70-57783772d3a1/volumes"
Mar 20 13:46:23 crc kubenswrapper[4973]: I0320 13:46:23.987275 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-77cf5858cd-9xwk6" podStartSLOduration=4.987255081 podStartE2EDuration="4.987255081s" podCreationTimestamp="2026-03-20 13:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:23.9828371 +0000 UTC m=+1504.726506844" watchObservedRunningTime="2026-03-20 13:46:23.987255081 +0000 UTC m=+1504.730924825"
Mar 20 13:46:24 crc kubenswrapper[4973]: I0320 13:46:24.917902 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"49ac7635-4e3b-47fb-a585-c07724eff983","Type":"ContainerStarted","Data":"a690ace5386c481e1601f112f0f1f1ae547136eb7e9c65c44c37804125c2aa3e"}
Mar 20 13:46:24 crc kubenswrapper[4973]: I0320 13:46:24.920098 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"07dbf52e-f333-4466-8e56-b44815529a05","Type":"ContainerStarted","Data":"55f829e539a8e500c99341aa1e054b0d9f26dde9c8e12cf45a1698d4ce9f2a30"}
Mar 20 13:46:24 crc kubenswrapper[4973]: I0320 13:46:24.920225 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="07dbf52e-f333-4466-8e56-b44815529a05" containerName="cinder-api-log" containerID="cri-o://5a0b537c60fe6d546b322aaa5804a3835204b8259c94156fe92808b7b16075fe" gracePeriod=30
Mar 20 13:46:24 crc kubenswrapper[4973]: I0320 13:46:24.920259 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Mar 20 13:46:24 crc kubenswrapper[4973]: I0320 13:46:24.920352 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="07dbf52e-f333-4466-8e56-b44815529a05" containerName="cinder-api" containerID="cri-o://55f829e539a8e500c99341aa1e054b0d9f26dde9c8e12cf45a1698d4ce9f2a30" gracePeriod=30
Mar 20 13:46:24 crc kubenswrapper[4973]: I0320 13:46:24.930570 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d355b77-7cc9-4184-a223-2bc448d9ca1b","Type":"ContainerStarted","Data":"131d15b9a05a258c4c680a989df302321e53c7be62c04536bb7336f954905cc5"}
Mar 20 13:46:24 crc kubenswrapper[4973]: I0320 13:46:24.946800 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.94677641 podStartE2EDuration="5.94677641s" podCreationTimestamp="2026-03-20 13:46:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:24.943256384 +0000 UTC m=+1505.686926148" watchObservedRunningTime="2026-03-20 13:46:24.94677641 +0000 UTC m=+1505.690446174"
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.147764 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xq9kw"]
Mar 20 13:46:25 crc kubenswrapper[4973]: E0320 13:46:25.148660 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e5f7d7-87ea-4274-8d70-57783772d3a1" containerName="init"
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.148681 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="49e5f7d7-87ea-4274-8d70-57783772d3a1" containerName="init"
Mar 20 13:46:25 crc kubenswrapper[4973]: E0320 13:46:25.148693 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e5f7d7-87ea-4274-8d70-57783772d3a1" containerName="dnsmasq-dns"
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.148699 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="49e5f7d7-87ea-4274-8d70-57783772d3a1" containerName="dnsmasq-dns"
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.148950 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="49e5f7d7-87ea-4274-8d70-57783772d3a1" containerName="dnsmasq-dns"
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.150965 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xq9kw"
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.165512 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xq9kw"]
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.309928 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8448c2a8-df10-4a2f-a14f-c59318a71ceb-catalog-content\") pod \"redhat-operators-xq9kw\" (UID: \"8448c2a8-df10-4a2f-a14f-c59318a71ceb\") " pod="openshift-marketplace/redhat-operators-xq9kw"
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.309992 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkndn\" (UniqueName: \"kubernetes.io/projected/8448c2a8-df10-4a2f-a14f-c59318a71ceb-kube-api-access-kkndn\") pod \"redhat-operators-xq9kw\" (UID: \"8448c2a8-df10-4a2f-a14f-c59318a71ceb\") " pod="openshift-marketplace/redhat-operators-xq9kw"
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.310388 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8448c2a8-df10-4a2f-a14f-c59318a71ceb-utilities\") pod \"redhat-operators-xq9kw\" (UID: \"8448c2a8-df10-4a2f-a14f-c59318a71ceb\") " pod="openshift-marketplace/redhat-operators-xq9kw"
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.412681 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8448c2a8-df10-4a2f-a14f-c59318a71ceb-catalog-content\") pod \"redhat-operators-xq9kw\" (UID: \"8448c2a8-df10-4a2f-a14f-c59318a71ceb\") " pod="openshift-marketplace/redhat-operators-xq9kw"
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.412762 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkndn\" (UniqueName: \"kubernetes.io/projected/8448c2a8-df10-4a2f-a14f-c59318a71ceb-kube-api-access-kkndn\") pod \"redhat-operators-xq9kw\" (UID: \"8448c2a8-df10-4a2f-a14f-c59318a71ceb\") " pod="openshift-marketplace/redhat-operators-xq9kw"
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.412999 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8448c2a8-df10-4a2f-a14f-c59318a71ceb-utilities\") pod \"redhat-operators-xq9kw\" (UID: \"8448c2a8-df10-4a2f-a14f-c59318a71ceb\") " pod="openshift-marketplace/redhat-operators-xq9kw"
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.413301 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8448c2a8-df10-4a2f-a14f-c59318a71ceb-catalog-content\") pod \"redhat-operators-xq9kw\" (UID: \"8448c2a8-df10-4a2f-a14f-c59318a71ceb\") " pod="openshift-marketplace/redhat-operators-xq9kw"
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.413501 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8448c2a8-df10-4a2f-a14f-c59318a71ceb-utilities\") pod \"redhat-operators-xq9kw\" (UID: \"8448c2a8-df10-4a2f-a14f-c59318a71ceb\") " pod="openshift-marketplace/redhat-operators-xq9kw"
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.538627 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkndn\" (UniqueName: \"kubernetes.io/projected/8448c2a8-df10-4a2f-a14f-c59318a71ceb-kube-api-access-kkndn\") pod \"redhat-operators-xq9kw\" (UID: \"8448c2a8-df10-4a2f-a14f-c59318a71ceb\") " pod="openshift-marketplace/redhat-operators-xq9kw"
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.838313 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xq9kw"
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.968737 4973 generic.go:334] "Generic (PLEG): container finished" podID="07dbf52e-f333-4466-8e56-b44815529a05" containerID="5a0b537c60fe6d546b322aaa5804a3835204b8259c94156fe92808b7b16075fe" exitCode=143
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.985373 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d355b77-7cc9-4184-a223-2bc448d9ca1b","Type":"ContainerStarted","Data":"8dc127458f2a429cd82307b7a3a2d526590507915e3dd783c3131ec53c85c887"}
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.985414 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"49ac7635-4e3b-47fb-a585-c07724eff983","Type":"ContainerStarted","Data":"ca1b80ec5b60dcbff59f89aba489d788d387a3138452d2de099b9d2e20f12b3c"}
Mar 20 13:46:25 crc kubenswrapper[4973]: I0320 13:46:25.985428 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"07dbf52e-f333-4466-8e56-b44815529a05","Type":"ContainerDied","Data":"5a0b537c60fe6d546b322aaa5804a3835204b8259c94156fe92808b7b16075fe"}
Mar 20 13:46:26 crc kubenswrapper[4973]: I0320 13:46:26.000733 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.885909 podStartE2EDuration="8.000711288s" podCreationTimestamp="2026-03-20 13:46:18 +0000 UTC" firstStartedPulling="2026-03-20 13:46:21.291178982 +0000 UTC m=+1502.034848726" lastFinishedPulling="2026-03-20 13:46:22.40598127 +0000 UTC m=+1503.149651014" observedRunningTime="2026-03-20 13:46:25.992606316 +0000 UTC m=+1506.736276060" watchObservedRunningTime="2026-03-20 13:46:26.000711288 +0000 UTC m=+1506.744381032"
Mar 20 13:46:26 crc kubenswrapper[4973]: I0320 13:46:26.495727 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xq9kw"]
Mar 20 13:46:26 crc kubenswrapper[4973]: W0320 13:46:26.518926 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8448c2a8_df10_4a2f_a14f_c59318a71ceb.slice/crio-e411e7a065c1cb41cb65edb954f04b3c6bcd549ef41983e7b283b4f592f9ec1f WatchSource:0}: Error finding container e411e7a065c1cb41cb65edb954f04b3c6bcd549ef41983e7b283b4f592f9ec1f: Status 404 returned error can't find the container with id e411e7a065c1cb41cb65edb954f04b3c6bcd549ef41983e7b283b4f592f9ec1f
Mar 20 13:46:26 crc kubenswrapper[4973]: I0320 13:46:26.826405 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5874c8d58f-l5f6s"]
Mar 20 13:46:26 crc kubenswrapper[4973]: I0320 13:46:26.828484 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:26 crc kubenswrapper[4973]: I0320 13:46:26.832158 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc"
Mar 20 13:46:26 crc kubenswrapper[4973]: I0320 13:46:26.839739 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc"
Mar 20 13:46:26 crc kubenswrapper[4973]: I0320 13:46:26.840880 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5874c8d58f-l5f6s"]
Mar 20 13:46:26 crc kubenswrapper[4973]: I0320 13:46:26.948465 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-combined-ca-bundle\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:26 crc kubenswrapper[4973]: I0320 13:46:26.948559 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-public-tls-certs\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:26 crc kubenswrapper[4973]: I0320 13:46:26.948694 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tngj7\" (UniqueName: \"kubernetes.io/projected/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-kube-api-access-tngj7\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:26 crc kubenswrapper[4973]: I0320 13:46:26.948815 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-ovndb-tls-certs\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:26 crc kubenswrapper[4973]: I0320 13:46:26.948954 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-httpd-config\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:26 crc kubenswrapper[4973]: I0320 13:46:26.949010 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-config\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:26 crc kubenswrapper[4973]: I0320 13:46:26.949056 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-internal-tls-certs\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:26 crc kubenswrapper[4973]: I0320 13:46:26.986568 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d355b77-7cc9-4184-a223-2bc448d9ca1b","Type":"ContainerStarted","Data":"d73daea7a5bcd0d9a89125476231ffd70c496368c325a05bfa846fdbdefd8217"}
Mar 20 13:46:26 crc kubenswrapper[4973]: I0320 13:46:26.989094 4973 generic.go:334] "Generic (PLEG): container finished" podID="8448c2a8-df10-4a2f-a14f-c59318a71ceb" containerID="4dff7158b55c26562ba3c676d6d34d3ddc7ebcd33b3559da1bbf0b34d6416451" exitCode=0
Mar 20 13:46:26 crc kubenswrapper[4973]: I0320 13:46:26.989203 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xq9kw" event={"ID":"8448c2a8-df10-4a2f-a14f-c59318a71ceb","Type":"ContainerDied","Data":"4dff7158b55c26562ba3c676d6d34d3ddc7ebcd33b3559da1bbf0b34d6416451"}
Mar 20 13:46:26 crc kubenswrapper[4973]: I0320 13:46:26.989266 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xq9kw" event={"ID":"8448c2a8-df10-4a2f-a14f-c59318a71ceb","Type":"ContainerStarted","Data":"e411e7a065c1cb41cb65edb954f04b3c6bcd549ef41983e7b283b4f592f9ec1f"}
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.050880 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tngj7\" (UniqueName: \"kubernetes.io/projected/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-kube-api-access-tngj7\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.050992 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-ovndb-tls-certs\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.051079 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-httpd-config\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.051109 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-config\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.051130 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-internal-tls-certs\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.051170 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-combined-ca-bundle\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.051199 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-public-tls-certs\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.061984 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-internal-tls-certs\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.062314 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-public-tls-certs\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.062884 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-httpd-config\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.070377 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-combined-ca-bundle\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.071214 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-config\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.071994 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-ovndb-tls-certs\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.101159 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tngj7\" (UniqueName: \"kubernetes.io/projected/983eb56d-9c01-48a2-bcd1-3ea59f11bc01-kube-api-access-tngj7\") pod \"neutron-5874c8d58f-l5f6s\" (UID: \"983eb56d-9c01-48a2-bcd1-3ea59f11bc01\") " pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.156289 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.332242 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5589965cd6-qwjps"]
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.335083 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.344433 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.344756 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.399676 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5589965cd6-qwjps"]
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.467394 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfmqg\" (UniqueName: \"kubernetes.io/projected/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-kube-api-access-dfmqg\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.467501 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-logs\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.467593 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-config-data\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.467946 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-public-tls-certs\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.468003 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-config-data-custom\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.468139 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-internal-tls-certs\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.468181 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-combined-ca-bundle\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.571676 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfmqg\" (UniqueName: \"kubernetes.io/projected/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-kube-api-access-dfmqg\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.572092 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-logs\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.572146 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-config-data\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.572590 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-public-tls-certs\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.572644 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-config-data-custom\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.572765 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-internal-tls-certs\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.572800 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-combined-ca-bundle\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.585194 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-logs\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.589132 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-config-data-custom\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.609431 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-config-data\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.617646 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-public-tls-certs\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.619459 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-combined-ca-bundle\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.623857 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-internal-tls-certs\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.671019 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfmqg\" (UniqueName: \"kubernetes.io/projected/bd8bb058-5f72-497c-9bf2-7ac7b932cc5d-kube-api-access-dfmqg\") pod \"barbican-api-5589965cd6-qwjps\" (UID: \"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d\") " pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:27 crc kubenswrapper[4973]: I0320 13:46:27.672123 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5589965cd6-qwjps"
Mar 20 13:46:28 crc kubenswrapper[4973]: I0320 13:46:28.203599 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5874c8d58f-l5f6s"]
Mar 20 13:46:28 crc kubenswrapper[4973]: I0320 13:46:28.590448 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5589965cd6-qwjps"]
Mar 20 13:46:28 crc kubenswrapper[4973]: W0320 13:46:28.591133 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbd8bb058_5f72_497c_9bf2_7ac7b932cc5d.slice/crio-d4c1287b856104711a6cc105265b6fc4eeae00ae73ad361a256e5dad981e5172 WatchSource:0}: Error finding container d4c1287b856104711a6cc105265b6fc4eeae00ae73ad361a256e5dad981e5172: Status 404 returned error can't find the container with id d4c1287b856104711a6cc105265b6fc4eeae00ae73ad361a256e5dad981e5172
Mar 20 13:46:29 crc kubenswrapper[4973]: I0320 13:46:29.052919 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5589965cd6-qwjps" event={"ID":"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d","Type":"ContainerStarted","Data":"d4c1287b856104711a6cc105265b6fc4eeae00ae73ad361a256e5dad981e5172"}
Mar 20 13:46:29 crc kubenswrapper[4973]: I0320 13:46:29.064055 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xq9kw" event={"ID":"8448c2a8-df10-4a2f-a14f-c59318a71ceb","Type":"ContainerStarted","Data":"3894a5999a5ee709f6f1db2cabcaed6dc09283c505d4838f88b70efe0cf86696"}
Mar 20 13:46:29 crc kubenswrapper[4973]: I0320 13:46:29.065052 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5874c8d58f-l5f6s" event={"ID":"983eb56d-9c01-48a2-bcd1-3ea59f11bc01","Type":"ContainerStarted","Data":"e5dd4bb7327d973d3493bc877913691c0bb893c67c4908311945e57bd74c17ee"}
Mar 20 13:46:29 crc kubenswrapper[4973]: I0320 13:46:29.090406 4973 kubelet.go:2453] "SyncLoop (PLEG): event for
pod" pod="openstack/ceilometer-0" event={"ID":"0d355b77-7cc9-4184-a223-2bc448d9ca1b","Type":"ContainerStarted","Data":"81f67f519f04127b155b4be39fa9edd64bf13174df55ce6e45c34ac8abb8482b"} Mar 20 13:46:29 crc kubenswrapper[4973]: I0320 13:46:29.248473 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 13:46:29 crc kubenswrapper[4973]: I0320 13:46:29.250550 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/cinder-scheduler-0" podUID="49ac7635-4e3b-47fb-a585-c07724eff983" containerName="cinder-scheduler" probeResult="failure" output="Get \"http://10.217.0.207:8080/\": dial tcp 10.217.0.207:8080: connect: connection refused" Mar 20 13:46:29 crc kubenswrapper[4973]: I0320 13:46:29.641590 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:46:29 crc kubenswrapper[4973]: I0320 13:46:29.732140 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-jwvg7"] Mar 20 13:46:29 crc kubenswrapper[4973]: I0320 13:46:29.738110 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7" podUID="a0006a98-e902-4378-b895-cf31a555b3f6" containerName="dnsmasq-dns" containerID="cri-o://50aaf9928844e9feee6b2c0d1b4bd40f41ffcc0751174b6b279ca3eadddbbde6" gracePeriod=10 Mar 20 13:46:30 crc kubenswrapper[4973]: I0320 13:46:30.176858 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5874c8d58f-l5f6s" event={"ID":"983eb56d-9c01-48a2-bcd1-3ea59f11bc01","Type":"ContainerStarted","Data":"4aab57c9e8b65c24b2c1cd425110a90489592bf4fa19368f0f2142fe50974363"} Mar 20 13:46:30 crc kubenswrapper[4973]: I0320 13:46:30.177236 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5874c8d58f-l5f6s" 
event={"ID":"983eb56d-9c01-48a2-bcd1-3ea59f11bc01","Type":"ContainerStarted","Data":"a8b6191c5876eae890221e7ff769af8a91e44596f5dda4e3294c8b6d4b4594b9"} Mar 20 13:46:30 crc kubenswrapper[4973]: I0320 13:46:30.178875 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-5874c8d58f-l5f6s" Mar 20 13:46:30 crc kubenswrapper[4973]: I0320 13:46:30.189856 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5589965cd6-qwjps" event={"ID":"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d","Type":"ContainerStarted","Data":"2bc403fdefb78f1c7c1bca8dcd6aadc0609a3259b8b83e50b9501aae8a8402b1"} Mar 20 13:46:30 crc kubenswrapper[4973]: I0320 13:46:30.189923 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5589965cd6-qwjps" event={"ID":"bd8bb058-5f72-497c-9bf2-7ac7b932cc5d","Type":"ContainerStarted","Data":"7147c4c44275e0b9958ab35d07441bb2c89ebbd0f036fca8aade7ae023de7a94"} Mar 20 13:46:30 crc kubenswrapper[4973]: I0320 13:46:30.191214 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5589965cd6-qwjps" Mar 20 13:46:30 crc kubenswrapper[4973]: I0320 13:46:30.191245 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5589965cd6-qwjps" Mar 20 13:46:30 crc kubenswrapper[4973]: I0320 13:46:30.191572 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:46:30 crc kubenswrapper[4973]: I0320 13:46:30.227580 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-5874c8d58f-l5f6s" podStartSLOduration=4.227558828 podStartE2EDuration="4.227558828s" podCreationTimestamp="2026-03-20 13:46:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:30.215855138 +0000 UTC m=+1510.959524882" watchObservedRunningTime="2026-03-20 
13:46:30.227558828 +0000 UTC m=+1510.971228572" Mar 20 13:46:30 crc kubenswrapper[4973]: I0320 13:46:30.258609 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5589965cd6-qwjps" podStartSLOduration=3.258587105 podStartE2EDuration="3.258587105s" podCreationTimestamp="2026-03-20 13:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:30.246420342 +0000 UTC m=+1510.990090116" watchObservedRunningTime="2026-03-20 13:46:30.258587105 +0000 UTC m=+1511.002256849" Mar 20 13:46:30 crc kubenswrapper[4973]: I0320 13:46:30.289899 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=6.126879362 podStartE2EDuration="12.289878299s" podCreationTimestamp="2026-03-20 13:46:18 +0000 UTC" firstStartedPulling="2026-03-20 13:46:22.112778297 +0000 UTC m=+1502.856448032" lastFinishedPulling="2026-03-20 13:46:28.275777225 +0000 UTC m=+1509.019446969" observedRunningTime="2026-03-20 13:46:30.280853533 +0000 UTC m=+1511.024523277" watchObservedRunningTime="2026-03-20 13:46:30.289878299 +0000 UTC m=+1511.033548043" Mar 20 13:46:30 crc kubenswrapper[4973]: I0320 13:46:30.647643 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.293778 4973 generic.go:334] "Generic (PLEG): container finished" podID="a0006a98-e902-4378-b895-cf31a555b3f6" containerID="50aaf9928844e9feee6b2c0d1b4bd40f41ffcc0751174b6b279ca3eadddbbde6" exitCode=0 Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.294028 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7" event={"ID":"a0006a98-e902-4378-b895-cf31a555b3f6","Type":"ContainerDied","Data":"50aaf9928844e9feee6b2c0d1b4bd40f41ffcc0751174b6b279ca3eadddbbde6"} Mar 20 13:46:31 crc 
kubenswrapper[4973]: I0320 13:46:31.294073 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7" event={"ID":"a0006a98-e902-4378-b895-cf31a555b3f6","Type":"ContainerDied","Data":"f59ea8c1e64277ab1e80a4d15a9bfa66999488a923b8ae98fda1ed026aa1acf2"} Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.294084 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f59ea8c1e64277ab1e80a4d15a9bfa66999488a923b8ae98fda1ed026aa1acf2" Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.372839 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7" Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.473309 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-dns-swift-storage-0\") pod \"a0006a98-e902-4378-b895-cf31a555b3f6\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.473465 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-ovsdbserver-sb\") pod \"a0006a98-e902-4378-b895-cf31a555b3f6\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.473532 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-config\") pod \"a0006a98-e902-4378-b895-cf31a555b3f6\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.473556 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-dns-svc\") 
pod \"a0006a98-e902-4378-b895-cf31a555b3f6\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.473669 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-ovsdbserver-nb\") pod \"a0006a98-e902-4378-b895-cf31a555b3f6\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.473769 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkpsf\" (UniqueName: \"kubernetes.io/projected/a0006a98-e902-4378-b895-cf31a555b3f6-kube-api-access-vkpsf\") pod \"a0006a98-e902-4378-b895-cf31a555b3f6\" (UID: \"a0006a98-e902-4378-b895-cf31a555b3f6\") " Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.499683 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0006a98-e902-4378-b895-cf31a555b3f6-kube-api-access-vkpsf" (OuterVolumeSpecName: "kube-api-access-vkpsf") pod "a0006a98-e902-4378-b895-cf31a555b3f6" (UID: "a0006a98-e902-4378-b895-cf31a555b3f6"). InnerVolumeSpecName "kube-api-access-vkpsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.584183 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a0006a98-e902-4378-b895-cf31a555b3f6" (UID: "a0006a98-e902-4378-b895-cf31a555b3f6"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.593235 4973 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.593272 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkpsf\" (UniqueName: \"kubernetes.io/projected/a0006a98-e902-4378-b895-cf31a555b3f6-kube-api-access-vkpsf\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.612955 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56d7b66f8b-74fsz" podUID="e9f37ccd-dcfc-472e-be04-7273945fa9ef" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.206:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.614602 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a0006a98-e902-4378-b895-cf31a555b3f6" (UID: "a0006a98-e902-4378-b895-cf31a555b3f6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.622660 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a0006a98-e902-4378-b895-cf31a555b3f6" (UID: "a0006a98-e902-4378-b895-cf31a555b3f6"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.668835 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-config" (OuterVolumeSpecName: "config") pod "a0006a98-e902-4378-b895-cf31a555b3f6" (UID: "a0006a98-e902-4378-b895-cf31a555b3f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.694558 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a0006a98-e902-4378-b895-cf31a555b3f6" (UID: "a0006a98-e902-4378-b895-cf31a555b3f6"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.695809 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.695840 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.695850 4973 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:31 crc kubenswrapper[4973]: I0320 13:46:31.695860 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a0006a98-e902-4378-b895-cf31a555b3f6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:32 crc kubenswrapper[4973]: 
I0320 13:46:32.306157 4973 generic.go:334] "Generic (PLEG): container finished" podID="8448c2a8-df10-4a2f-a14f-c59318a71ceb" containerID="3894a5999a5ee709f6f1db2cabcaed6dc09283c505d4838f88b70efe0cf86696" exitCode=0 Mar 20 13:46:32 crc kubenswrapper[4973]: I0320 13:46:32.306232 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xq9kw" event={"ID":"8448c2a8-df10-4a2f-a14f-c59318a71ceb","Type":"ContainerDied","Data":"3894a5999a5ee709f6f1db2cabcaed6dc09283c505d4838f88b70efe0cf86696"} Mar 20 13:46:32 crc kubenswrapper[4973]: I0320 13:46:32.306743 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8b5c85b87-jwvg7" Mar 20 13:46:32 crc kubenswrapper[4973]: I0320 13:46:32.376580 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-jwvg7"] Mar 20 13:46:32 crc kubenswrapper[4973]: I0320 13:46:32.388006 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8b5c85b87-jwvg7"] Mar 20 13:46:33 crc kubenswrapper[4973]: I0320 13:46:33.320724 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xq9kw" event={"ID":"8448c2a8-df10-4a2f-a14f-c59318a71ceb","Type":"ContainerStarted","Data":"822ccc28a7ff7901b030a3684f7318dfb904c5e0eb49d975499614fe8d732270"} Mar 20 13:46:33 crc kubenswrapper[4973]: I0320 13:46:33.966489 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0006a98-e902-4378-b895-cf31a555b3f6" path="/var/lib/kubelet/pods/a0006a98-e902-4378-b895-cf31a555b3f6/volumes" Mar 20 13:46:34 crc kubenswrapper[4973]: I0320 13:46:34.566401 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:34 crc kubenswrapper[4973]: I0320 13:46:34.611331 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xq9kw" 
podStartSLOduration=3.900122615 podStartE2EDuration="9.61131116s" podCreationTimestamp="2026-03-20 13:46:25 +0000 UTC" firstStartedPulling="2026-03-20 13:46:26.991588382 +0000 UTC m=+1507.735258136" lastFinishedPulling="2026-03-20 13:46:32.702776937 +0000 UTC m=+1513.446446681" observedRunningTime="2026-03-20 13:46:33.370565435 +0000 UTC m=+1514.114235179" watchObservedRunningTime="2026-03-20 13:46:34.61131116 +0000 UTC m=+1515.354980904" Mar 20 13:46:34 crc kubenswrapper[4973]: I0320 13:46:34.753146 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-7b85d6bdf6-j78f6" Mar 20 13:46:34 crc kubenswrapper[4973]: I0320 13:46:34.841069 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 13:46:34 crc kubenswrapper[4973]: I0320 13:46:34.921329 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.132530 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.173894 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.273437 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 13:46:35 crc kubenswrapper[4973]: E0320 13:46:35.274081 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0006a98-e902-4378-b895-cf31a555b3f6" containerName="init" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.274104 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0006a98-e902-4378-b895-cf31a555b3f6" containerName="init" Mar 20 13:46:35 crc kubenswrapper[4973]: E0320 13:46:35.274145 4973 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a0006a98-e902-4378-b895-cf31a555b3f6" containerName="dnsmasq-dns" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.274156 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0006a98-e902-4378-b895-cf31a555b3f6" containerName="dnsmasq-dns" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.274461 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0006a98-e902-4378-b895-cf31a555b3f6" containerName="dnsmasq-dns" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.275573 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.281183 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.281530 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.281666 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-hwkj9" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.288074 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.354017 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="49ac7635-4e3b-47fb-a585-c07724eff983" containerName="cinder-scheduler" containerID="cri-o://a690ace5386c481e1601f112f0f1f1ae547136eb7e9c65c44c37804125c2aa3e" gracePeriod=30 Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.354437 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="49ac7635-4e3b-47fb-a585-c07724eff983" containerName="probe" containerID="cri-o://ca1b80ec5b60dcbff59f89aba489d788d387a3138452d2de099b9d2e20f12b3c" 
gracePeriod=30 Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.383888 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-85f659f75b-sh7fp" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.417538 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9de4b166-9cce-4044-ae28-6583535ce40d-openstack-config-secret\") pod \"openstackclient\" (UID: \"9de4b166-9cce-4044-ae28-6583535ce40d\") " pod="openstack/openstackclient" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.417621 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9de4b166-9cce-4044-ae28-6583535ce40d-openstack-config\") pod \"openstackclient\" (UID: \"9de4b166-9cce-4044-ae28-6583535ce40d\") " pod="openstack/openstackclient" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.417963 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrp4b\" (UniqueName: \"kubernetes.io/projected/9de4b166-9cce-4044-ae28-6583535ce40d-kube-api-access-qrp4b\") pod \"openstackclient\" (UID: \"9de4b166-9cce-4044-ae28-6583535ce40d\") " pod="openstack/openstackclient" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.418000 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9de4b166-9cce-4044-ae28-6583535ce40d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9de4b166-9cce-4044-ae28-6583535ce40d\") " pod="openstack/openstackclient" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.484006 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d746875b8-pt6tm"] Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.520432 4973 
prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="07dbf52e-f333-4466-8e56-b44815529a05" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.211:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.522683 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrp4b\" (UniqueName: \"kubernetes.io/projected/9de4b166-9cce-4044-ae28-6583535ce40d-kube-api-access-qrp4b\") pod \"openstackclient\" (UID: \"9de4b166-9cce-4044-ae28-6583535ce40d\") " pod="openstack/openstackclient" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.522716 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9de4b166-9cce-4044-ae28-6583535ce40d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9de4b166-9cce-4044-ae28-6583535ce40d\") " pod="openstack/openstackclient" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.522781 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9de4b166-9cce-4044-ae28-6583535ce40d-openstack-config-secret\") pod \"openstackclient\" (UID: \"9de4b166-9cce-4044-ae28-6583535ce40d\") " pod="openstack/openstackclient" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.522841 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9de4b166-9cce-4044-ae28-6583535ce40d-openstack-config\") pod \"openstackclient\" (UID: \"9de4b166-9cce-4044-ae28-6583535ce40d\") " pod="openstack/openstackclient" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.524327 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/9de4b166-9cce-4044-ae28-6583535ce40d-openstack-config\") pod \"openstackclient\" (UID: \"9de4b166-9cce-4044-ae28-6583535ce40d\") " pod="openstack/openstackclient" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.540018 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9de4b166-9cce-4044-ae28-6583535ce40d-openstack-config-secret\") pod \"openstackclient\" (UID: \"9de4b166-9cce-4044-ae28-6583535ce40d\") " pod="openstack/openstackclient" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.546977 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9de4b166-9cce-4044-ae28-6583535ce40d-combined-ca-bundle\") pod \"openstackclient\" (UID: \"9de4b166-9cce-4044-ae28-6583535ce40d\") " pod="openstack/openstackclient" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.559111 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrp4b\" (UniqueName: \"kubernetes.io/projected/9de4b166-9cce-4044-ae28-6583535ce40d-kube-api-access-qrp4b\") pod \"openstackclient\" (UID: \"9de4b166-9cce-4044-ae28-6583535ce40d\") " pod="openstack/openstackclient" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.604606 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.703618 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.727503 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.758867 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.765329 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.774799 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.839516 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xq9kw" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.845469 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xq9kw" Mar 20 13:46:35 crc kubenswrapper[4973]: E0320 13:46:35.908635 4973 log.go:32] "RunPodSandbox from runtime service failed" err=< Mar 20 13:46:35 crc kubenswrapper[4973]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_9de4b166-9cce-4044-ae28-6583535ce40d_0(518cc0e2958f3e7e5a56bd7198e94de49c9847c3a55a47f700fcb2c42c46c1af): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"518cc0e2958f3e7e5a56bd7198e94de49c9847c3a55a47f700fcb2c42c46c1af" Netns:"/var/run/netns/d916a40d-fe1c-4ce4-a651-f437bf69d763" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=518cc0e2958f3e7e5a56bd7198e94de49c9847c3a55a47f700fcb2c42c46c1af;K8S_POD_UID=9de4b166-9cce-4044-ae28-6583535ce40d" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/9de4b166-9cce-4044-ae28-6583535ce40d]: expected pod UID "9de4b166-9cce-4044-ae28-6583535ce40d" but got "5e16c419-72e5-4d2c-bda0-3a0f6ec97aac" from Kube API Mar 20 13:46:35 crc kubenswrapper[4973]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:46:35 crc kubenswrapper[4973]: > Mar 20 13:46:35 crc kubenswrapper[4973]: E0320 13:46:35.908728 4973 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 20 13:46:35 crc kubenswrapper[4973]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_9de4b166-9cce-4044-ae28-6583535ce40d_0(518cc0e2958f3e7e5a56bd7198e94de49c9847c3a55a47f700fcb2c42c46c1af): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"518cc0e2958f3e7e5a56bd7198e94de49c9847c3a55a47f700fcb2c42c46c1af" Netns:"/var/run/netns/d916a40d-fe1c-4ce4-a651-f437bf69d763" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=518cc0e2958f3e7e5a56bd7198e94de49c9847c3a55a47f700fcb2c42c46c1af;K8S_POD_UID=9de4b166-9cce-4044-ae28-6583535ce40d" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: 
[openstack/openstackclient/9de4b166-9cce-4044-ae28-6583535ce40d]: expected pod UID "9de4b166-9cce-4044-ae28-6583535ce40d" but got "5e16c419-72e5-4d2c-bda0-3a0f6ec97aac" from Kube API Mar 20 13:46:35 crc kubenswrapper[4973]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 20 13:46:35 crc kubenswrapper[4973]: > pod="openstack/openstackclient" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.951719 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5e16c419-72e5-4d2c-bda0-3a0f6ec97aac-openstack-config\") pod \"openstackclient\" (UID: \"5e16c419-72e5-4d2c-bda0-3a0f6ec97aac\") " pod="openstack/openstackclient" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.951887 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7wtk\" (UniqueName: \"kubernetes.io/projected/5e16c419-72e5-4d2c-bda0-3a0f6ec97aac-kube-api-access-v7wtk\") pod \"openstackclient\" (UID: \"5e16c419-72e5-4d2c-bda0-3a0f6ec97aac\") " pod="openstack/openstackclient" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.951974 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e16c419-72e5-4d2c-bda0-3a0f6ec97aac-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5e16c419-72e5-4d2c-bda0-3a0f6ec97aac\") " pod="openstack/openstackclient" Mar 20 13:46:35 crc kubenswrapper[4973]: I0320 13:46:35.952039 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5e16c419-72e5-4d2c-bda0-3a0f6ec97aac-openstack-config-secret\") pod \"openstackclient\" (UID: \"5e16c419-72e5-4d2c-bda0-3a0f6ec97aac\") " pod="openstack/openstackclient" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.058851 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e16c419-72e5-4d2c-bda0-3a0f6ec97aac-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5e16c419-72e5-4d2c-bda0-3a0f6ec97aac\") " pod="openstack/openstackclient" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.058934 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5e16c419-72e5-4d2c-bda0-3a0f6ec97aac-openstack-config-secret\") pod \"openstackclient\" (UID: \"5e16c419-72e5-4d2c-bda0-3a0f6ec97aac\") " pod="openstack/openstackclient" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.059028 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5e16c419-72e5-4d2c-bda0-3a0f6ec97aac-openstack-config\") pod \"openstackclient\" (UID: \"5e16c419-72e5-4d2c-bda0-3a0f6ec97aac\") " pod="openstack/openstackclient" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.059136 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7wtk\" (UniqueName: \"kubernetes.io/projected/5e16c419-72e5-4d2c-bda0-3a0f6ec97aac-kube-api-access-v7wtk\") pod \"openstackclient\" (UID: \"5e16c419-72e5-4d2c-bda0-3a0f6ec97aac\") " pod="openstack/openstackclient" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.065248 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/5e16c419-72e5-4d2c-bda0-3a0f6ec97aac-openstack-config\") pod 
\"openstackclient\" (UID: \"5e16c419-72e5-4d2c-bda0-3a0f6ec97aac\") " pod="openstack/openstackclient" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.069260 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/5e16c419-72e5-4d2c-bda0-3a0f6ec97aac-openstack-config-secret\") pod \"openstackclient\" (UID: \"5e16c419-72e5-4d2c-bda0-3a0f6ec97aac\") " pod="openstack/openstackclient" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.079962 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.080572 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7wtk\" (UniqueName: \"kubernetes.io/projected/5e16c419-72e5-4d2c-bda0-3a0f6ec97aac-kube-api-access-v7wtk\") pod \"openstackclient\" (UID: \"5e16c419-72e5-4d2c-bda0-3a0f6ec97aac\") " pod="openstack/openstackclient" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.083065 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e16c419-72e5-4d2c-bda0-3a0f6ec97aac-combined-ca-bundle\") pod \"openstackclient\" (UID: \"5e16c419-72e5-4d2c-bda0-3a0f6ec97aac\") " pod="openstack/openstackclient" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.120938 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.405743 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-d746875b8-pt6tm" podUID="260048b8-b24c-48f1-bfb4-12b8936a1249" containerName="placement-log" containerID="cri-o://4d3077f09f571751f3f39f9330b353cc1cbc495f8ed9f9f2608e1a2965abc600" gracePeriod=30 Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.405903 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.410970 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-d746875b8-pt6tm" podUID="260048b8-b24c-48f1-bfb4-12b8936a1249" containerName="placement-api" containerID="cri-o://510b8b9ee83268c66c5d91a566cdb00d5c413d265f5a1a78c47471f704b4d089" gracePeriod=30 Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.485786 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.498646 4973 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9de4b166-9cce-4044-ae28-6583535ce40d" podUID="5e16c419-72e5-4d2c-bda0-3a0f6ec97aac" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.590165 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrp4b\" (UniqueName: \"kubernetes.io/projected/9de4b166-9cce-4044-ae28-6583535ce40d-kube-api-access-qrp4b\") pod \"9de4b166-9cce-4044-ae28-6583535ce40d\" (UID: \"9de4b166-9cce-4044-ae28-6583535ce40d\") " Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.590248 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9de4b166-9cce-4044-ae28-6583535ce40d-openstack-config\") pod \"9de4b166-9cce-4044-ae28-6583535ce40d\" (UID: \"9de4b166-9cce-4044-ae28-6583535ce40d\") " Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.590313 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9de4b166-9cce-4044-ae28-6583535ce40d-combined-ca-bundle\") pod \"9de4b166-9cce-4044-ae28-6583535ce40d\" (UID: \"9de4b166-9cce-4044-ae28-6583535ce40d\") " Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.590517 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9de4b166-9cce-4044-ae28-6583535ce40d-openstack-config-secret\") pod \"9de4b166-9cce-4044-ae28-6583535ce40d\" (UID: \"9de4b166-9cce-4044-ae28-6583535ce40d\") " Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.597435 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/9de4b166-9cce-4044-ae28-6583535ce40d-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "9de4b166-9cce-4044-ae28-6583535ce40d" (UID: "9de4b166-9cce-4044-ae28-6583535ce40d"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.606550 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de4b166-9cce-4044-ae28-6583535ce40d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9de4b166-9cce-4044-ae28-6583535ce40d" (UID: "9de4b166-9cce-4044-ae28-6583535ce40d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.608532 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9de4b166-9cce-4044-ae28-6583535ce40d-kube-api-access-qrp4b" (OuterVolumeSpecName: "kube-api-access-qrp4b") pod "9de4b166-9cce-4044-ae28-6583535ce40d" (UID: "9de4b166-9cce-4044-ae28-6583535ce40d"). InnerVolumeSpecName "kube-api-access-qrp4b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.610421 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de4b166-9cce-4044-ae28-6583535ce40d-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "9de4b166-9cce-4044-ae28-6583535ce40d" (UID: "9de4b166-9cce-4044-ae28-6583535ce40d"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.694386 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrp4b\" (UniqueName: \"kubernetes.io/projected/9de4b166-9cce-4044-ae28-6583535ce40d-kube-api-access-qrp4b\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.694423 4973 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/9de4b166-9cce-4044-ae28-6583535ce40d-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.694436 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9de4b166-9cce-4044-ae28-6583535ce40d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.694450 4973 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/9de4b166-9cce-4044-ae28-6583535ce40d-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:36 crc kubenswrapper[4973]: I0320 13:46:36.980680 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xq9kw" podUID="8448c2a8-df10-4a2f-a14f-c59318a71ceb" containerName="registry-server" probeResult="failure" output=< Mar 20 13:46:36 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 13:46:36 crc kubenswrapper[4973]: > Mar 20 13:46:37 crc kubenswrapper[4973]: W0320 13:46:37.038469 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5e16c419_72e5_4d2c_bda0_3a0f6ec97aac.slice/crio-742f9decb768e2ebcaee2c587824ec75507f578fea1ce00d423066ab64c6abed WatchSource:0}: Error finding container 742f9decb768e2ebcaee2c587824ec75507f578fea1ce00d423066ab64c6abed: 
Status 404 returned error can't find the container with id 742f9decb768e2ebcaee2c587824ec75507f578fea1ce00d423066ab64c6abed Mar 20 13:46:37 crc kubenswrapper[4973]: I0320 13:46:37.043303 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 20 13:46:37 crc kubenswrapper[4973]: I0320 13:46:37.496038 4973 generic.go:334] "Generic (PLEG): container finished" podID="260048b8-b24c-48f1-bfb4-12b8936a1249" containerID="4d3077f09f571751f3f39f9330b353cc1cbc495f8ed9f9f2608e1a2965abc600" exitCode=143 Mar 20 13:46:37 crc kubenswrapper[4973]: I0320 13:46:37.496238 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d746875b8-pt6tm" event={"ID":"260048b8-b24c-48f1-bfb4-12b8936a1249","Type":"ContainerDied","Data":"4d3077f09f571751f3f39f9330b353cc1cbc495f8ed9f9f2608e1a2965abc600"} Mar 20 13:46:37 crc kubenswrapper[4973]: I0320 13:46:37.513653 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5e16c419-72e5-4d2c-bda0-3a0f6ec97aac","Type":"ContainerStarted","Data":"742f9decb768e2ebcaee2c587824ec75507f578fea1ce00d423066ab64c6abed"} Mar 20 13:46:37 crc kubenswrapper[4973]: I0320 13:46:37.548970 4973 generic.go:334] "Generic (PLEG): container finished" podID="49ac7635-4e3b-47fb-a585-c07724eff983" containerID="ca1b80ec5b60dcbff59f89aba489d788d387a3138452d2de099b9d2e20f12b3c" exitCode=0 Mar 20 13:46:37 crc kubenswrapper[4973]: I0320 13:46:37.549006 4973 generic.go:334] "Generic (PLEG): container finished" podID="49ac7635-4e3b-47fb-a585-c07724eff983" containerID="a690ace5386c481e1601f112f0f1f1ae547136eb7e9c65c44c37804125c2aa3e" exitCode=0 Mar 20 13:46:37 crc kubenswrapper[4973]: I0320 13:46:37.549109 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Mar 20 13:46:37 crc kubenswrapper[4973]: I0320 13:46:37.549560 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"49ac7635-4e3b-47fb-a585-c07724eff983","Type":"ContainerDied","Data":"ca1b80ec5b60dcbff59f89aba489d788d387a3138452d2de099b9d2e20f12b3c"} Mar 20 13:46:37 crc kubenswrapper[4973]: I0320 13:46:37.549615 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"49ac7635-4e3b-47fb-a585-c07724eff983","Type":"ContainerDied","Data":"a690ace5386c481e1601f112f0f1f1ae547136eb7e9c65c44c37804125c2aa3e"} Mar 20 13:46:37 crc kubenswrapper[4973]: I0320 13:46:37.587691 4973 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="9de4b166-9cce-4044-ae28-6583535ce40d" podUID="5e16c419-72e5-4d2c-bda0-3a0f6ec97aac" Mar 20 13:46:37 crc kubenswrapper[4973]: I0320 13:46:37.883392 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:46:37 crc kubenswrapper[4973]: I0320 13:46:37.994209 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9de4b166-9cce-4044-ae28-6583535ce40d" path="/var/lib/kubelet/pods/9de4b166-9cce-4044-ae28-6583535ce40d/volumes" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.031770 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49ac7635-4e3b-47fb-a585-c07724eff983-etc-machine-id\") pod \"49ac7635-4e3b-47fb-a585-c07724eff983\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.031829 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-scripts\") pod \"49ac7635-4e3b-47fb-a585-c07724eff983\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.031992 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-config-data-custom\") pod \"49ac7635-4e3b-47fb-a585-c07724eff983\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.032040 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75ccb\" (UniqueName: \"kubernetes.io/projected/49ac7635-4e3b-47fb-a585-c07724eff983-kube-api-access-75ccb\") pod \"49ac7635-4e3b-47fb-a585-c07724eff983\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.032117 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-config-data\") pod 
\"49ac7635-4e3b-47fb-a585-c07724eff983\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.032222 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-combined-ca-bundle\") pod \"49ac7635-4e3b-47fb-a585-c07724eff983\" (UID: \"49ac7635-4e3b-47fb-a585-c07724eff983\") " Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.035814 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49ac7635-4e3b-47fb-a585-c07724eff983-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "49ac7635-4e3b-47fb-a585-c07724eff983" (UID: "49ac7635-4e3b-47fb-a585-c07724eff983"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.048229 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-scripts" (OuterVolumeSpecName: "scripts") pod "49ac7635-4e3b-47fb-a585-c07724eff983" (UID: "49ac7635-4e3b-47fb-a585-c07724eff983"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.052710 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ac7635-4e3b-47fb-a585-c07724eff983-kube-api-access-75ccb" (OuterVolumeSpecName: "kube-api-access-75ccb") pod "49ac7635-4e3b-47fb-a585-c07724eff983" (UID: "49ac7635-4e3b-47fb-a585-c07724eff983"). InnerVolumeSpecName "kube-api-access-75ccb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.072611 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "49ac7635-4e3b-47fb-a585-c07724eff983" (UID: "49ac7635-4e3b-47fb-a585-c07724eff983"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.136866 4973 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.136906 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75ccb\" (UniqueName: \"kubernetes.io/projected/49ac7635-4e3b-47fb-a585-c07724eff983-kube-api-access-75ccb\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.136921 4973 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/49ac7635-4e3b-47fb-a585-c07724eff983-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.136932 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.194894 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49ac7635-4e3b-47fb-a585-c07724eff983" (UID: "49ac7635-4e3b-47fb-a585-c07724eff983"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.239221 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.305269 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-config-data" (OuterVolumeSpecName: "config-data") pod "49ac7635-4e3b-47fb-a585-c07724eff983" (UID: "49ac7635-4e3b-47fb-a585-c07724eff983"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.342081 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49ac7635-4e3b-47fb-a585-c07724eff983-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.562021 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"49ac7635-4e3b-47fb-a585-c07724eff983","Type":"ContainerDied","Data":"fb95917f926dabf345eb17033cf0cb02e36daa4f6748c05bde032e273fa010b9"} Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.562317 4973 scope.go:117] "RemoveContainer" containerID="ca1b80ec5b60dcbff59f89aba489d788d387a3138452d2de099b9d2e20f12b3c" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.562511 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.618954 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.633486 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.652560 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:46:38 crc kubenswrapper[4973]: E0320 13:46:38.653147 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ac7635-4e3b-47fb-a585-c07724eff983" containerName="cinder-scheduler" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.653163 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ac7635-4e3b-47fb-a585-c07724eff983" containerName="cinder-scheduler" Mar 20 13:46:38 crc kubenswrapper[4973]: E0320 13:46:38.653201 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ac7635-4e3b-47fb-a585-c07724eff983" containerName="probe" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.653208 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ac7635-4e3b-47fb-a585-c07724eff983" containerName="probe" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.653676 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ac7635-4e3b-47fb-a585-c07724eff983" containerName="cinder-scheduler" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.653707 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ac7635-4e3b-47fb-a585-c07724eff983" containerName="probe" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.655284 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.655677 4973 scope.go:117] "RemoveContainer" containerID="a690ace5386c481e1601f112f0f1f1ae547136eb7e9c65c44c37804125c2aa3e" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.670645 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.673996 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.761403 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ca3c33-8a98-4ce8-8cb7-06c855d090ac-config-data\") pod \"cinder-scheduler-0\" (UID: \"95ca3c33-8a98-4ce8-8cb7-06c855d090ac\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.761470 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95ca3c33-8a98-4ce8-8cb7-06c855d090ac-scripts\") pod \"cinder-scheduler-0\" (UID: \"95ca3c33-8a98-4ce8-8cb7-06c855d090ac\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.761503 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ca3c33-8a98-4ce8-8cb7-06c855d090ac-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"95ca3c33-8a98-4ce8-8cb7-06c855d090ac\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.761567 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95ca3c33-8a98-4ce8-8cb7-06c855d090ac-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"95ca3c33-8a98-4ce8-8cb7-06c855d090ac\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.761605 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95ca3c33-8a98-4ce8-8cb7-06c855d090ac-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"95ca3c33-8a98-4ce8-8cb7-06c855d090ac\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.761691 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsmt6\" (UniqueName: \"kubernetes.io/projected/95ca3c33-8a98-4ce8-8cb7-06c855d090ac-kube-api-access-jsmt6\") pod \"cinder-scheduler-0\" (UID: \"95ca3c33-8a98-4ce8-8cb7-06c855d090ac\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.864872 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ca3c33-8a98-4ce8-8cb7-06c855d090ac-config-data\") pod \"cinder-scheduler-0\" (UID: \"95ca3c33-8a98-4ce8-8cb7-06c855d090ac\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.865192 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95ca3c33-8a98-4ce8-8cb7-06c855d090ac-scripts\") pod \"cinder-scheduler-0\" (UID: \"95ca3c33-8a98-4ce8-8cb7-06c855d090ac\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.865253 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ca3c33-8a98-4ce8-8cb7-06c855d090ac-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"95ca3c33-8a98-4ce8-8cb7-06c855d090ac\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:38 
crc kubenswrapper[4973]: I0320 13:46:38.865502 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95ca3c33-8a98-4ce8-8cb7-06c855d090ac-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"95ca3c33-8a98-4ce8-8cb7-06c855d090ac\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.865583 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95ca3c33-8a98-4ce8-8cb7-06c855d090ac-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"95ca3c33-8a98-4ce8-8cb7-06c855d090ac\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.866024 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsmt6\" (UniqueName: \"kubernetes.io/projected/95ca3c33-8a98-4ce8-8cb7-06c855d090ac-kube-api-access-jsmt6\") pod \"cinder-scheduler-0\" (UID: \"95ca3c33-8a98-4ce8-8cb7-06c855d090ac\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.866501 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/95ca3c33-8a98-4ce8-8cb7-06c855d090ac-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"95ca3c33-8a98-4ce8-8cb7-06c855d090ac\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.871257 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95ca3c33-8a98-4ce8-8cb7-06c855d090ac-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"95ca3c33-8a98-4ce8-8cb7-06c855d090ac\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.893970 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/95ca3c33-8a98-4ce8-8cb7-06c855d090ac-scripts\") pod \"cinder-scheduler-0\" (UID: \"95ca3c33-8a98-4ce8-8cb7-06c855d090ac\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.897969 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsmt6\" (UniqueName: \"kubernetes.io/projected/95ca3c33-8a98-4ce8-8cb7-06c855d090ac-kube-api-access-jsmt6\") pod \"cinder-scheduler-0\" (UID: \"95ca3c33-8a98-4ce8-8cb7-06c855d090ac\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.912633 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95ca3c33-8a98-4ce8-8cb7-06c855d090ac-config-data\") pod \"cinder-scheduler-0\" (UID: \"95ca3c33-8a98-4ce8-8cb7-06c855d090ac\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:38 crc kubenswrapper[4973]: I0320 13:46:38.916907 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/95ca3c33-8a98-4ce8-8cb7-06c855d090ac-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"95ca3c33-8a98-4ce8-8cb7-06c855d090ac\") " pod="openstack/cinder-scheduler-0" Mar 20 13:46:39 crc kubenswrapper[4973]: I0320 13:46:39.096045 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 20 13:46:39 crc kubenswrapper[4973]: I0320 13:46:39.741911 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 20 13:46:39 crc kubenswrapper[4973]: I0320 13:46:39.971937 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ac7635-4e3b-47fb-a585-c07724eff983" path="/var/lib/kubelet/pods/49ac7635-4e3b-47fb-a585-c07724eff983/volumes" Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.455305 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5589965cd6-qwjps" Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.566539 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="07dbf52e-f333-4466-8e56-b44815529a05" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.211:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.670941 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"95ca3c33-8a98-4ce8-8cb7-06c855d090ac","Type":"ContainerStarted","Data":"89a13df3cbfa7988456e2a331bc246f63ff1879833456d09abefa4cf965a7e73"} Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.679394 4973 generic.go:334] "Generic (PLEG): container finished" podID="260048b8-b24c-48f1-bfb4-12b8936a1249" containerID="510b8b9ee83268c66c5d91a566cdb00d5c413d265f5a1a78c47471f704b4d089" exitCode=0 Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.679436 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d746875b8-pt6tm" event={"ID":"260048b8-b24c-48f1-bfb4-12b8936a1249","Type":"ContainerDied","Data":"510b8b9ee83268c66c5d91a566cdb00d5c413d265f5a1a78c47471f704b4d089"} Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.684518 4973 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.743078 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-internal-tls-certs\") pod \"260048b8-b24c-48f1-bfb4-12b8936a1249\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.743152 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/260048b8-b24c-48f1-bfb4-12b8936a1249-logs\") pod \"260048b8-b24c-48f1-bfb4-12b8936a1249\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.743323 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llnq7\" (UniqueName: \"kubernetes.io/projected/260048b8-b24c-48f1-bfb4-12b8936a1249-kube-api-access-llnq7\") pod \"260048b8-b24c-48f1-bfb4-12b8936a1249\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.743408 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-config-data\") pod \"260048b8-b24c-48f1-bfb4-12b8936a1249\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.743448 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-public-tls-certs\") pod \"260048b8-b24c-48f1-bfb4-12b8936a1249\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.743526 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-combined-ca-bundle\") pod \"260048b8-b24c-48f1-bfb4-12b8936a1249\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.743605 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-scripts\") pod \"260048b8-b24c-48f1-bfb4-12b8936a1249\" (UID: \"260048b8-b24c-48f1-bfb4-12b8936a1249\") " Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.748964 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/260048b8-b24c-48f1-bfb4-12b8936a1249-logs" (OuterVolumeSpecName: "logs") pod "260048b8-b24c-48f1-bfb4-12b8936a1249" (UID: "260048b8-b24c-48f1-bfb4-12b8936a1249"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.756594 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/260048b8-b24c-48f1-bfb4-12b8936a1249-kube-api-access-llnq7" (OuterVolumeSpecName: "kube-api-access-llnq7") pod "260048b8-b24c-48f1-bfb4-12b8936a1249" (UID: "260048b8-b24c-48f1-bfb4-12b8936a1249"). InnerVolumeSpecName "kube-api-access-llnq7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.780608 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-scripts" (OuterVolumeSpecName: "scripts") pod "260048b8-b24c-48f1-bfb4-12b8936a1249" (UID: "260048b8-b24c-48f1-bfb4-12b8936a1249"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.837522 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "260048b8-b24c-48f1-bfb4-12b8936a1249" (UID: "260048b8-b24c-48f1-bfb4-12b8936a1249"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.849219 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.849263 4973 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/260048b8-b24c-48f1-bfb4-12b8936a1249-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.849276 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llnq7\" (UniqueName: \"kubernetes.io/projected/260048b8-b24c-48f1-bfb4-12b8936a1249-kube-api-access-llnq7\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.849293 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.864934 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5589965cd6-qwjps" Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.869463 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-config-data" (OuterVolumeSpecName: "config-data") pod 
"260048b8-b24c-48f1-bfb4-12b8936a1249" (UID: "260048b8-b24c-48f1-bfb4-12b8936a1249"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.961132 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.972561 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56d7b66f8b-74fsz"] Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.972826 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56d7b66f8b-74fsz" podUID="e9f37ccd-dcfc-472e-be04-7273945fa9ef" containerName="barbican-api-log" containerID="cri-o://b82fd517e27a0b42cb6ca56a801db45b8058a59727b63c73a5d6114c35bc3a57" gracePeriod=30 Mar 20 13:46:40 crc kubenswrapper[4973]: I0320 13:46:40.973409 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-56d7b66f8b-74fsz" podUID="e9f37ccd-dcfc-472e-be04-7273945fa9ef" containerName="barbican-api" containerID="cri-o://b80e949024f1f3d04fb9e44ff5c878f8da2d5b4a8903ea06ca8e8c960cd03a87" gracePeriod=30 Mar 20 13:46:41 crc kubenswrapper[4973]: I0320 13:46:41.048512 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "260048b8-b24c-48f1-bfb4-12b8936a1249" (UID: "260048b8-b24c-48f1-bfb4-12b8936a1249"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:41 crc kubenswrapper[4973]: I0320 13:46:41.065171 4973 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:41 crc kubenswrapper[4973]: I0320 13:46:41.142720 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "260048b8-b24c-48f1-bfb4-12b8936a1249" (UID: "260048b8-b24c-48f1-bfb4-12b8936a1249"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:41 crc kubenswrapper[4973]: I0320 13:46:41.168293 4973 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/260048b8-b24c-48f1-bfb4-12b8936a1249-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:41 crc kubenswrapper[4973]: I0320 13:46:41.704111 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"95ca3c33-8a98-4ce8-8cb7-06c855d090ac","Type":"ContainerStarted","Data":"c4112903279386936de252a9123dcc64166a267f9121a2ed5268d620fab9ca7d"} Mar 20 13:46:41 crc kubenswrapper[4973]: I0320 13:46:41.708074 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d746875b8-pt6tm" event={"ID":"260048b8-b24c-48f1-bfb4-12b8936a1249","Type":"ContainerDied","Data":"cae84c01af21e313b77c50636744c9b797e8d4a2c0a8f4f0ff62120017397ed7"} Mar 20 13:46:41 crc kubenswrapper[4973]: I0320 13:46:41.708118 4973 scope.go:117] "RemoveContainer" containerID="510b8b9ee83268c66c5d91a566cdb00d5c413d265f5a1a78c47471f704b4d089" Mar 20 13:46:41 crc kubenswrapper[4973]: I0320 13:46:41.708249 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d746875b8-pt6tm" Mar 20 13:46:41 crc kubenswrapper[4973]: I0320 13:46:41.731024 4973 generic.go:334] "Generic (PLEG): container finished" podID="e9f37ccd-dcfc-472e-be04-7273945fa9ef" containerID="b82fd517e27a0b42cb6ca56a801db45b8058a59727b63c73a5d6114c35bc3a57" exitCode=143 Mar 20 13:46:41 crc kubenswrapper[4973]: I0320 13:46:41.731075 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56d7b66f8b-74fsz" event={"ID":"e9f37ccd-dcfc-472e-be04-7273945fa9ef","Type":"ContainerDied","Data":"b82fd517e27a0b42cb6ca56a801db45b8058a59727b63c73a5d6114c35bc3a57"} Mar 20 13:46:41 crc kubenswrapper[4973]: I0320 13:46:41.770904 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d746875b8-pt6tm"] Mar 20 13:46:41 crc kubenswrapper[4973]: I0320 13:46:41.789031 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d746875b8-pt6tm"] Mar 20 13:46:41 crc kubenswrapper[4973]: I0320 13:46:41.814264 4973 scope.go:117] "RemoveContainer" containerID="4d3077f09f571751f3f39f9330b353cc1cbc495f8ed9f9f2608e1a2965abc600" Mar 20 13:46:41 crc kubenswrapper[4973]: I0320 13:46:41.975702 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="260048b8-b24c-48f1-bfb4-12b8936a1249" path="/var/lib/kubelet/pods/260048b8-b24c-48f1-bfb4-12b8936a1249/volumes" Mar 20 13:46:42 crc kubenswrapper[4973]: I0320 13:46:42.748094 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"95ca3c33-8a98-4ce8-8cb7-06c855d090ac","Type":"ContainerStarted","Data":"526700246d66fe504f7618189b6f1a52298d8f33ad02d028abce928fc10b84bc"} Mar 20 13:46:42 crc kubenswrapper[4973]: I0320 13:46:42.783970 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.78394588 podStartE2EDuration="4.78394588s" podCreationTimestamp="2026-03-20 13:46:38 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:46:42.772880308 +0000 UTC m=+1523.516550052" watchObservedRunningTime="2026-03-20 13:46:42.78394588 +0000 UTC m=+1523.527615624" Mar 20 13:46:44 crc kubenswrapper[4973]: I0320 13:46:44.097829 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 13:46:44 crc kubenswrapper[4973]: I0320 13:46:44.225633 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56d7b66f8b-74fsz" podUID="e9f37ccd-dcfc-472e-be04-7273945fa9ef" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.206:9311/healthcheck\": read tcp 10.217.0.2:48968->10.217.0.206:9311: read: connection reset by peer" Mar 20 13:46:44 crc kubenswrapper[4973]: I0320 13:46:44.225980 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-56d7b66f8b-74fsz" podUID="e9f37ccd-dcfc-472e-be04-7273945fa9ef" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.206:9311/healthcheck\": read tcp 10.217.0.2:48964->10.217.0.206:9311: read: connection reset by peer" Mar 20 13:46:44 crc kubenswrapper[4973]: I0320 13:46:44.793436 4973 generic.go:334] "Generic (PLEG): container finished" podID="e9f37ccd-dcfc-472e-be04-7273945fa9ef" containerID="b80e949024f1f3d04fb9e44ff5c878f8da2d5b4a8903ea06ca8e8c960cd03a87" exitCode=0 Mar 20 13:46:44 crc kubenswrapper[4973]: I0320 13:46:44.793626 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56d7b66f8b-74fsz" event={"ID":"e9f37ccd-dcfc-472e-be04-7273945fa9ef","Type":"ContainerDied","Data":"b80e949024f1f3d04fb9e44ff5c878f8da2d5b4a8903ea06ca8e8c960cd03a87"} Mar 20 13:46:44 crc kubenswrapper[4973]: I0320 13:46:44.935805 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 20 13:46:45 crc 
kubenswrapper[4973]: I0320 13:46:45.039357 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.187574 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5v7q7\" (UniqueName: \"kubernetes.io/projected/e9f37ccd-dcfc-472e-be04-7273945fa9ef-kube-api-access-5v7q7\") pod \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.187673 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9f37ccd-dcfc-472e-be04-7273945fa9ef-config-data-custom\") pod \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.187788 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9f37ccd-dcfc-472e-be04-7273945fa9ef-combined-ca-bundle\") pod \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.187904 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9f37ccd-dcfc-472e-be04-7273945fa9ef-logs\") pod \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.187993 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9f37ccd-dcfc-472e-be04-7273945fa9ef-config-data\") pod \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\" (UID: \"e9f37ccd-dcfc-472e-be04-7273945fa9ef\") " Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.189418 
4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e9f37ccd-dcfc-472e-be04-7273945fa9ef-logs" (OuterVolumeSpecName: "logs") pod "e9f37ccd-dcfc-472e-be04-7273945fa9ef" (UID: "e9f37ccd-dcfc-472e-be04-7273945fa9ef"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.190802 4973 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e9f37ccd-dcfc-472e-be04-7273945fa9ef-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.210680 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e9f37ccd-dcfc-472e-be04-7273945fa9ef-kube-api-access-5v7q7" (OuterVolumeSpecName: "kube-api-access-5v7q7") pod "e9f37ccd-dcfc-472e-be04-7273945fa9ef" (UID: "e9f37ccd-dcfc-472e-be04-7273945fa9ef"). InnerVolumeSpecName "kube-api-access-5v7q7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.211093 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9f37ccd-dcfc-472e-be04-7273945fa9ef-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e9f37ccd-dcfc-472e-be04-7273945fa9ef" (UID: "e9f37ccd-dcfc-472e-be04-7273945fa9ef"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.247412 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9f37ccd-dcfc-472e-be04-7273945fa9ef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e9f37ccd-dcfc-472e-be04-7273945fa9ef" (UID: "e9f37ccd-dcfc-472e-be04-7273945fa9ef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.298740 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e9f37ccd-dcfc-472e-be04-7273945fa9ef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.298766 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5v7q7\" (UniqueName: \"kubernetes.io/projected/e9f37ccd-dcfc-472e-be04-7273945fa9ef-kube-api-access-5v7q7\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.298777 4973 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e9f37ccd-dcfc-472e-be04-7273945fa9ef-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.331578 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e9f37ccd-dcfc-472e-be04-7273945fa9ef-config-data" (OuterVolumeSpecName: "config-data") pod "e9f37ccd-dcfc-472e-be04-7273945fa9ef" (UID: "e9f37ccd-dcfc-472e-be04-7273945fa9ef"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.401604 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e9f37ccd-dcfc-472e-be04-7273945fa9ef-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.807143 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-56d7b66f8b-74fsz" event={"ID":"e9f37ccd-dcfc-472e-be04-7273945fa9ef","Type":"ContainerDied","Data":"3c06468fc5fa0f28653431f4d14d917d94365041c3b599139e9a95c48dd9e313"} Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.807198 4973 scope.go:117] "RemoveContainer" containerID="b80e949024f1f3d04fb9e44ff5c878f8da2d5b4a8903ea06ca8e8c960cd03a87" Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.807237 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-56d7b66f8b-74fsz" Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.869884 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-56d7b66f8b-74fsz"] Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.885060 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-56d7b66f8b-74fsz"] Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.888102 4973 scope.go:117] "RemoveContainer" containerID="b82fd517e27a0b42cb6ca56a801db45b8058a59727b63c73a5d6114c35bc3a57" Mar 20 13:46:45 crc kubenswrapper[4973]: I0320 13:46:45.983895 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9f37ccd-dcfc-472e-be04-7273945fa9ef" path="/var/lib/kubelet/pods/e9f37ccd-dcfc-472e-be04-7273945fa9ef/volumes" Mar 20 13:46:46 crc kubenswrapper[4973]: I0320 13:46:46.918042 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xq9kw" podUID="8448c2a8-df10-4a2f-a14f-c59318a71ceb" 
containerName="registry-server" probeResult="failure" output=< Mar 20 13:46:46 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 13:46:46 crc kubenswrapper[4973]: > Mar 20 13:46:47 crc kubenswrapper[4973]: I0320 13:46:47.268847 4973 scope.go:117] "RemoveContainer" containerID="e83600963fce5240b7bf8aef16bc7dfa70c017aa2cb389f72dff2641d37d5b18" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.134050 4973 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod6fec0901-00c6-410f-986c-4dcac4fe1359"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod6fec0901-00c6-410f-986c-4dcac4fe1359] : Timed out while waiting for systemd to remove kubepods-besteffort-pod6fec0901_00c6_410f_986c_4dcac4fe1359.slice" Mar 20 13:46:48 crc kubenswrapper[4973]: E0320 13:46:48.134102 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod6fec0901-00c6-410f-986c-4dcac4fe1359] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod6fec0901-00c6-410f-986c-4dcac4fe1359] : Timed out while waiting for systemd to remove kubepods-besteffort-pod6fec0901_00c6_410f_986c_4dcac4fe1359.slice" pod="openstack/cinder-db-sync-zzsmb" podUID="6fec0901-00c6-410f-986c-4dcac4fe1359" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.410657 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-849bfc9c67-jsf5j"] Mar 20 13:46:48 crc kubenswrapper[4973]: E0320 13:46:48.411233 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="260048b8-b24c-48f1-bfb4-12b8936a1249" containerName="placement-api" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.411252 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="260048b8-b24c-48f1-bfb4-12b8936a1249" containerName="placement-api" Mar 20 13:46:48 crc kubenswrapper[4973]: E0320 13:46:48.411269 4973 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="e9f37ccd-dcfc-472e-be04-7273945fa9ef" containerName="barbican-api" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.411277 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f37ccd-dcfc-472e-be04-7273945fa9ef" containerName="barbican-api" Mar 20 13:46:48 crc kubenswrapper[4973]: E0320 13:46:48.411307 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="260048b8-b24c-48f1-bfb4-12b8936a1249" containerName="placement-log" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.411315 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="260048b8-b24c-48f1-bfb4-12b8936a1249" containerName="placement-log" Mar 20 13:46:48 crc kubenswrapper[4973]: E0320 13:46:48.411475 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9f37ccd-dcfc-472e-be04-7273945fa9ef" containerName="barbican-api-log" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.411484 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9f37ccd-dcfc-472e-be04-7273945fa9ef" containerName="barbican-api-log" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.411772 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="260048b8-b24c-48f1-bfb4-12b8936a1249" containerName="placement-log" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.411790 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="260048b8-b24c-48f1-bfb4-12b8936a1249" containerName="placement-api" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.411806 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9f37ccd-dcfc-472e-be04-7273945fa9ef" containerName="barbican-api-log" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.411856 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9f37ccd-dcfc-472e-be04-7273945fa9ef" containerName="barbican-api" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.413411 4973 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openstack/swift-proxy-849bfc9c67-jsf5j" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.415370 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.416939 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.421152 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.435957 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-849bfc9c67-jsf5j"] Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.497121 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28f1bcfd-788c-47fa-a462-cd5068ec34d2-public-tls-certs\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.497543 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t87sn\" (UniqueName: \"kubernetes.io/projected/28f1bcfd-788c-47fa-a462-cd5068ec34d2-kube-api-access-t87sn\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.497607 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28f1bcfd-788c-47fa-a462-cd5068ec34d2-internal-tls-certs\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j" 
Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.497642 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f1bcfd-788c-47fa-a462-cd5068ec34d2-config-data\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.497826 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28f1bcfd-788c-47fa-a462-cd5068ec34d2-log-httpd\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.498001 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28f1bcfd-788c-47fa-a462-cd5068ec34d2-run-httpd\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.498130 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/28f1bcfd-788c-47fa-a462-cd5068ec34d2-etc-swift\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.498284 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f1bcfd-788c-47fa-a462-cd5068ec34d2-combined-ca-bundle\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " 
pod="openstack/swift-proxy-849bfc9c67-jsf5j" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.599968 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28f1bcfd-788c-47fa-a462-cd5068ec34d2-public-tls-certs\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.600037 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t87sn\" (UniqueName: \"kubernetes.io/projected/28f1bcfd-788c-47fa-a462-cd5068ec34d2-kube-api-access-t87sn\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.600086 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28f1bcfd-788c-47fa-a462-cd5068ec34d2-internal-tls-certs\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.600113 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f1bcfd-788c-47fa-a462-cd5068ec34d2-config-data\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j" Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.600247 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28f1bcfd-788c-47fa-a462-cd5068ec34d2-log-httpd\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j" Mar 20 
13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.600404 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28f1bcfd-788c-47fa-a462-cd5068ec34d2-run-httpd\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j"
Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.600522 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/28f1bcfd-788c-47fa-a462-cd5068ec34d2-etc-swift\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j"
Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.600663 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f1bcfd-788c-47fa-a462-cd5068ec34d2-combined-ca-bundle\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j"
Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.613150 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/28f1bcfd-788c-47fa-a462-cd5068ec34d2-config-data\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j"
Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.614128 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/28f1bcfd-788c-47fa-a462-cd5068ec34d2-combined-ca-bundle\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j"
Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.614520 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28f1bcfd-788c-47fa-a462-cd5068ec34d2-log-httpd\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j"
Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.614767 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/28f1bcfd-788c-47fa-a462-cd5068ec34d2-run-httpd\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j"
Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.622059 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/28f1bcfd-788c-47fa-a462-cd5068ec34d2-internal-tls-certs\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j"
Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.634265 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/28f1bcfd-788c-47fa-a462-cd5068ec34d2-public-tls-certs\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j"
Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.639971 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/28f1bcfd-788c-47fa-a462-cd5068ec34d2-etc-swift\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j"
Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.663270 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t87sn\" (UniqueName: \"kubernetes.io/projected/28f1bcfd-788c-47fa-a462-cd5068ec34d2-kube-api-access-t87sn\") pod \"swift-proxy-849bfc9c67-jsf5j\" (UID: \"28f1bcfd-788c-47fa-a462-cd5068ec34d2\") " pod="openstack/swift-proxy-849bfc9c67-jsf5j"
Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.743516 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-849bfc9c67-jsf5j"
Mar 20 13:46:48 crc kubenswrapper[4973]: I0320 13:46:48.889471 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-zzsmb"
Mar 20 13:46:49 crc kubenswrapper[4973]: I0320 13:46:49.285627 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 20 13:46:49 crc kubenswrapper[4973]: I0320 13:46:49.491067 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0"
Mar 20 13:46:49 crc kubenswrapper[4973]: I0320 13:46:49.865268 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:46:49 crc kubenswrapper[4973]: I0320 13:46:49.900242 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" containerName="ceilometer-central-agent" containerID="cri-o://131d15b9a05a258c4c680a989df302321e53c7be62c04536bb7336f954905cc5" gracePeriod=30
Mar 20 13:46:49 crc kubenswrapper[4973]: I0320 13:46:49.900691 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" containerName="proxy-httpd" containerID="cri-o://81f67f519f04127b155b4be39fa9edd64bf13174df55ce6e45c34ac8abb8482b" gracePeriod=30
Mar 20 13:46:49 crc kubenswrapper[4973]: I0320 13:46:49.900845 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" containerName="sg-core" containerID="cri-o://d73daea7a5bcd0d9a89125476231ffd70c496368c325a05bfa846fdbdefd8217" gracePeriod=30
Mar 20 13:46:49 crc kubenswrapper[4973]: I0320 13:46:49.900903 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" containerName="ceilometer-notification-agent" containerID="cri-o://8dc127458f2a429cd82307b7a3a2d526590507915e3dd783c3131ec53c85c887" gracePeriod=30
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.446915 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-77cf5858cd-9xwk6"
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.763621 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-xnwtw"]
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.765684 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xnwtw"
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.778898 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xnwtw"]
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.859878 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-p2xzh"]
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.862412 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-p2xzh"
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.873993 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-p2xzh"]
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.905483 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh2xs\" (UniqueName: \"kubernetes.io/projected/abeee3f9-2831-465f-8c3e-9853954f7087-kube-api-access-vh2xs\") pod \"nova-api-db-create-xnwtw\" (UID: \"abeee3f9-2831-465f-8c3e-9853954f7087\") " pod="openstack/nova-api-db-create-xnwtw"
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.905789 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abeee3f9-2831-465f-8c3e-9853954f7087-operator-scripts\") pod \"nova-api-db-create-xnwtw\" (UID: \"abeee3f9-2831-465f-8c3e-9853954f7087\") " pod="openstack/nova-api-db-create-xnwtw"
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.928598 4973 generic.go:334] "Generic (PLEG): container finished" podID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" containerID="81f67f519f04127b155b4be39fa9edd64bf13174df55ce6e45c34ac8abb8482b" exitCode=0
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.929222 4973 generic.go:334] "Generic (PLEG): container finished" podID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" containerID="d73daea7a5bcd0d9a89125476231ffd70c496368c325a05bfa846fdbdefd8217" exitCode=2
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.929447 4973 generic.go:334] "Generic (PLEG): container finished" podID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" containerID="8dc127458f2a429cd82307b7a3a2d526590507915e3dd783c3131ec53c85c887" exitCode=0
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.929554 4973 generic.go:334] "Generic (PLEG): container finished" podID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" containerID="131d15b9a05a258c4c680a989df302321e53c7be62c04536bb7336f954905cc5" exitCode=0
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.928709 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d355b77-7cc9-4184-a223-2bc448d9ca1b","Type":"ContainerDied","Data":"81f67f519f04127b155b4be39fa9edd64bf13174df55ce6e45c34ac8abb8482b"}
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.929804 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d355b77-7cc9-4184-a223-2bc448d9ca1b","Type":"ContainerDied","Data":"d73daea7a5bcd0d9a89125476231ffd70c496368c325a05bfa846fdbdefd8217"}
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.929948 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d355b77-7cc9-4184-a223-2bc448d9ca1b","Type":"ContainerDied","Data":"8dc127458f2a429cd82307b7a3a2d526590507915e3dd783c3131ec53c85c887"}
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.930063 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d355b77-7cc9-4184-a223-2bc448d9ca1b","Type":"ContainerDied","Data":"131d15b9a05a258c4c680a989df302321e53c7be62c04536bb7336f954905cc5"}
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.974039 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-4451-account-create-update-pddz5"]
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.978547 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4451-account-create-update-pddz5"]
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.978624 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4451-account-create-update-pddz5"
Mar 20 13:46:50 crc kubenswrapper[4973]: I0320 13:46:50.980934 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.008969 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh2xs\" (UniqueName: \"kubernetes.io/projected/abeee3f9-2831-465f-8c3e-9853954f7087-kube-api-access-vh2xs\") pod \"nova-api-db-create-xnwtw\" (UID: \"abeee3f9-2831-465f-8c3e-9853954f7087\") " pod="openstack/nova-api-db-create-xnwtw"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.009022 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bf6849d-dd2b-4a3e-be4f-1b00c1826000-operator-scripts\") pod \"nova-cell0-db-create-p2xzh\" (UID: \"7bf6849d-dd2b-4a3e-be4f-1b00c1826000\") " pod="openstack/nova-cell0-db-create-p2xzh"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.009212 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-44rfq\" (UniqueName: \"kubernetes.io/projected/7bf6849d-dd2b-4a3e-be4f-1b00c1826000-kube-api-access-44rfq\") pod \"nova-cell0-db-create-p2xzh\" (UID: \"7bf6849d-dd2b-4a3e-be4f-1b00c1826000\") " pod="openstack/nova-cell0-db-create-p2xzh"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.009318 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abeee3f9-2831-465f-8c3e-9853954f7087-operator-scripts\") pod \"nova-api-db-create-xnwtw\" (UID: \"abeee3f9-2831-465f-8c3e-9853954f7087\") " pod="openstack/nova-api-db-create-xnwtw"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.010070 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abeee3f9-2831-465f-8c3e-9853954f7087-operator-scripts\") pod \"nova-api-db-create-xnwtw\" (UID: \"abeee3f9-2831-465f-8c3e-9853954f7087\") " pod="openstack/nova-api-db-create-xnwtw"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.044914 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh2xs\" (UniqueName: \"kubernetes.io/projected/abeee3f9-2831-465f-8c3e-9853954f7087-kube-api-access-vh2xs\") pod \"nova-api-db-create-xnwtw\" (UID: \"abeee3f9-2831-465f-8c3e-9853954f7087\") " pod="openstack/nova-api-db-create-xnwtw"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.093708 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xnwtw"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.096904 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-p4rbx"]
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.099535 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-p4rbx"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.112140 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73458433-b92e-4758-84bd-7aba2f23e1e0-operator-scripts\") pod \"nova-api-4451-account-create-update-pddz5\" (UID: \"73458433-b92e-4758-84bd-7aba2f23e1e0\") " pod="openstack/nova-api-4451-account-create-update-pddz5"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.112217 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bf6849d-dd2b-4a3e-be4f-1b00c1826000-operator-scripts\") pod \"nova-cell0-db-create-p2xzh\" (UID: \"7bf6849d-dd2b-4a3e-be4f-1b00c1826000\") " pod="openstack/nova-cell0-db-create-p2xzh"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.112262 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zk9f7\" (UniqueName: \"kubernetes.io/projected/73458433-b92e-4758-84bd-7aba2f23e1e0-kube-api-access-zk9f7\") pod \"nova-api-4451-account-create-update-pddz5\" (UID: \"73458433-b92e-4758-84bd-7aba2f23e1e0\") " pod="openstack/nova-api-4451-account-create-update-pddz5"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.112389 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-44rfq\" (UniqueName: \"kubernetes.io/projected/7bf6849d-dd2b-4a3e-be4f-1b00c1826000-kube-api-access-44rfq\") pod \"nova-cell0-db-create-p2xzh\" (UID: \"7bf6849d-dd2b-4a3e-be4f-1b00c1826000\") " pod="openstack/nova-cell0-db-create-p2xzh"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.115699 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bf6849d-dd2b-4a3e-be4f-1b00c1826000-operator-scripts\") pod \"nova-cell0-db-create-p2xzh\" (UID: \"7bf6849d-dd2b-4a3e-be4f-1b00c1826000\") " pod="openstack/nova-cell0-db-create-p2xzh"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.116845 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-p4rbx"]
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.159939 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-44rfq\" (UniqueName: \"kubernetes.io/projected/7bf6849d-dd2b-4a3e-be4f-1b00c1826000-kube-api-access-44rfq\") pod \"nova-cell0-db-create-p2xzh\" (UID: \"7bf6849d-dd2b-4a3e-be4f-1b00c1826000\") " pod="openstack/nova-cell0-db-create-p2xzh"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.197124 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-p2xzh"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.218739 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-eea5-account-create-update-jjhm6"]
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.220328 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-eea5-account-create-update-jjhm6"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.238657 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.250744 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73458433-b92e-4758-84bd-7aba2f23e1e0-operator-scripts\") pod \"nova-api-4451-account-create-update-pddz5\" (UID: \"73458433-b92e-4758-84bd-7aba2f23e1e0\") " pod="openstack/nova-api-4451-account-create-update-pddz5"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.251133 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zk9f7\" (UniqueName: \"kubernetes.io/projected/73458433-b92e-4758-84bd-7aba2f23e1e0-kube-api-access-zk9f7\") pod \"nova-api-4451-account-create-update-pddz5\" (UID: \"73458433-b92e-4758-84bd-7aba2f23e1e0\") " pod="openstack/nova-api-4451-account-create-update-pddz5"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.251526 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shrgd\" (UniqueName: \"kubernetes.io/projected/3da0c070-14fe-41f9-9b97-f76831f43dbc-kube-api-access-shrgd\") pod \"nova-cell1-db-create-p4rbx\" (UID: \"3da0c070-14fe-41f9-9b97-f76831f43dbc\") " pod="openstack/nova-cell1-db-create-p4rbx"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.251657 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3da0c070-14fe-41f9-9b97-f76831f43dbc-operator-scripts\") pod \"nova-cell1-db-create-p4rbx\" (UID: \"3da0c070-14fe-41f9-9b97-f76831f43dbc\") " pod="openstack/nova-cell1-db-create-p4rbx"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.251932 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73458433-b92e-4758-84bd-7aba2f23e1e0-operator-scripts\") pod \"nova-api-4451-account-create-update-pddz5\" (UID: \"73458433-b92e-4758-84bd-7aba2f23e1e0\") " pod="openstack/nova-api-4451-account-create-update-pddz5"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.271071 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-eea5-account-create-update-jjhm6"]
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.322886 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zk9f7\" (UniqueName: \"kubernetes.io/projected/73458433-b92e-4758-84bd-7aba2f23e1e0-kube-api-access-zk9f7\") pod \"nova-api-4451-account-create-update-pddz5\" (UID: \"73458433-b92e-4758-84bd-7aba2f23e1e0\") " pod="openstack/nova-api-4451-account-create-update-pddz5"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.361947 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bb957cd-c6f9-4602-b4a2-56834928aabb-operator-scripts\") pod \"nova-cell0-eea5-account-create-update-jjhm6\" (UID: \"6bb957cd-c6f9-4602-b4a2-56834928aabb\") " pod="openstack/nova-cell0-eea5-account-create-update-jjhm6"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.362027 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shrgd\" (UniqueName: \"kubernetes.io/projected/3da0c070-14fe-41f9-9b97-f76831f43dbc-kube-api-access-shrgd\") pod \"nova-cell1-db-create-p4rbx\" (UID: \"3da0c070-14fe-41f9-9b97-f76831f43dbc\") " pod="openstack/nova-cell1-db-create-p4rbx"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.362059 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3da0c070-14fe-41f9-9b97-f76831f43dbc-operator-scripts\") pod \"nova-cell1-db-create-p4rbx\" (UID: \"3da0c070-14fe-41f9-9b97-f76831f43dbc\") " pod="openstack/nova-cell1-db-create-p4rbx"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.362104 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zswnd\" (UniqueName: \"kubernetes.io/projected/6bb957cd-c6f9-4602-b4a2-56834928aabb-kube-api-access-zswnd\") pod \"nova-cell0-eea5-account-create-update-jjhm6\" (UID: \"6bb957cd-c6f9-4602-b4a2-56834928aabb\") " pod="openstack/nova-cell0-eea5-account-create-update-jjhm6"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.363148 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3da0c070-14fe-41f9-9b97-f76831f43dbc-operator-scripts\") pod \"nova-cell1-db-create-p4rbx\" (UID: \"3da0c070-14fe-41f9-9b97-f76831f43dbc\") " pod="openstack/nova-cell1-db-create-p4rbx"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.410018 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shrgd\" (UniqueName: \"kubernetes.io/projected/3da0c070-14fe-41f9-9b97-f76831f43dbc-kube-api-access-shrgd\") pod \"nova-cell1-db-create-p4rbx\" (UID: \"3da0c070-14fe-41f9-9b97-f76831f43dbc\") " pod="openstack/nova-cell1-db-create-p4rbx"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.462222 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-p4rbx"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.464027 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bb957cd-c6f9-4602-b4a2-56834928aabb-operator-scripts\") pod \"nova-cell0-eea5-account-create-update-jjhm6\" (UID: \"6bb957cd-c6f9-4602-b4a2-56834928aabb\") " pod="openstack/nova-cell0-eea5-account-create-update-jjhm6"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.464103 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zswnd\" (UniqueName: \"kubernetes.io/projected/6bb957cd-c6f9-4602-b4a2-56834928aabb-kube-api-access-zswnd\") pod \"nova-cell0-eea5-account-create-update-jjhm6\" (UID: \"6bb957cd-c6f9-4602-b4a2-56834928aabb\") " pod="openstack/nova-cell0-eea5-account-create-update-jjhm6"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.465177 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bb957cd-c6f9-4602-b4a2-56834928aabb-operator-scripts\") pod \"nova-cell0-eea5-account-create-update-jjhm6\" (UID: \"6bb957cd-c6f9-4602-b4a2-56834928aabb\") " pod="openstack/nova-cell0-eea5-account-create-update-jjhm6"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.490705 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-1a9f-account-create-update-4nplb"]
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.506031 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1a9f-account-create-update-4nplb"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.511748 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.520909 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zswnd\" (UniqueName: \"kubernetes.io/projected/6bb957cd-c6f9-4602-b4a2-56834928aabb-kube-api-access-zswnd\") pod \"nova-cell0-eea5-account-create-update-jjhm6\" (UID: \"6bb957cd-c6f9-4602-b4a2-56834928aabb\") " pod="openstack/nova-cell0-eea5-account-create-update-jjhm6"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.539074 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1a9f-account-create-update-4nplb"]
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.566020 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkqz9\" (UniqueName: \"kubernetes.io/projected/cd964d5b-2edc-41ea-8dc9-3f6e71d9da32-kube-api-access-bkqz9\") pod \"nova-cell1-1a9f-account-create-update-4nplb\" (UID: \"cd964d5b-2edc-41ea-8dc9-3f6e71d9da32\") " pod="openstack/nova-cell1-1a9f-account-create-update-4nplb"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.566321 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd964d5b-2edc-41ea-8dc9-3f6e71d9da32-operator-scripts\") pod \"nova-cell1-1a9f-account-create-update-4nplb\" (UID: \"cd964d5b-2edc-41ea-8dc9-3f6e71d9da32\") " pod="openstack/nova-cell1-1a9f-account-create-update-4nplb"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.579960 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-eea5-account-create-update-jjhm6"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.619391 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4451-account-create-update-pddz5"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.668386 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkqz9\" (UniqueName: \"kubernetes.io/projected/cd964d5b-2edc-41ea-8dc9-3f6e71d9da32-kube-api-access-bkqz9\") pod \"nova-cell1-1a9f-account-create-update-4nplb\" (UID: \"cd964d5b-2edc-41ea-8dc9-3f6e71d9da32\") " pod="openstack/nova-cell1-1a9f-account-create-update-4nplb"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.668441 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd964d5b-2edc-41ea-8dc9-3f6e71d9da32-operator-scripts\") pod \"nova-cell1-1a9f-account-create-update-4nplb\" (UID: \"cd964d5b-2edc-41ea-8dc9-3f6e71d9da32\") " pod="openstack/nova-cell1-1a9f-account-create-update-4nplb"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.669238 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd964d5b-2edc-41ea-8dc9-3f6e71d9da32-operator-scripts\") pod \"nova-cell1-1a9f-account-create-update-4nplb\" (UID: \"cd964d5b-2edc-41ea-8dc9-3f6e71d9da32\") " pod="openstack/nova-cell1-1a9f-account-create-update-4nplb"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.701205 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkqz9\" (UniqueName: \"kubernetes.io/projected/cd964d5b-2edc-41ea-8dc9-3f6e71d9da32-kube-api-access-bkqz9\") pod \"nova-cell1-1a9f-account-create-update-4nplb\" (UID: \"cd964d5b-2edc-41ea-8dc9-3f6e71d9da32\") " pod="openstack/nova-cell1-1a9f-account-create-update-4nplb"
Mar 20 13:46:51 crc kubenswrapper[4973]: I0320 13:46:51.895887 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1a9f-account-create-update-4nplb"
Mar 20 13:46:55 crc kubenswrapper[4973]: I0320 13:46:55.477879 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="07dbf52e-f333-4466-8e56-b44815529a05" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.211:8776/healthcheck\": dial tcp 10.217.0.211:8776: connect: connection refused"
Mar 20 13:46:55 crc kubenswrapper[4973]: I0320 13:46:55.894024 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xq9kw"
Mar 20 13:46:55 crc kubenswrapper[4973]: I0320 13:46:55.974653 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xq9kw"
Mar 20 13:46:56 crc kubenswrapper[4973]: I0320 13:46:56.006624 4973 generic.go:334] "Generic (PLEG): container finished" podID="07dbf52e-f333-4466-8e56-b44815529a05" containerID="55f829e539a8e500c99341aa1e054b0d9f26dde9c8e12cf45a1698d4ce9f2a30" exitCode=137
Mar 20 13:46:56 crc kubenswrapper[4973]: I0320 13:46:56.008502 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"07dbf52e-f333-4466-8e56-b44815529a05","Type":"ContainerDied","Data":"55f829e539a8e500c99341aa1e054b0d9f26dde9c8e12cf45a1698d4ce9f2a30"}
Mar 20 13:46:56 crc kubenswrapper[4973]: I0320 13:46:56.353903 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xq9kw"]
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.027412 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xq9kw" podUID="8448c2a8-df10-4a2f-a14f-c59318a71ceb" containerName="registry-server" containerID="cri-o://822ccc28a7ff7901b030a3684f7318dfb904c5e0eb49d975499614fe8d732270" gracePeriod=2
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.027609 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0d355b77-7cc9-4184-a223-2bc448d9ca1b","Type":"ContainerDied","Data":"c15cdb219151792214f1399a96a67096cb49c4ad006083863cd2c26f27d3ad39"}
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.027728 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c15cdb219151792214f1399a96a67096cb49c4ad006083863cd2c26f27d3ad39"
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.185691 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.193835 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-5874c8d58f-l5f6s"
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.221414 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-sg-core-conf-yaml\") pod \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") "
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.221621 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-config-data\") pod \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") "
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.221656 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d355b77-7cc9-4184-a223-2bc448d9ca1b-log-httpd\") pod \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") "
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.221751 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxf7c\" (UniqueName: \"kubernetes.io/projected/0d355b77-7cc9-4184-a223-2bc448d9ca1b-kube-api-access-wxf7c\") pod \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") "
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.221837 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-scripts\") pod \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") "
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.221880 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-combined-ca-bundle\") pod \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") "
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.221951 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d355b77-7cc9-4184-a223-2bc448d9ca1b-run-httpd\") pod \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\" (UID: \"0d355b77-7cc9-4184-a223-2bc448d9ca1b\") "
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.238986 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d355b77-7cc9-4184-a223-2bc448d9ca1b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0d355b77-7cc9-4184-a223-2bc448d9ca1b" (UID: "0d355b77-7cc9-4184-a223-2bc448d9ca1b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.241224 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d355b77-7cc9-4184-a223-2bc448d9ca1b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0d355b77-7cc9-4184-a223-2bc448d9ca1b" (UID: "0d355b77-7cc9-4184-a223-2bc448d9ca1b"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.244915 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-scripts" (OuterVolumeSpecName: "scripts") pod "0d355b77-7cc9-4184-a223-2bc448d9ca1b" (UID: "0d355b77-7cc9-4184-a223-2bc448d9ca1b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.256845 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d355b77-7cc9-4184-a223-2bc448d9ca1b-kube-api-access-wxf7c" (OuterVolumeSpecName: "kube-api-access-wxf7c") pod "0d355b77-7cc9-4184-a223-2bc448d9ca1b" (UID: "0d355b77-7cc9-4184-a223-2bc448d9ca1b"). InnerVolumeSpecName "kube-api-access-wxf7c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.325970 4973 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d355b77-7cc9-4184-a223-2bc448d9ca1b-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.326013 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxf7c\" (UniqueName: \"kubernetes.io/projected/0d355b77-7cc9-4184-a223-2bc448d9ca1b-kube-api-access-wxf7c\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.326026 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.326037 4973 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0d355b77-7cc9-4184-a223-2bc448d9ca1b-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.399890 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.408144 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0d355b77-7cc9-4184-a223-2bc448d9ca1b" (UID: "0d355b77-7cc9-4184-a223-2bc448d9ca1b"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.409818 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77cf5858cd-9xwk6"] Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.410061 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77cf5858cd-9xwk6" podUID="58e69a08-7d61-4b8a-8ff1-6a9bb59c360d" containerName="neutron-api" containerID="cri-o://3ccab10a39d8b83dbf6322bdc5ebddcba8df48beb7400dbf5d483a59cf18bca1" gracePeriod=30 Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.410224 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-77cf5858cd-9xwk6" podUID="58e69a08-7d61-4b8a-8ff1-6a9bb59c360d" containerName="neutron-httpd" containerID="cri-o://8185079e7c55881e7a789b73d0c2cde66bafaaa4cb18f01382bb731f9cef3f0a" gracePeriod=30 Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.437326 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-scripts\") pod \"07dbf52e-f333-4466-8e56-b44815529a05\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.437396 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-config-data\") pod \"07dbf52e-f333-4466-8e56-b44815529a05\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.438049 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9qqg\" (UniqueName: \"kubernetes.io/projected/07dbf52e-f333-4466-8e56-b44815529a05-kube-api-access-l9qqg\") pod \"07dbf52e-f333-4466-8e56-b44815529a05\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " Mar 20 13:46:57 crc 
kubenswrapper[4973]: I0320 13:46:57.438103 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-config-data-custom\") pod \"07dbf52e-f333-4466-8e56-b44815529a05\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.438208 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07dbf52e-f333-4466-8e56-b44815529a05-etc-machine-id\") pod \"07dbf52e-f333-4466-8e56-b44815529a05\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.438264 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07dbf52e-f333-4466-8e56-b44815529a05-logs\") pod \"07dbf52e-f333-4466-8e56-b44815529a05\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.438443 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-combined-ca-bundle\") pod \"07dbf52e-f333-4466-8e56-b44815529a05\" (UID: \"07dbf52e-f333-4466-8e56-b44815529a05\") " Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.439549 4973 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.442914 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07dbf52e-f333-4466-8e56-b44815529a05-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "07dbf52e-f333-4466-8e56-b44815529a05" (UID: 
"07dbf52e-f333-4466-8e56-b44815529a05"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.444156 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07dbf52e-f333-4466-8e56-b44815529a05-logs" (OuterVolumeSpecName: "logs") pod "07dbf52e-f333-4466-8e56-b44815529a05" (UID: "07dbf52e-f333-4466-8e56-b44815529a05"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.450670 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-scripts" (OuterVolumeSpecName: "scripts") pod "07dbf52e-f333-4466-8e56-b44815529a05" (UID: "07dbf52e-f333-4466-8e56-b44815529a05"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.456713 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "07dbf52e-f333-4466-8e56-b44815529a05" (UID: "07dbf52e-f333-4466-8e56-b44815529a05"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.456801 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07dbf52e-f333-4466-8e56-b44815529a05-kube-api-access-l9qqg" (OuterVolumeSpecName: "kube-api-access-l9qqg") pod "07dbf52e-f333-4466-8e56-b44815529a05" (UID: "07dbf52e-f333-4466-8e56-b44815529a05"). InnerVolumeSpecName "kube-api-access-l9qqg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.488959 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0d355b77-7cc9-4184-a223-2bc448d9ca1b" (UID: "0d355b77-7cc9-4184-a223-2bc448d9ca1b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.518485 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-config-data" (OuterVolumeSpecName: "config-data") pod "0d355b77-7cc9-4184-a223-2bc448d9ca1b" (UID: "0d355b77-7cc9-4184-a223-2bc448d9ca1b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.519966 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-config-data" (OuterVolumeSpecName: "config-data") pod "07dbf52e-f333-4466-8e56-b44815529a05" (UID: "07dbf52e-f333-4466-8e56-b44815529a05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.543561 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07dbf52e-f333-4466-8e56-b44815529a05" (UID: "07dbf52e-f333-4466-8e56-b44815529a05"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.546370 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9qqg\" (UniqueName: \"kubernetes.io/projected/07dbf52e-f333-4466-8e56-b44815529a05-kube-api-access-l9qqg\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.546413 4973 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.546428 4973 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/07dbf52e-f333-4466-8e56-b44815529a05-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.546440 4973 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/07dbf52e-f333-4466-8e56-b44815529a05-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.546452 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.546531 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.546547 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.546569 4973 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07dbf52e-f333-4466-8e56-b44815529a05-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.546582 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d355b77-7cc9-4184-a223-2bc448d9ca1b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.609977 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6cc7bcd8df-hpkc4"] Mar 20 13:46:57 crc kubenswrapper[4973]: E0320 13:46:57.612174 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07dbf52e-f333-4466-8e56-b44815529a05" containerName="cinder-api-log" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.612194 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="07dbf52e-f333-4466-8e56-b44815529a05" containerName="cinder-api-log" Mar 20 13:46:57 crc kubenswrapper[4973]: E0320 13:46:57.612209 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" containerName="sg-core" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.612215 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" containerName="sg-core" Mar 20 13:46:57 crc kubenswrapper[4973]: E0320 13:46:57.612228 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" containerName="ceilometer-notification-agent" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.612234 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" containerName="ceilometer-notification-agent" Mar 20 13:46:57 crc kubenswrapper[4973]: E0320 13:46:57.612261 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" 
containerName="proxy-httpd" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.612266 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" containerName="proxy-httpd" Mar 20 13:46:57 crc kubenswrapper[4973]: E0320 13:46:57.612281 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" containerName="ceilometer-central-agent" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.612287 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" containerName="ceilometer-central-agent" Mar 20 13:46:57 crc kubenswrapper[4973]: E0320 13:46:57.612308 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07dbf52e-f333-4466-8e56-b44815529a05" containerName="cinder-api" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.612314 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="07dbf52e-f333-4466-8e56-b44815529a05" containerName="cinder-api" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.612559 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" containerName="sg-core" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.612575 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" containerName="ceilometer-notification-agent" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.612594 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" containerName="ceilometer-central-agent" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.612605 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="07dbf52e-f333-4466-8e56-b44815529a05" containerName="cinder-api" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.612622 4973 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" containerName="proxy-httpd" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.612637 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="07dbf52e-f333-4466-8e56-b44815529a05" containerName="cinder-api-log" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.625048 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6cc7bcd8df-hpkc4" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.636425 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.636872 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.637119 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-wcbnn" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.667867 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6cc7bcd8df-hpkc4"] Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.753820 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff523993-52d2-46af-8f52-f3c9ca447fcb-config-data\") pod \"heat-engine-6cc7bcd8df-hpkc4\" (UID: \"ff523993-52d2-46af-8f52-f3c9ca447fcb\") " pod="openstack/heat-engine-6cc7bcd8df-hpkc4" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.753883 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xjcz\" (UniqueName: \"kubernetes.io/projected/ff523993-52d2-46af-8f52-f3c9ca447fcb-kube-api-access-7xjcz\") pod \"heat-engine-6cc7bcd8df-hpkc4\" (UID: \"ff523993-52d2-46af-8f52-f3c9ca447fcb\") " pod="openstack/heat-engine-6cc7bcd8df-hpkc4" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 
13:46:57.753944 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff523993-52d2-46af-8f52-f3c9ca447fcb-combined-ca-bundle\") pod \"heat-engine-6cc7bcd8df-hpkc4\" (UID: \"ff523993-52d2-46af-8f52-f3c9ca447fcb\") " pod="openstack/heat-engine-6cc7bcd8df-hpkc4" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.754072 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff523993-52d2-46af-8f52-f3c9ca447fcb-config-data-custom\") pod \"heat-engine-6cc7bcd8df-hpkc4\" (UID: \"ff523993-52d2-46af-8f52-f3c9ca447fcb\") " pod="openstack/heat-engine-6cc7bcd8df-hpkc4" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.775681 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-9v29d"] Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.777828 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.816732 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-9v29d"] Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.856134 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-dns-svc\") pod \"dnsmasq-dns-f6bc4c6c9-9v29d\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.856194 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff523993-52d2-46af-8f52-f3c9ca447fcb-combined-ca-bundle\") pod \"heat-engine-6cc7bcd8df-hpkc4\" (UID: \"ff523993-52d2-46af-8f52-f3c9ca447fcb\") " pod="openstack/heat-engine-6cc7bcd8df-hpkc4" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.856245 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-ovsdbserver-sb\") pod \"dnsmasq-dns-f6bc4c6c9-9v29d\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.856288 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbzwh\" (UniqueName: \"kubernetes.io/projected/9d40d175-2e68-4d03-acd3-0a8ba7943b57-kube-api-access-tbzwh\") pod \"dnsmasq-dns-f6bc4c6c9-9v29d\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.856388 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-config\") pod \"dnsmasq-dns-f6bc4c6c9-9v29d\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.856428 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff523993-52d2-46af-8f52-f3c9ca447fcb-config-data-custom\") pod \"heat-engine-6cc7bcd8df-hpkc4\" (UID: \"ff523993-52d2-46af-8f52-f3c9ca447fcb\") " pod="openstack/heat-engine-6cc7bcd8df-hpkc4" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.856486 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-dns-swift-storage-0\") pod \"dnsmasq-dns-f6bc4c6c9-9v29d\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.856511 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff523993-52d2-46af-8f52-f3c9ca447fcb-config-data\") pod \"heat-engine-6cc7bcd8df-hpkc4\" (UID: \"ff523993-52d2-46af-8f52-f3c9ca447fcb\") " pod="openstack/heat-engine-6cc7bcd8df-hpkc4" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.856547 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-ovsdbserver-nb\") pod \"dnsmasq-dns-f6bc4c6c9-9v29d\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.856571 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xjcz\" 
(UniqueName: \"kubernetes.io/projected/ff523993-52d2-46af-8f52-f3c9ca447fcb-kube-api-access-7xjcz\") pod \"heat-engine-6cc7bcd8df-hpkc4\" (UID: \"ff523993-52d2-46af-8f52-f3c9ca447fcb\") " pod="openstack/heat-engine-6cc7bcd8df-hpkc4" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.862647 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-b6774478c-cwcjf"] Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.866012 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-b6774478c-cwcjf" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.867279 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff523993-52d2-46af-8f52-f3c9ca447fcb-config-data-custom\") pod \"heat-engine-6cc7bcd8df-hpkc4\" (UID: \"ff523993-52d2-46af-8f52-f3c9ca447fcb\") " pod="openstack/heat-engine-6cc7bcd8df-hpkc4" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.868754 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff523993-52d2-46af-8f52-f3c9ca447fcb-combined-ca-bundle\") pod \"heat-engine-6cc7bcd8df-hpkc4\" (UID: \"ff523993-52d2-46af-8f52-f3c9ca447fcb\") " pod="openstack/heat-engine-6cc7bcd8df-hpkc4" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.873090 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.886701 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff523993-52d2-46af-8f52-f3c9ca447fcb-config-data\") pod \"heat-engine-6cc7bcd8df-hpkc4\" (UID: \"ff523993-52d2-46af-8f52-f3c9ca447fcb\") " pod="openstack/heat-engine-6cc7bcd8df-hpkc4" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.886770 4973 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/heat-cfnapi-b6774478c-cwcjf"] Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.894744 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xjcz\" (UniqueName: \"kubernetes.io/projected/ff523993-52d2-46af-8f52-f3c9ca447fcb-kube-api-access-7xjcz\") pod \"heat-engine-6cc7bcd8df-hpkc4\" (UID: \"ff523993-52d2-46af-8f52-f3c9ca447fcb\") " pod="openstack/heat-engine-6cc7bcd8df-hpkc4" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.958060 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbzwh\" (UniqueName: \"kubernetes.io/projected/9d40d175-2e68-4d03-acd3-0a8ba7943b57-kube-api-access-tbzwh\") pod \"dnsmasq-dns-f6bc4c6c9-9v29d\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.958164 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b7eb71d-d134-4d3b-84a0-f14776f4410a-config-data\") pod \"heat-cfnapi-b6774478c-cwcjf\" (UID: \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\") " pod="openstack/heat-cfnapi-b6774478c-cwcjf" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.958193 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdkgw\" (UniqueName: \"kubernetes.io/projected/7b7eb71d-d134-4d3b-84a0-f14776f4410a-kube-api-access-zdkgw\") pod \"heat-cfnapi-b6774478c-cwcjf\" (UID: \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\") " pod="openstack/heat-cfnapi-b6774478c-cwcjf" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.958227 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-config\") pod \"dnsmasq-dns-f6bc4c6c9-9v29d\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " 
pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.958274 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b7eb71d-d134-4d3b-84a0-f14776f4410a-combined-ca-bundle\") pod \"heat-cfnapi-b6774478c-cwcjf\" (UID: \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\") " pod="openstack/heat-cfnapi-b6774478c-cwcjf" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.958311 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b7eb71d-d134-4d3b-84a0-f14776f4410a-config-data-custom\") pod \"heat-cfnapi-b6774478c-cwcjf\" (UID: \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\") " pod="openstack/heat-cfnapi-b6774478c-cwcjf" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.958427 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-dns-swift-storage-0\") pod \"dnsmasq-dns-f6bc4c6c9-9v29d\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.958489 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-ovsdbserver-nb\") pod \"dnsmasq-dns-f6bc4c6c9-9v29d\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.958549 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-dns-svc\") pod \"dnsmasq-dns-f6bc4c6c9-9v29d\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " 
pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.958603 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-ovsdbserver-sb\") pod \"dnsmasq-dns-f6bc4c6c9-9v29d\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.962226 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-ovsdbserver-sb\") pod \"dnsmasq-dns-f6bc4c6c9-9v29d\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.962944 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-dns-swift-storage-0\") pod \"dnsmasq-dns-f6bc4c6c9-9v29d\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.963386 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-ovsdbserver-nb\") pod \"dnsmasq-dns-f6bc4c6c9-9v29d\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 13:46:57.963557 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-dns-svc\") pod \"dnsmasq-dns-f6bc4c6c9-9v29d\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:57 crc kubenswrapper[4973]: I0320 
13:46:57.970320 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-config\") pod \"dnsmasq-dns-f6bc4c6c9-9v29d\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.009508 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-667c56bb7d-c5695"] Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.018468 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-667c56bb7d-c5695" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.021148 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.025937 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbzwh\" (UniqueName: \"kubernetes.io/projected/9d40d175-2e68-4d03-acd3-0a8ba7943b57-kube-api-access-tbzwh\") pod \"dnsmasq-dns-f6bc4c6c9-9v29d\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.063202 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b7eb71d-d134-4d3b-84a0-f14776f4410a-config-data-custom\") pod \"heat-cfnapi-b6774478c-cwcjf\" (UID: \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\") " pod="openstack/heat-cfnapi-b6774478c-cwcjf" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.063369 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07fe89fe-2971-4d79-afd3-4688be48e51c-config-data-custom\") pod \"heat-api-667c56bb7d-c5695\" (UID: \"07fe89fe-2971-4d79-afd3-4688be48e51c\") " 
pod="openstack/heat-api-667c56bb7d-c5695" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.063432 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07fe89fe-2971-4d79-afd3-4688be48e51c-config-data\") pod \"heat-api-667c56bb7d-c5695\" (UID: \"07fe89fe-2971-4d79-afd3-4688be48e51c\") " pod="openstack/heat-api-667c56bb7d-c5695" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.063523 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07fe89fe-2971-4d79-afd3-4688be48e51c-combined-ca-bundle\") pod \"heat-api-667c56bb7d-c5695\" (UID: \"07fe89fe-2971-4d79-afd3-4688be48e51c\") " pod="openstack/heat-api-667c56bb7d-c5695" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.063553 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snz56\" (UniqueName: \"kubernetes.io/projected/07fe89fe-2971-4d79-afd3-4688be48e51c-kube-api-access-snz56\") pod \"heat-api-667c56bb7d-c5695\" (UID: \"07fe89fe-2971-4d79-afd3-4688be48e51c\") " pod="openstack/heat-api-667c56bb7d-c5695" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.063595 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b7eb71d-d134-4d3b-84a0-f14776f4410a-config-data\") pod \"heat-cfnapi-b6774478c-cwcjf\" (UID: \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\") " pod="openstack/heat-cfnapi-b6774478c-cwcjf" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.063622 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdkgw\" (UniqueName: \"kubernetes.io/projected/7b7eb71d-d134-4d3b-84a0-f14776f4410a-kube-api-access-zdkgw\") pod \"heat-cfnapi-b6774478c-cwcjf\" (UID: \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\") " 
pod="openstack/heat-cfnapi-b6774478c-cwcjf" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.063664 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b7eb71d-d134-4d3b-84a0-f14776f4410a-combined-ca-bundle\") pod \"heat-cfnapi-b6774478c-cwcjf\" (UID: \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\") " pod="openstack/heat-cfnapi-b6774478c-cwcjf" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.070892 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6cc7bcd8df-hpkc4" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.071528 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-667c56bb7d-c5695"] Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.114217 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b7eb71d-d134-4d3b-84a0-f14776f4410a-combined-ca-bundle\") pod \"heat-cfnapi-b6774478c-cwcjf\" (UID: \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\") " pod="openstack/heat-cfnapi-b6774478c-cwcjf" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.117089 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b7eb71d-d134-4d3b-84a0-f14776f4410a-config-data-custom\") pod \"heat-cfnapi-b6774478c-cwcjf\" (UID: \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\") " pod="openstack/heat-cfnapi-b6774478c-cwcjf" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.125599 4973 generic.go:334] "Generic (PLEG): container finished" podID="58e69a08-7d61-4b8a-8ff1-6a9bb59c360d" containerID="8185079e7c55881e7a789b73d0c2cde66bafaaa4cb18f01382bb731f9cef3f0a" exitCode=0 Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.125698 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77cf5858cd-9xwk6" 
event={"ID":"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d","Type":"ContainerDied","Data":"8185079e7c55881e7a789b73d0c2cde66bafaaa4cb18f01382bb731f9cef3f0a"} Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.136583 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.149402 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdkgw\" (UniqueName: \"kubernetes.io/projected/7b7eb71d-d134-4d3b-84a0-f14776f4410a-kube-api-access-zdkgw\") pod \"heat-cfnapi-b6774478c-cwcjf\" (UID: \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\") " pod="openstack/heat-cfnapi-b6774478c-cwcjf" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.150287 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"5e16c419-72e5-4d2c-bda0-3a0f6ec97aac","Type":"ContainerStarted","Data":"b986d0e6a2b00f9d7fe47dbe91a3491d05c724a389fd4bb9831f3eec4ed671c8"} Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.167118 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07fe89fe-2971-4d79-afd3-4688be48e51c-combined-ca-bundle\") pod \"heat-api-667c56bb7d-c5695\" (UID: \"07fe89fe-2971-4d79-afd3-4688be48e51c\") " pod="openstack/heat-api-667c56bb7d-c5695" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.167185 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snz56\" (UniqueName: \"kubernetes.io/projected/07fe89fe-2971-4d79-afd3-4688be48e51c-kube-api-access-snz56\") pod \"heat-api-667c56bb7d-c5695\" (UID: \"07fe89fe-2971-4d79-afd3-4688be48e51c\") " pod="openstack/heat-api-667c56bb7d-c5695" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.167475 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/07fe89fe-2971-4d79-afd3-4688be48e51c-config-data-custom\") pod \"heat-api-667c56bb7d-c5695\" (UID: \"07fe89fe-2971-4d79-afd3-4688be48e51c\") " pod="openstack/heat-api-667c56bb7d-c5695" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.167551 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07fe89fe-2971-4d79-afd3-4688be48e51c-config-data\") pod \"heat-api-667c56bb7d-c5695\" (UID: \"07fe89fe-2971-4d79-afd3-4688be48e51c\") " pod="openstack/heat-api-667c56bb7d-c5695" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.177664 4973 generic.go:334] "Generic (PLEG): container finished" podID="8448c2a8-df10-4a2f-a14f-c59318a71ceb" containerID="822ccc28a7ff7901b030a3684f7318dfb904c5e0eb49d975499614fe8d732270" exitCode=0 Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.179904 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xq9kw" event={"ID":"8448c2a8-df10-4a2f-a14f-c59318a71ceb","Type":"ContainerDied","Data":"822ccc28a7ff7901b030a3684f7318dfb904c5e0eb49d975499614fe8d732270"} Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.183393 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07fe89fe-2971-4d79-afd3-4688be48e51c-combined-ca-bundle\") pod \"heat-api-667c56bb7d-c5695\" (UID: \"07fe89fe-2971-4d79-afd3-4688be48e51c\") " pod="openstack/heat-api-667c56bb7d-c5695" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.183754 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07fe89fe-2971-4d79-afd3-4688be48e51c-config-data-custom\") pod \"heat-api-667c56bb7d-c5695\" (UID: \"07fe89fe-2971-4d79-afd3-4688be48e51c\") " pod="openstack/heat-api-667c56bb7d-c5695" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.183968 4973 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07fe89fe-2971-4d79-afd3-4688be48e51c-config-data\") pod \"heat-api-667c56bb7d-c5695\" (UID: \"07fe89fe-2971-4d79-afd3-4688be48e51c\") " pod="openstack/heat-api-667c56bb7d-c5695" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.199850 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snz56\" (UniqueName: \"kubernetes.io/projected/07fe89fe-2971-4d79-afd3-4688be48e51c-kube-api-access-snz56\") pod \"heat-api-667c56bb7d-c5695\" (UID: \"07fe89fe-2971-4d79-afd3-4688be48e51c\") " pod="openstack/heat-api-667c56bb7d-c5695" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.203686 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b7eb71d-d134-4d3b-84a0-f14776f4410a-config-data\") pod \"heat-cfnapi-b6774478c-cwcjf\" (UID: \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\") " pod="openstack/heat-cfnapi-b6774478c-cwcjf" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.238939 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.241169 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.241271 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"07dbf52e-f333-4466-8e56-b44815529a05","Type":"ContainerDied","Data":"a0a534c8ec433ab6392bfb7c4233a9b9a17705c83f443d38481620a980aec38a"} Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.241352 4973 scope.go:117] "RemoveContainer" containerID="55f829e539a8e500c99341aa1e054b0d9f26dde9c8e12cf45a1698d4ce9f2a30" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.262819 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xq9kw" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.311747 4973 scope.go:117] "RemoveContainer" containerID="5a0b537c60fe6d546b322aaa5804a3835204b8259c94156fe92808b7b16075fe" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.323520 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.520422898 podStartE2EDuration="23.323492897s" podCreationTimestamp="2026-03-20 13:46:35 +0000 UTC" firstStartedPulling="2026-03-20 13:46:37.051073633 +0000 UTC m=+1517.794743367" lastFinishedPulling="2026-03-20 13:46:56.854143622 +0000 UTC m=+1537.597813366" observedRunningTime="2026-03-20 13:46:58.179588779 +0000 UTC m=+1538.923258523" watchObservedRunningTime="2026-03-20 13:46:58.323492897 +0000 UTC m=+1539.067162651" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.379976 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkndn\" (UniqueName: \"kubernetes.io/projected/8448c2a8-df10-4a2f-a14f-c59318a71ceb-kube-api-access-kkndn\") pod \"8448c2a8-df10-4a2f-a14f-c59318a71ceb\" (UID: \"8448c2a8-df10-4a2f-a14f-c59318a71ceb\") " Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.380156 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8448c2a8-df10-4a2f-a14f-c59318a71ceb-utilities\") pod \"8448c2a8-df10-4a2f-a14f-c59318a71ceb\" (UID: \"8448c2a8-df10-4a2f-a14f-c59318a71ceb\") " Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.380294 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8448c2a8-df10-4a2f-a14f-c59318a71ceb-catalog-content\") pod \"8448c2a8-df10-4a2f-a14f-c59318a71ceb\" (UID: \"8448c2a8-df10-4a2f-a14f-c59318a71ceb\") " Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.391942 
4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8448c2a8-df10-4a2f-a14f-c59318a71ceb-utilities" (OuterVolumeSpecName: "utilities") pod "8448c2a8-df10-4a2f-a14f-c59318a71ceb" (UID: "8448c2a8-df10-4a2f-a14f-c59318a71ceb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.392606 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8448c2a8-df10-4a2f-a14f-c59318a71ceb-kube-api-access-kkndn" (OuterVolumeSpecName: "kube-api-access-kkndn") pod "8448c2a8-df10-4a2f-a14f-c59318a71ceb" (UID: "8448c2a8-df10-4a2f-a14f-c59318a71ceb"). InnerVolumeSpecName "kube-api-access-kkndn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.407752 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.443428 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-b6774478c-cwcjf" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.455353 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-667c56bb7d-c5695" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.463809 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.484266 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkndn\" (UniqueName: \"kubernetes.io/projected/8448c2a8-df10-4a2f-a14f-c59318a71ceb-kube-api-access-kkndn\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.484310 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8448c2a8-df10-4a2f-a14f-c59318a71ceb-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.504060 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:58 crc kubenswrapper[4973]: E0320 13:46:58.504670 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8448c2a8-df10-4a2f-a14f-c59318a71ceb" containerName="extract-content" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.504683 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="8448c2a8-df10-4a2f-a14f-c59318a71ceb" containerName="extract-content" Mar 20 13:46:58 crc kubenswrapper[4973]: E0320 13:46:58.504696 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8448c2a8-df10-4a2f-a14f-c59318a71ceb" containerName="registry-server" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.504702 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="8448c2a8-df10-4a2f-a14f-c59318a71ceb" containerName="registry-server" Mar 20 13:46:58 crc kubenswrapper[4973]: E0320 13:46:58.504722 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8448c2a8-df10-4a2f-a14f-c59318a71ceb" containerName="extract-utilities" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.504729 4973 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="8448c2a8-df10-4a2f-a14f-c59318a71ceb" containerName="extract-utilities" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.505029 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="8448c2a8-df10-4a2f-a14f-c59318a71ceb" containerName="registry-server" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.522727 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.542027 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.551642 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.552697 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.557645 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.578231 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.594722 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.596786 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.608616 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.608921 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sb6lm\" (UniqueName: \"kubernetes.io/projected/71549865-d1f9-44d8-bcf3-0040dcbeda6d-kube-api-access-sb6lm\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.609397 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-config-data\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.610247 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.610373 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71549865-d1f9-44d8-bcf3-0040dcbeda6d-log-httpd\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.610403 
4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.610650 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.610409 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71549865-d1f9-44d8-bcf3-0040dcbeda6d-run-httpd\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.610976 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-scripts\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.611078 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.634383 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.661543 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-xnwtw"] Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.713238 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/273c74ce-9e0d-437a-aaf8-b16451028b6e-logs\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.713644 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" 
(UniqueName: \"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.713734 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273c74ce-9e0d-437a-aaf8-b16451028b6e-config-data\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.713851 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45gpq\" (UniqueName: \"kubernetes.io/projected/273c74ce-9e0d-437a-aaf8-b16451028b6e-kube-api-access-45gpq\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.713942 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/273c74ce-9e0d-437a-aaf8-b16451028b6e-scripts\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.714036 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/273c74ce-9e0d-437a-aaf8-b16451028b6e-config-data-custom\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.716777 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sb6lm\" (UniqueName: \"kubernetes.io/projected/71549865-d1f9-44d8-bcf3-0040dcbeda6d-kube-api-access-sb6lm\") pod \"ceilometer-0\" (UID: 
\"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.717149 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-config-data\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.717871 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/273c74ce-9e0d-437a-aaf8-b16451028b6e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.718011 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/273c74ce-9e0d-437a-aaf8-b16451028b6e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.718125 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/273c74ce-9e0d-437a-aaf8-b16451028b6e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.718212 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.718299 4973 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71549865-d1f9-44d8-bcf3-0040dcbeda6d-log-httpd\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.718784 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71549865-d1f9-44d8-bcf3-0040dcbeda6d-run-httpd\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.719511 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273c74ce-9e0d-437a-aaf8-b16451028b6e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.721839 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71549865-d1f9-44d8-bcf3-0040dcbeda6d-run-httpd\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.721868 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71549865-d1f9-44d8-bcf3-0040dcbeda6d-log-httpd\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.729163 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8448c2a8-df10-4a2f-a14f-c59318a71ceb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8448c2a8-df10-4a2f-a14f-c59318a71ceb" (UID: 
"8448c2a8-df10-4a2f-a14f-c59318a71ceb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.744024 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-config-data\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.745852 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.746703 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-scripts\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.747006 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8448c2a8-df10-4a2f-a14f-c59318a71ceb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.752465 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sb6lm\" (UniqueName: \"kubernetes.io/projected/71549865-d1f9-44d8-bcf3-0040dcbeda6d-kube-api-access-sb6lm\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.752735 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.755266 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-scripts\") pod \"ceilometer-0\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") " pod="openstack/ceilometer-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.765366 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-p2xzh"] Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.788180 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1a9f-account-create-update-4nplb"] Mar 20 13:46:58 crc kubenswrapper[4973]: W0320 13:46:58.814583 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73458433_b92e_4758_84bd_7aba2f23e1e0.slice/crio-763561df83d3e52e29b531fa9ff238f655e3dffe9c39a660d3e8958a27f309a8 WatchSource:0}: Error finding container 763561df83d3e52e29b531fa9ff238f655e3dffe9c39a660d3e8958a27f309a8: Status 404 returned error can't find the container with id 763561df83d3e52e29b531fa9ff238f655e3dffe9c39a660d3e8958a27f309a8 Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.848696 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/273c74ce-9e0d-437a-aaf8-b16451028b6e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.864835 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/273c74ce-9e0d-437a-aaf8-b16451028b6e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.865046 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/273c74ce-9e0d-437a-aaf8-b16451028b6e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.865244 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273c74ce-9e0d-437a-aaf8-b16451028b6e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.865676 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/273c74ce-9e0d-437a-aaf8-b16451028b6e-logs\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.865867 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273c74ce-9e0d-437a-aaf8-b16451028b6e-config-data\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.866010 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45gpq\" (UniqueName: \"kubernetes.io/projected/273c74ce-9e0d-437a-aaf8-b16451028b6e-kube-api-access-45gpq\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: 
I0320 13:46:58.866122 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/273c74ce-9e0d-437a-aaf8-b16451028b6e-scripts\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.866250 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/273c74ce-9e0d-437a-aaf8-b16451028b6e-config-data-custom\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.858536 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4451-account-create-update-pddz5"] Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.869773 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/273c74ce-9e0d-437a-aaf8-b16451028b6e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.888062 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/273c74ce-9e0d-437a-aaf8-b16451028b6e-logs\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.888709 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/273c74ce-9e0d-437a-aaf8-b16451028b6e-public-tls-certs\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.890289 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/273c74ce-9e0d-437a-aaf8-b16451028b6e-config-data-custom\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.890776 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/273c74ce-9e0d-437a-aaf8-b16451028b6e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.892630 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/273c74ce-9e0d-437a-aaf8-b16451028b6e-config-data\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.897176 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/273c74ce-9e0d-437a-aaf8-b16451028b6e-scripts\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.909155 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/273c74ce-9e0d-437a-aaf8-b16451028b6e-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:58 crc kubenswrapper[4973]: W0320 13:46:58.909970 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28f1bcfd_788c_47fa_a462_cd5068ec34d2.slice/crio-7f44119ff51d5d4421fa01386d524b2d27bee11e9cf4e93996a57707efca3eab WatchSource:0}: Error finding container 
7f44119ff51d5d4421fa01386d524b2d27bee11e9cf4e93996a57707efca3eab: Status 404 returned error can't find the container with id 7f44119ff51d5d4421fa01386d524b2d27bee11e9cf4e93996a57707efca3eab Mar 20 13:46:58 crc kubenswrapper[4973]: I0320 13:46:58.914765 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45gpq\" (UniqueName: \"kubernetes.io/projected/273c74ce-9e0d-437a-aaf8-b16451028b6e-kube-api-access-45gpq\") pod \"cinder-api-0\" (UID: \"273c74ce-9e0d-437a-aaf8-b16451028b6e\") " pod="openstack/cinder-api-0" Mar 20 13:46:59 crc kubenswrapper[4973]: I0320 13:46:59.028336 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:46:59 crc kubenswrapper[4973]: I0320 13:46:59.036577 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-849bfc9c67-jsf5j"] Mar 20 13:46:59 crc kubenswrapper[4973]: I0320 13:46:59.054035 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 20 13:46:59 crc kubenswrapper[4973]: I0320 13:46:59.267975 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-p4rbx"] Mar 20 13:46:59 crc kubenswrapper[4973]: I0320 13:46:59.295552 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-849bfc9c67-jsf5j" event={"ID":"28f1bcfd-788c-47fa-a462-cd5068ec34d2","Type":"ContainerStarted","Data":"7f44119ff51d5d4421fa01386d524b2d27bee11e9cf4e93996a57707efca3eab"} Mar 20 13:46:59 crc kubenswrapper[4973]: I0320 13:46:59.310240 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-p2xzh" event={"ID":"7bf6849d-dd2b-4a3e-be4f-1b00c1826000","Type":"ContainerStarted","Data":"5a530756a889a0d24ff3eb6c40fb3c780e2fa1d6d39560472f153739c18192cd"} Mar 20 13:46:59 crc kubenswrapper[4973]: I0320 13:46:59.312168 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-4451-account-create-update-pddz5" event={"ID":"73458433-b92e-4758-84bd-7aba2f23e1e0","Type":"ContainerStarted","Data":"763561df83d3e52e29b531fa9ff238f655e3dffe9c39a660d3e8958a27f309a8"} Mar 20 13:46:59 crc kubenswrapper[4973]: I0320 13:46:59.314234 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1a9f-account-create-update-4nplb" event={"ID":"cd964d5b-2edc-41ea-8dc9-3f6e71d9da32","Type":"ContainerStarted","Data":"31c4b788532e0791c16e13ebf06c1ff9ccbd6c3067e173303cb96043d7b36656"} Mar 20 13:46:59 crc kubenswrapper[4973]: I0320 13:46:59.326088 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xq9kw" event={"ID":"8448c2a8-df10-4a2f-a14f-c59318a71ceb","Type":"ContainerDied","Data":"e411e7a065c1cb41cb65edb954f04b3c6bcd549ef41983e7b283b4f592f9ec1f"} Mar 20 13:46:59 crc kubenswrapper[4973]: I0320 13:46:59.326192 4973 scope.go:117] "RemoveContainer" containerID="822ccc28a7ff7901b030a3684f7318dfb904c5e0eb49d975499614fe8d732270" Mar 20 13:46:59 crc kubenswrapper[4973]: I0320 13:46:59.326490 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xq9kw" Mar 20 13:46:59 crc kubenswrapper[4973]: I0320 13:46:59.341513 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xnwtw" event={"ID":"abeee3f9-2831-465f-8c3e-9853954f7087","Type":"ContainerStarted","Data":"aedd969c93b35e0312bc8573f2d6a62f6e79adc315f2a571ad23f6bd6e63b47f"} Mar 20 13:46:59 crc kubenswrapper[4973]: I0320 13:46:59.447733 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-eea5-account-create-update-jjhm6"] Mar 20 13:46:59 crc kubenswrapper[4973]: I0320 13:46:59.573535 4973 scope.go:117] "RemoveContainer" containerID="3894a5999a5ee709f6f1db2cabcaed6dc09283c505d4838f88b70efe0cf86696" Mar 20 13:46:59 crc kubenswrapper[4973]: I0320 13:46:59.593739 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-9v29d"] Mar 20 13:46:59 crc kubenswrapper[4973]: I0320 13:46:59.661555 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6cc7bcd8df-hpkc4"] Mar 20 13:46:59 crc kubenswrapper[4973]: W0320 13:46:59.684780 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9d40d175_2e68_4d03_acd3_0a8ba7943b57.slice/crio-1cd528f0ac2bfa2a82407e94d05d4f5961566ee9862931104daf4b2be20831b8 WatchSource:0}: Error finding container 1cd528f0ac2bfa2a82407e94d05d4f5961566ee9862931104daf4b2be20831b8: Status 404 returned error can't find the container with id 1cd528f0ac2bfa2a82407e94d05d4f5961566ee9862931104daf4b2be20831b8 Mar 20 13:46:59 crc kubenswrapper[4973]: W0320 13:46:59.805485 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7b7eb71d_d134_4d3b_84a0_f14776f4410a.slice/crio-c5fc82857565d1788375c28c1cc2a87f26c31a6e67530fcd1dac2f249b1086a5 WatchSource:0}: Error finding container 
c5fc82857565d1788375c28c1cc2a87f26c31a6e67530fcd1dac2f249b1086a5: Status 404 returned error can't find the container with id c5fc82857565d1788375c28c1cc2a87f26c31a6e67530fcd1dac2f249b1086a5 Mar 20 13:46:59 crc kubenswrapper[4973]: I0320 13:46:59.850988 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-b6774478c-cwcjf"] Mar 20 13:46:59 crc kubenswrapper[4973]: I0320 13:46:59.898667 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-667c56bb7d-c5695"] Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.000017 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07dbf52e-f333-4466-8e56-b44815529a05" path="/var/lib/kubelet/pods/07dbf52e-f333-4466-8e56-b44815529a05/volumes" Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.001580 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d355b77-7cc9-4184-a223-2bc448d9ca1b" path="/var/lib/kubelet/pods/0d355b77-7cc9-4184-a223-2bc448d9ca1b/volumes" Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.161605 4973 scope.go:117] "RemoveContainer" containerID="4dff7158b55c26562ba3c676d6d34d3ddc7ebcd33b3559da1bbf0b34d6416451" Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.363210 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xq9kw"] Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.375488 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xq9kw"] Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.396763 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.404280 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-b6774478c-cwcjf" event={"ID":"7b7eb71d-d134-4d3b-84a0-f14776f4410a","Type":"ContainerStarted","Data":"c5fc82857565d1788375c28c1cc2a87f26c31a6e67530fcd1dac2f249b1086a5"} Mar 
20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.417003 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.417907 4973 generic.go:334] "Generic (PLEG): container finished" podID="abeee3f9-2831-465f-8c3e-9853954f7087" containerID="594b59f55856a3ae9b944997d88058dcedcd548ee1690cbd564fb3a702344490" exitCode=0 Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.418013 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xnwtw" event={"ID":"abeee3f9-2831-465f-8c3e-9853954f7087","Type":"ContainerDied","Data":"594b59f55856a3ae9b944997d88058dcedcd548ee1690cbd564fb3a702344490"} Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.426759 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6cc7bcd8df-hpkc4" event={"ID":"ff523993-52d2-46af-8f52-f3c9ca447fcb","Type":"ContainerStarted","Data":"19a10bee73d7d9127fa52b47b461dfd7e0c9964373537fc9d69c5ca1ed033c19"} Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.448971 4973 generic.go:334] "Generic (PLEG): container finished" podID="7bf6849d-dd2b-4a3e-be4f-1b00c1826000" containerID="ae3d8aa69626632f565d96c23915d440edfa3fe8a10884a7f681f436b737ab0c" exitCode=0 Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.449073 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-p2xzh" event={"ID":"7bf6849d-dd2b-4a3e-be4f-1b00c1826000","Type":"ContainerDied","Data":"ae3d8aa69626632f565d96c23915d440edfa3fe8a10884a7f681f436b737ab0c"} Mar 20 13:47:00 crc kubenswrapper[4973]: W0320 13:47:00.453039 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71549865_d1f9_44d8_bcf3_0040dcbeda6d.slice/crio-db99524c410004ebceab83a58731bf85b433b846209006831521d45a9f4bd1a4 WatchSource:0}: Error finding container 
db99524c410004ebceab83a58731bf85b433b846209006831521d45a9f4bd1a4: Status 404 returned error can't find the container with id db99524c410004ebceab83a58731bf85b433b846209006831521d45a9f4bd1a4 Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.466609 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-eea5-account-create-update-jjhm6" event={"ID":"6bb957cd-c6f9-4602-b4a2-56834928aabb","Type":"ContainerStarted","Data":"750dd998754af85127ab54d7ce15ca252cdef384ab46a0db4f48e20ebc12edb6"} Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.481831 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p4rbx" event={"ID":"3da0c070-14fe-41f9-9b97-f76831f43dbc","Type":"ContainerStarted","Data":"5cf9ce1eea02e37bf7bdaa07a9c80d5983c431baf6d4859bcd48ce739ee58526"} Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.481882 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p4rbx" event={"ID":"3da0c070-14fe-41f9-9b97-f76831f43dbc","Type":"ContainerStarted","Data":"b0d9fd22160edb6fdf0154c168e7fb085c118acb16595c6c0d23d68bd0a66a9b"} Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.495734 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-667c56bb7d-c5695" event={"ID":"07fe89fe-2971-4d79-afd3-4688be48e51c","Type":"ContainerStarted","Data":"882c71bc20b25f789677fc7a06fd093602e54bd95f9575661b353cbb5a4d9d45"} Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.514677 4973 generic.go:334] "Generic (PLEG): container finished" podID="cd964d5b-2edc-41ea-8dc9-3f6e71d9da32" containerID="3b8c4124bd0aef0f3c36277c16d5fcfeb94a2c582b7e38567d2feae6b95ee4b2" exitCode=0 Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.514781 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1a9f-account-create-update-4nplb" 
event={"ID":"cd964d5b-2edc-41ea-8dc9-3f6e71d9da32","Type":"ContainerDied","Data":"3b8c4124bd0aef0f3c36277c16d5fcfeb94a2c582b7e38567d2feae6b95ee4b2"} Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.536863 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-p4rbx" podStartSLOduration=9.53683998 podStartE2EDuration="9.53683998s" podCreationTimestamp="2026-03-20 13:46:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:00.511059495 +0000 UTC m=+1541.254729239" watchObservedRunningTime="2026-03-20 13:47:00.53683998 +0000 UTC m=+1541.280509724" Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.571648 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" event={"ID":"9d40d175-2e68-4d03-acd3-0a8ba7943b57","Type":"ContainerStarted","Data":"1cd528f0ac2bfa2a82407e94d05d4f5961566ee9862931104daf4b2be20831b8"} Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.670713 4973 generic.go:334] "Generic (PLEG): container finished" podID="73458433-b92e-4758-84bd-7aba2f23e1e0" containerID="0c1369a0c6c79678a9256215b93c6834c97a2669821cd309f83bf0f0db642d27" exitCode=0 Mar 20 13:47:00 crc kubenswrapper[4973]: I0320 13:47:00.670778 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4451-account-create-update-pddz5" event={"ID":"73458433-b92e-4758-84bd-7aba2f23e1e0","Type":"ContainerDied","Data":"0c1369a0c6c79678a9256215b93c6834c97a2669821cd309f83bf0f0db642d27"} Mar 20 13:47:01 crc kubenswrapper[4973]: I0320 13:47:01.692581 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-849bfc9c67-jsf5j" event={"ID":"28f1bcfd-788c-47fa-a462-cd5068ec34d2","Type":"ContainerStarted","Data":"6ed510b4c6ac103ff7cfd7fa1dac1234810017cc8b6ada9f8cabd87b946a4919"} Mar 20 13:47:01 crc kubenswrapper[4973]: I0320 13:47:01.693157 4973 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-849bfc9c67-jsf5j" Mar 20 13:47:01 crc kubenswrapper[4973]: I0320 13:47:01.693176 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-849bfc9c67-jsf5j" Mar 20 13:47:01 crc kubenswrapper[4973]: I0320 13:47:01.693184 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-849bfc9c67-jsf5j" event={"ID":"28f1bcfd-788c-47fa-a462-cd5068ec34d2","Type":"ContainerStarted","Data":"e9039c1240c5545d402feaf48ed4e8b2cb6407344988f97d26e37d5fa98c84f7"} Mar 20 13:47:01 crc kubenswrapper[4973]: I0320 13:47:01.699014 4973 generic.go:334] "Generic (PLEG): container finished" podID="6bb957cd-c6f9-4602-b4a2-56834928aabb" containerID="d7d22475246e6e4e91c8c765cc857280240fcdc8df652e20dc5511e8c4bf93a1" exitCode=0 Mar 20 13:47:01 crc kubenswrapper[4973]: I0320 13:47:01.699068 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-eea5-account-create-update-jjhm6" event={"ID":"6bb957cd-c6f9-4602-b4a2-56834928aabb","Type":"ContainerDied","Data":"d7d22475246e6e4e91c8c765cc857280240fcdc8df652e20dc5511e8c4bf93a1"} Mar 20 13:47:01 crc kubenswrapper[4973]: I0320 13:47:01.703317 4973 generic.go:334] "Generic (PLEG): container finished" podID="3da0c070-14fe-41f9-9b97-f76831f43dbc" containerID="5cf9ce1eea02e37bf7bdaa07a9c80d5983c431baf6d4859bcd48ce739ee58526" exitCode=0 Mar 20 13:47:01 crc kubenswrapper[4973]: I0320 13:47:01.703376 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p4rbx" event={"ID":"3da0c070-14fe-41f9-9b97-f76831f43dbc","Type":"ContainerDied","Data":"5cf9ce1eea02e37bf7bdaa07a9c80d5983c431baf6d4859bcd48ce739ee58526"} Mar 20 13:47:01 crc kubenswrapper[4973]: I0320 13:47:01.706531 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"273c74ce-9e0d-437a-aaf8-b16451028b6e","Type":"ContainerStarted","Data":"870ac3b9ecab5b7c3457e35026b14a354d82c14103da355e5febd7dbe1622060"} Mar 20 13:47:01 crc kubenswrapper[4973]: I0320 13:47:01.710361 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6cc7bcd8df-hpkc4" event={"ID":"ff523993-52d2-46af-8f52-f3c9ca447fcb","Type":"ContainerStarted","Data":"1c89394a0cf6ae6b38ee6b0afcc1bdc31b6ef05b59de63cbb26f65fce265d33e"} Mar 20 13:47:01 crc kubenswrapper[4973]: I0320 13:47:01.710503 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6cc7bcd8df-hpkc4" Mar 20 13:47:01 crc kubenswrapper[4973]: I0320 13:47:01.718073 4973 generic.go:334] "Generic (PLEG): container finished" podID="9d40d175-2e68-4d03-acd3-0a8ba7943b57" containerID="328372d663aefc57d3d8b09ece7693e92b2f5dee324ad8ca62bb8f4b6fb4bb45" exitCode=0 Mar 20 13:47:01 crc kubenswrapper[4973]: I0320 13:47:01.718208 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" event={"ID":"9d40d175-2e68-4d03-acd3-0a8ba7943b57","Type":"ContainerDied","Data":"328372d663aefc57d3d8b09ece7693e92b2f5dee324ad8ca62bb8f4b6fb4bb45"} Mar 20 13:47:01 crc kubenswrapper[4973]: I0320 13:47:01.728872 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71549865-d1f9-44d8-bcf3-0040dcbeda6d","Type":"ContainerStarted","Data":"db99524c410004ebceab83a58731bf85b433b846209006831521d45a9f4bd1a4"} Mar 20 13:47:01 crc kubenswrapper[4973]: I0320 13:47:01.734088 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-849bfc9c67-jsf5j" podStartSLOduration=13.734059047 podStartE2EDuration="13.734059047s" podCreationTimestamp="2026-03-20 13:46:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:01.719506469 +0000 UTC m=+1542.463176233" 
watchObservedRunningTime="2026-03-20 13:47:01.734059047 +0000 UTC m=+1542.477728791" Mar 20 13:47:01 crc kubenswrapper[4973]: I0320 13:47:01.744955 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6cc7bcd8df-hpkc4" podStartSLOduration=4.744939364 podStartE2EDuration="4.744939364s" podCreationTimestamp="2026-03-20 13:46:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:01.736876884 +0000 UTC m=+1542.480546628" watchObservedRunningTime="2026-03-20 13:47:01.744939364 +0000 UTC m=+1542.488609108" Mar 20 13:47:01 crc kubenswrapper[4973]: I0320 13:47:01.979634 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8448c2a8-df10-4a2f-a14f-c59318a71ceb" path="/var/lib/kubelet/pods/8448c2a8-df10-4a2f-a14f-c59318a71ceb/volumes" Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.197849 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4451-account-create-update-pddz5" Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.337535 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73458433-b92e-4758-84bd-7aba2f23e1e0-operator-scripts\") pod \"73458433-b92e-4758-84bd-7aba2f23e1e0\" (UID: \"73458433-b92e-4758-84bd-7aba2f23e1e0\") " Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.338176 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zk9f7\" (UniqueName: \"kubernetes.io/projected/73458433-b92e-4758-84bd-7aba2f23e1e0-kube-api-access-zk9f7\") pod \"73458433-b92e-4758-84bd-7aba2f23e1e0\" (UID: \"73458433-b92e-4758-84bd-7aba2f23e1e0\") " Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.338714 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73458433-b92e-4758-84bd-7aba2f23e1e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "73458433-b92e-4758-84bd-7aba2f23e1e0" (UID: "73458433-b92e-4758-84bd-7aba2f23e1e0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.347849 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73458433-b92e-4758-84bd-7aba2f23e1e0-kube-api-access-zk9f7" (OuterVolumeSpecName: "kube-api-access-zk9f7") pod "73458433-b92e-4758-84bd-7aba2f23e1e0" (UID: "73458433-b92e-4758-84bd-7aba2f23e1e0"). InnerVolumeSpecName "kube-api-access-zk9f7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.441282 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zk9f7\" (UniqueName: \"kubernetes.io/projected/73458433-b92e-4758-84bd-7aba2f23e1e0-kube-api-access-zk9f7\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.441324 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73458433-b92e-4758-84bd-7aba2f23e1e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.441537 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-p2xzh" Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.543485 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-44rfq\" (UniqueName: \"kubernetes.io/projected/7bf6849d-dd2b-4a3e-be4f-1b00c1826000-kube-api-access-44rfq\") pod \"7bf6849d-dd2b-4a3e-be4f-1b00c1826000\" (UID: \"7bf6849d-dd2b-4a3e-be4f-1b00c1826000\") " Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.543958 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bf6849d-dd2b-4a3e-be4f-1b00c1826000-operator-scripts\") pod \"7bf6849d-dd2b-4a3e-be4f-1b00c1826000\" (UID: \"7bf6849d-dd2b-4a3e-be4f-1b00c1826000\") " Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.545305 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bf6849d-dd2b-4a3e-be4f-1b00c1826000-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7bf6849d-dd2b-4a3e-be4f-1b00c1826000" (UID: "7bf6849d-dd2b-4a3e-be4f-1b00c1826000"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.550848 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bf6849d-dd2b-4a3e-be4f-1b00c1826000-kube-api-access-44rfq" (OuterVolumeSpecName: "kube-api-access-44rfq") pod "7bf6849d-dd2b-4a3e-be4f-1b00c1826000" (UID: "7bf6849d-dd2b-4a3e-be4f-1b00c1826000"). InnerVolumeSpecName "kube-api-access-44rfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.650949 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-44rfq\" (UniqueName: \"kubernetes.io/projected/7bf6849d-dd2b-4a3e-be4f-1b00c1826000-kube-api-access-44rfq\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.650981 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7bf6849d-dd2b-4a3e-be4f-1b00c1826000-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.797386 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4451-account-create-update-pddz5" event={"ID":"73458433-b92e-4758-84bd-7aba2f23e1e0","Type":"ContainerDied","Data":"763561df83d3e52e29b531fa9ff238f655e3dffe9c39a660d3e8958a27f309a8"} Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.797762 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="763561df83d3e52e29b531fa9ff238f655e3dffe9c39a660d3e8958a27f309a8" Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.797850 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4451-account-create-update-pddz5" Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.805238 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"273c74ce-9e0d-437a-aaf8-b16451028b6e","Type":"ContainerStarted","Data":"f0046e4fb014d6d629a09a1fb4e9c35d4c18e1fd52329cec99eb381b8c61aae2"} Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.831146 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" event={"ID":"9d40d175-2e68-4d03-acd3-0a8ba7943b57","Type":"ContainerStarted","Data":"4e2db8a75bf7a19158aaf6bf57274ad674bd6eff80cb774819b13847e9c9a1d7"} Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.832396 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.876682 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71549865-d1f9-44d8-bcf3-0040dcbeda6d","Type":"ContainerStarted","Data":"7a4c18d5e3bed2bd1ec2cf7ccdbcbe4f2fe7b32f78b93a9524867ab2eb2c4944"} Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.886445 4973 generic.go:334] "Generic (PLEG): container finished" podID="58e69a08-7d61-4b8a-8ff1-6a9bb59c360d" containerID="3ccab10a39d8b83dbf6322bdc5ebddcba8df48beb7400dbf5d483a59cf18bca1" exitCode=0 Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.886561 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77cf5858cd-9xwk6" event={"ID":"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d","Type":"ContainerDied","Data":"3ccab10a39d8b83dbf6322bdc5ebddcba8df48beb7400dbf5d483a59cf18bca1"} Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.893688 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" podStartSLOduration=5.893663848 podStartE2EDuration="5.893663848s" 
podCreationTimestamp="2026-03-20 13:46:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:02.86297517 +0000 UTC m=+1543.606644914" watchObservedRunningTime="2026-03-20 13:47:02.893663848 +0000 UTC m=+1543.637333592" Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.895166 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-p2xzh" Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.895679 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-p2xzh" event={"ID":"7bf6849d-dd2b-4a3e-be4f-1b00c1826000","Type":"ContainerDied","Data":"5a530756a889a0d24ff3eb6c40fb3c780e2fa1d6d39560472f153739c18192cd"} Mar 20 13:47:02 crc kubenswrapper[4973]: I0320 13:47:02.895723 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a530756a889a0d24ff3eb6c40fb3c780e2fa1d6d39560472f153739c18192cd" Mar 20 13:47:03 crc kubenswrapper[4973]: I0320 13:47:03.914450 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"273c74ce-9e0d-437a-aaf8-b16451028b6e","Type":"ContainerStarted","Data":"687ea77ea73a896e86a6d7f14c8d24a14611a2638c57a029aa6e514944430518"} Mar 20 13:47:03 crc kubenswrapper[4973]: I0320 13:47:03.914857 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 20 13:47:03 crc kubenswrapper[4973]: I0320 13:47:03.958532 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.958505453 podStartE2EDuration="5.958505453s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:03.940379748 +0000 UTC m=+1544.684049502" 
watchObservedRunningTime="2026-03-20 13:47:03.958505453 +0000 UTC m=+1544.702175197" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.332531 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77cf5858cd-9xwk6" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.333113 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-p4rbx" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.340693 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-eea5-account-create-update-jjhm6" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.363383 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1a9f-account-create-update-4nplb" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.495437 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-config\") pod \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.495503 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bkqz9\" (UniqueName: \"kubernetes.io/projected/cd964d5b-2edc-41ea-8dc9-3f6e71d9da32-kube-api-access-bkqz9\") pod \"cd964d5b-2edc-41ea-8dc9-3f6e71d9da32\" (UID: \"cd964d5b-2edc-41ea-8dc9-3f6e71d9da32\") " Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.495529 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd964d5b-2edc-41ea-8dc9-3f6e71d9da32-operator-scripts\") pod \"cd964d5b-2edc-41ea-8dc9-3f6e71d9da32\" (UID: \"cd964d5b-2edc-41ea-8dc9-3f6e71d9da32\") " Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.495565 4973 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bb957cd-c6f9-4602-b4a2-56834928aabb-operator-scripts\") pod \"6bb957cd-c6f9-4602-b4a2-56834928aabb\" (UID: \"6bb957cd-c6f9-4602-b4a2-56834928aabb\") " Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.495609 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-ovndb-tls-certs\") pod \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.495644 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zswnd\" (UniqueName: \"kubernetes.io/projected/6bb957cd-c6f9-4602-b4a2-56834928aabb-kube-api-access-zswnd\") pod \"6bb957cd-c6f9-4602-b4a2-56834928aabb\" (UID: \"6bb957cd-c6f9-4602-b4a2-56834928aabb\") " Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.495729 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3da0c070-14fe-41f9-9b97-f76831f43dbc-operator-scripts\") pod \"3da0c070-14fe-41f9-9b97-f76831f43dbc\" (UID: \"3da0c070-14fe-41f9-9b97-f76831f43dbc\") " Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.495754 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-httpd-config\") pod \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.495869 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-combined-ca-bundle\") pod 
\"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.495976 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shrgd\" (UniqueName: \"kubernetes.io/projected/3da0c070-14fe-41f9-9b97-f76831f43dbc-kube-api-access-shrgd\") pod \"3da0c070-14fe-41f9-9b97-f76831f43dbc\" (UID: \"3da0c070-14fe-41f9-9b97-f76831f43dbc\") " Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.496014 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wrzqz\" (UniqueName: \"kubernetes.io/projected/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-kube-api-access-wrzqz\") pod \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\" (UID: \"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d\") " Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.508983 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cd964d5b-2edc-41ea-8dc9-3f6e71d9da32-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cd964d5b-2edc-41ea-8dc9-3f6e71d9da32" (UID: "cd964d5b-2edc-41ea-8dc9-3f6e71d9da32"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.512285 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3da0c070-14fe-41f9-9b97-f76831f43dbc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3da0c070-14fe-41f9-9b97-f76831f43dbc" (UID: "3da0c070-14fe-41f9-9b97-f76831f43dbc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.512539 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-kube-api-access-wrzqz" (OuterVolumeSpecName: "kube-api-access-wrzqz") pod "58e69a08-7d61-4b8a-8ff1-6a9bb59c360d" (UID: "58e69a08-7d61-4b8a-8ff1-6a9bb59c360d"). InnerVolumeSpecName "kube-api-access-wrzqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.513310 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bb957cd-c6f9-4602-b4a2-56834928aabb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6bb957cd-c6f9-4602-b4a2-56834928aabb" (UID: "6bb957cd-c6f9-4602-b4a2-56834928aabb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.529647 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd964d5b-2edc-41ea-8dc9-3f6e71d9da32-kube-api-access-bkqz9" (OuterVolumeSpecName: "kube-api-access-bkqz9") pod "cd964d5b-2edc-41ea-8dc9-3f6e71d9da32" (UID: "cd964d5b-2edc-41ea-8dc9-3f6e71d9da32"). InnerVolumeSpecName "kube-api-access-bkqz9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.530576 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "58e69a08-7d61-4b8a-8ff1-6a9bb59c360d" (UID: "58e69a08-7d61-4b8a-8ff1-6a9bb59c360d"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.534867 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3da0c070-14fe-41f9-9b97-f76831f43dbc-kube-api-access-shrgd" (OuterVolumeSpecName: "kube-api-access-shrgd") pod "3da0c070-14fe-41f9-9b97-f76831f43dbc" (UID: "3da0c070-14fe-41f9-9b97-f76831f43dbc"). InnerVolumeSpecName "kube-api-access-shrgd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.540751 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bb957cd-c6f9-4602-b4a2-56834928aabb-kube-api-access-zswnd" (OuterVolumeSpecName: "kube-api-access-zswnd") pod "6bb957cd-c6f9-4602-b4a2-56834928aabb" (UID: "6bb957cd-c6f9-4602-b4a2-56834928aabb"). InnerVolumeSpecName "kube-api-access-zswnd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.599378 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bkqz9\" (UniqueName: \"kubernetes.io/projected/cd964d5b-2edc-41ea-8dc9-3f6e71d9da32-kube-api-access-bkqz9\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.599844 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cd964d5b-2edc-41ea-8dc9-3f6e71d9da32-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.599936 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bb957cd-c6f9-4602-b4a2-56834928aabb-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.600019 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zswnd\" (UniqueName: 
\"kubernetes.io/projected/6bb957cd-c6f9-4602-b4a2-56834928aabb-kube-api-access-zswnd\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.600146 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3da0c070-14fe-41f9-9b97-f76831f43dbc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.600224 4973 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.600301 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shrgd\" (UniqueName: \"kubernetes.io/projected/3da0c070-14fe-41f9-9b97-f76831f43dbc-kube-api-access-shrgd\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.600402 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wrzqz\" (UniqueName: \"kubernetes.io/projected/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-kube-api-access-wrzqz\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.658856 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "58e69a08-7d61-4b8a-8ff1-6a9bb59c360d" (UID: "58e69a08-7d61-4b8a-8ff1-6a9bb59c360d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.674887 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-config" (OuterVolumeSpecName: "config") pod "58e69a08-7d61-4b8a-8ff1-6a9bb59c360d" (UID: "58e69a08-7d61-4b8a-8ff1-6a9bb59c360d"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.676435 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "58e69a08-7d61-4b8a-8ff1-6a9bb59c360d" (UID: "58e69a08-7d61-4b8a-8ff1-6a9bb59c360d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.702916 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.702961 4973 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.702971 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.925562 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1a9f-account-create-update-4nplb" event={"ID":"cd964d5b-2edc-41ea-8dc9-3f6e71d9da32","Type":"ContainerDied","Data":"31c4b788532e0791c16e13ebf06c1ff9ccbd6c3067e173303cb96043d7b36656"} Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.925597 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1a9f-account-create-update-4nplb" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.925604 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31c4b788532e0791c16e13ebf06c1ff9ccbd6c3067e173303cb96043d7b36656" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.934771 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-77cf5858cd-9xwk6" event={"ID":"58e69a08-7d61-4b8a-8ff1-6a9bb59c360d","Type":"ContainerDied","Data":"c04c9ea43ebaf35ec4ce831abb2aa5c50e7456dfb24830774b761f1bfce62d71"} Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.934829 4973 scope.go:117] "RemoveContainer" containerID="8185079e7c55881e7a789b73d0c2cde66bafaaa4cb18f01382bb731f9cef3f0a" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.934948 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-77cf5858cd-9xwk6" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.938852 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-eea5-account-create-update-jjhm6" event={"ID":"6bb957cd-c6f9-4602-b4a2-56834928aabb","Type":"ContainerDied","Data":"750dd998754af85127ab54d7ce15ca252cdef384ab46a0db4f48e20ebc12edb6"} Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.939173 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="750dd998754af85127ab54d7ce15ca252cdef384ab46a0db4f48e20ebc12edb6" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.939245 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-eea5-account-create-update-jjhm6" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.947367 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-p4rbx" event={"ID":"3da0c070-14fe-41f9-9b97-f76831f43dbc","Type":"ContainerDied","Data":"b0d9fd22160edb6fdf0154c168e7fb085c118acb16595c6c0d23d68bd0a66a9b"} Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.947411 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0d9fd22160edb6fdf0154c168e7fb085c118acb16595c6c0d23d68bd0a66a9b" Mar 20 13:47:04 crc kubenswrapper[4973]: I0320 13:47:04.947521 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-p4rbx" Mar 20 13:47:05 crc kubenswrapper[4973]: I0320 13:47:05.012591 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-77cf5858cd-9xwk6"] Mar 20 13:47:05 crc kubenswrapper[4973]: I0320 13:47:05.029444 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-77cf5858cd-9xwk6"] Mar 20 13:47:05 crc kubenswrapper[4973]: I0320 13:47:05.330305 4973 scope.go:117] "RemoveContainer" containerID="3ccab10a39d8b83dbf6322bdc5ebddcba8df48beb7400dbf5d483a59cf18bca1" Mar 20 13:47:05 crc kubenswrapper[4973]: I0320 13:47:05.369052 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-xnwtw" Mar 20 13:47:05 crc kubenswrapper[4973]: I0320 13:47:05.532504 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh2xs\" (UniqueName: \"kubernetes.io/projected/abeee3f9-2831-465f-8c3e-9853954f7087-kube-api-access-vh2xs\") pod \"abeee3f9-2831-465f-8c3e-9853954f7087\" (UID: \"abeee3f9-2831-465f-8c3e-9853954f7087\") " Mar 20 13:47:05 crc kubenswrapper[4973]: I0320 13:47:05.532780 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abeee3f9-2831-465f-8c3e-9853954f7087-operator-scripts\") pod \"abeee3f9-2831-465f-8c3e-9853954f7087\" (UID: \"abeee3f9-2831-465f-8c3e-9853954f7087\") " Mar 20 13:47:05 crc kubenswrapper[4973]: I0320 13:47:05.547177 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/abeee3f9-2831-465f-8c3e-9853954f7087-kube-api-access-vh2xs" (OuterVolumeSpecName: "kube-api-access-vh2xs") pod "abeee3f9-2831-465f-8c3e-9853954f7087" (UID: "abeee3f9-2831-465f-8c3e-9853954f7087"). InnerVolumeSpecName "kube-api-access-vh2xs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:05 crc kubenswrapper[4973]: I0320 13:47:05.548063 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/abeee3f9-2831-465f-8c3e-9853954f7087-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "abeee3f9-2831-465f-8c3e-9853954f7087" (UID: "abeee3f9-2831-465f-8c3e-9853954f7087"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:05 crc kubenswrapper[4973]: I0320 13:47:05.637239 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh2xs\" (UniqueName: \"kubernetes.io/projected/abeee3f9-2831-465f-8c3e-9853954f7087-kube-api-access-vh2xs\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:05 crc kubenswrapper[4973]: I0320 13:47:05.637296 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/abeee3f9-2831-465f-8c3e-9853954f7087-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:05 crc kubenswrapper[4973]: I0320 13:47:05.970302 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="58e69a08-7d61-4b8a-8ff1-6a9bb59c360d" path="/var/lib/kubelet/pods/58e69a08-7d61-4b8a-8ff1-6a9bb59c360d/volumes" Mar 20 13:47:05 crc kubenswrapper[4973]: I0320 13:47:05.971930 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-667c56bb7d-c5695" Mar 20 13:47:05 crc kubenswrapper[4973]: I0320 13:47:05.971965 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-b6774478c-cwcjf" Mar 20 13:47:05 crc kubenswrapper[4973]: I0320 13:47:05.971978 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-667c56bb7d-c5695" event={"ID":"07fe89fe-2971-4d79-afd3-4688be48e51c","Type":"ContainerStarted","Data":"67763a411d219ed52e22b2f329b5a6bb85e80d401299e127254bf4f709190947"} Mar 20 13:47:05 crc kubenswrapper[4973]: I0320 13:47:05.971999 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-b6774478c-cwcjf" event={"ID":"7b7eb71d-d134-4d3b-84a0-f14776f4410a","Type":"ContainerStarted","Data":"a87d5793ecd115be5927094ad7abc26581635b4f6c204b6b783553664548e395"} Mar 20 13:47:05 crc kubenswrapper[4973]: I0320 13:47:05.972012 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"71549865-d1f9-44d8-bcf3-0040dcbeda6d","Type":"ContainerStarted","Data":"6390404d783256c13d7d05cba687f7acfc0b5746e5ef4db063fa1c80cc1c123d"} Mar 20 13:47:05 crc kubenswrapper[4973]: I0320 13:47:05.974668 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-xnwtw" event={"ID":"abeee3f9-2831-465f-8c3e-9853954f7087","Type":"ContainerDied","Data":"aedd969c93b35e0312bc8573f2d6a62f6e79adc315f2a571ad23f6bd6e63b47f"} Mar 20 13:47:05 crc kubenswrapper[4973]: I0320 13:47:05.974710 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aedd969c93b35e0312bc8573f2d6a62f6e79adc315f2a571ad23f6bd6e63b47f" Mar 20 13:47:05 crc kubenswrapper[4973]: I0320 13:47:05.974735 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-xnwtw" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.004838 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-667c56bb7d-c5695" podStartSLOduration=3.528226445 podStartE2EDuration="9.004811986s" podCreationTimestamp="2026-03-20 13:46:57 +0000 UTC" firstStartedPulling="2026-03-20 13:46:59.902419253 +0000 UTC m=+1540.646088997" lastFinishedPulling="2026-03-20 13:47:05.379004794 +0000 UTC m=+1546.122674538" observedRunningTime="2026-03-20 13:47:05.986963868 +0000 UTC m=+1546.730633612" watchObservedRunningTime="2026-03-20 13:47:06.004811986 +0000 UTC m=+1546.748481730" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.020993 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-b6774478c-cwcjf" podStartSLOduration=3.420913645 podStartE2EDuration="9.020969107s" podCreationTimestamp="2026-03-20 13:46:57 +0000 UTC" firstStartedPulling="2026-03-20 13:46:59.812539459 +0000 UTC m=+1540.556209203" lastFinishedPulling="2026-03-20 13:47:05.412594921 +0000 UTC m=+1546.156264665" observedRunningTime="2026-03-20 13:47:06.009890655 
+0000 UTC m=+1546.753560399" watchObservedRunningTime="2026-03-20 13:47:06.020969107 +0000 UTC m=+1546.764638851" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.127191 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-76f767b74-8jwpc"] Mar 20 13:47:06 crc kubenswrapper[4973]: E0320 13:47:06.128300 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bf6849d-dd2b-4a3e-be4f-1b00c1826000" containerName="mariadb-database-create" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.128322 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bf6849d-dd2b-4a3e-be4f-1b00c1826000" containerName="mariadb-database-create" Mar 20 13:47:06 crc kubenswrapper[4973]: E0320 13:47:06.128362 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73458433-b92e-4758-84bd-7aba2f23e1e0" containerName="mariadb-account-create-update" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.128373 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="73458433-b92e-4758-84bd-7aba2f23e1e0" containerName="mariadb-account-create-update" Mar 20 13:47:06 crc kubenswrapper[4973]: E0320 13:47:06.128393 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="abeee3f9-2831-465f-8c3e-9853954f7087" containerName="mariadb-database-create" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.128400 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="abeee3f9-2831-465f-8c3e-9853954f7087" containerName="mariadb-database-create" Mar 20 13:47:06 crc kubenswrapper[4973]: E0320 13:47:06.128417 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3da0c070-14fe-41f9-9b97-f76831f43dbc" containerName="mariadb-database-create" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.128424 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="3da0c070-14fe-41f9-9b97-f76831f43dbc" containerName="mariadb-database-create" Mar 20 13:47:06 crc kubenswrapper[4973]: E0320 
13:47:06.128435 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb957cd-c6f9-4602-b4a2-56834928aabb" containerName="mariadb-account-create-update" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.128442 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb957cd-c6f9-4602-b4a2-56834928aabb" containerName="mariadb-account-create-update" Mar 20 13:47:06 crc kubenswrapper[4973]: E0320 13:47:06.128459 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd964d5b-2edc-41ea-8dc9-3f6e71d9da32" containerName="mariadb-account-create-update" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.128466 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd964d5b-2edc-41ea-8dc9-3f6e71d9da32" containerName="mariadb-account-create-update" Mar 20 13:47:06 crc kubenswrapper[4973]: E0320 13:47:06.128492 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e69a08-7d61-4b8a-8ff1-6a9bb59c360d" containerName="neutron-api" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.128500 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e69a08-7d61-4b8a-8ff1-6a9bb59c360d" containerName="neutron-api" Mar 20 13:47:06 crc kubenswrapper[4973]: E0320 13:47:06.128524 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="58e69a08-7d61-4b8a-8ff1-6a9bb59c360d" containerName="neutron-httpd" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.128532 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="58e69a08-7d61-4b8a-8ff1-6a9bb59c360d" containerName="neutron-httpd" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.128785 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd964d5b-2edc-41ea-8dc9-3f6e71d9da32" containerName="mariadb-account-create-update" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.128802 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="58e69a08-7d61-4b8a-8ff1-6a9bb59c360d" 
containerName="neutron-api" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.128813 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="58e69a08-7d61-4b8a-8ff1-6a9bb59c360d" containerName="neutron-httpd" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.128823 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="abeee3f9-2831-465f-8c3e-9853954f7087" containerName="mariadb-database-create" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.128832 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bf6849d-dd2b-4a3e-be4f-1b00c1826000" containerName="mariadb-database-create" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.128845 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="3da0c070-14fe-41f9-9b97-f76831f43dbc" containerName="mariadb-database-create" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.128856 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bb957cd-c6f9-4602-b4a2-56834928aabb" containerName="mariadb-account-create-update" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.128872 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="73458433-b92e-4758-84bd-7aba2f23e1e0" containerName="mariadb-account-create-update" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.129896 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-76f767b74-8jwpc" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.188474 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-76f767b74-8jwpc"] Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.285391 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-77dfb8c95d-ng2gw"] Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.293820 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d53b7a0-4041-437e-8ec2-91013bed7135-config-data-custom\") pod \"heat-engine-76f767b74-8jwpc\" (UID: \"2d53b7a0-4041-437e-8ec2-91013bed7135\") " pod="openstack/heat-engine-76f767b74-8jwpc" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.294023 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d53b7a0-4041-437e-8ec2-91013bed7135-combined-ca-bundle\") pod \"heat-engine-76f767b74-8jwpc\" (UID: \"2d53b7a0-4041-437e-8ec2-91013bed7135\") " pod="openstack/heat-engine-76f767b74-8jwpc" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.294052 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d53b7a0-4041-437e-8ec2-91013bed7135-config-data\") pod \"heat-engine-76f767b74-8jwpc\" (UID: \"2d53b7a0-4041-437e-8ec2-91013bed7135\") " pod="openstack/heat-engine-76f767b74-8jwpc" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.294102 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dj7hr\" (UniqueName: \"kubernetes.io/projected/2d53b7a0-4041-437e-8ec2-91013bed7135-kube-api-access-dj7hr\") pod \"heat-engine-76f767b74-8jwpc\" (UID: \"2d53b7a0-4041-437e-8ec2-91013bed7135\") " 
pod="openstack/heat-engine-76f767b74-8jwpc" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.306847 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-77dfb8c95d-ng2gw" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.345378 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-76bfcb8cc7-vl56f"] Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.347438 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.370875 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-77dfb8c95d-ng2gw"] Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.377441 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76bfcb8cc7-vl56f"] Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.416920 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1a99224-b58e-48c3-aad8-bfa2c175965e-config-data-custom\") pod \"heat-api-77dfb8c95d-ng2gw\" (UID: \"e1a99224-b58e-48c3-aad8-bfa2c175965e\") " pod="openstack/heat-api-77dfb8c95d-ng2gw" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.417002 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a99224-b58e-48c3-aad8-bfa2c175965e-combined-ca-bundle\") pod \"heat-api-77dfb8c95d-ng2gw\" (UID: \"e1a99224-b58e-48c3-aad8-bfa2c175965e\") " pod="openstack/heat-api-77dfb8c95d-ng2gw" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.417033 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1a99224-b58e-48c3-aad8-bfa2c175965e-config-data\") pod 
\"heat-api-77dfb8c95d-ng2gw\" (UID: \"e1a99224-b58e-48c3-aad8-bfa2c175965e\") " pod="openstack/heat-api-77dfb8c95d-ng2gw" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.417088 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-combined-ca-bundle\") pod \"heat-cfnapi-76bfcb8cc7-vl56f\" (UID: \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\") " pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.417109 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-config-data\") pod \"heat-cfnapi-76bfcb8cc7-vl56f\" (UID: \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\") " pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.417155 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d53b7a0-4041-437e-8ec2-91013bed7135-config-data-custom\") pod \"heat-engine-76f767b74-8jwpc\" (UID: \"2d53b7a0-4041-437e-8ec2-91013bed7135\") " pod="openstack/heat-engine-76f767b74-8jwpc" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.417236 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qlkw\" (UniqueName: \"kubernetes.io/projected/e1a99224-b58e-48c3-aad8-bfa2c175965e-kube-api-access-9qlkw\") pod \"heat-api-77dfb8c95d-ng2gw\" (UID: \"e1a99224-b58e-48c3-aad8-bfa2c175965e\") " pod="openstack/heat-api-77dfb8c95d-ng2gw" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.417286 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-config-data-custom\") pod \"heat-cfnapi-76bfcb8cc7-vl56f\" (UID: \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\") " pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.417330 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d53b7a0-4041-437e-8ec2-91013bed7135-combined-ca-bundle\") pod \"heat-engine-76f767b74-8jwpc\" (UID: \"2d53b7a0-4041-437e-8ec2-91013bed7135\") " pod="openstack/heat-engine-76f767b74-8jwpc" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.417382 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d53b7a0-4041-437e-8ec2-91013bed7135-config-data\") pod \"heat-engine-76f767b74-8jwpc\" (UID: \"2d53b7a0-4041-437e-8ec2-91013bed7135\") " pod="openstack/heat-engine-76f767b74-8jwpc" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.417407 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk6hx\" (UniqueName: \"kubernetes.io/projected/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-kube-api-access-lk6hx\") pod \"heat-cfnapi-76bfcb8cc7-vl56f\" (UID: \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\") " pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.417443 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dj7hr\" (UniqueName: \"kubernetes.io/projected/2d53b7a0-4041-437e-8ec2-91013bed7135-kube-api-access-dj7hr\") pod \"heat-engine-76f767b74-8jwpc\" (UID: \"2d53b7a0-4041-437e-8ec2-91013bed7135\") " pod="openstack/heat-engine-76f767b74-8jwpc" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.431229 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/2d53b7a0-4041-437e-8ec2-91013bed7135-config-data-custom\") pod \"heat-engine-76f767b74-8jwpc\" (UID: \"2d53b7a0-4041-437e-8ec2-91013bed7135\") " pod="openstack/heat-engine-76f767b74-8jwpc" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.436533 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d53b7a0-4041-437e-8ec2-91013bed7135-combined-ca-bundle\") pod \"heat-engine-76f767b74-8jwpc\" (UID: \"2d53b7a0-4041-437e-8ec2-91013bed7135\") " pod="openstack/heat-engine-76f767b74-8jwpc" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.448017 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dj7hr\" (UniqueName: \"kubernetes.io/projected/2d53b7a0-4041-437e-8ec2-91013bed7135-kube-api-access-dj7hr\") pod \"heat-engine-76f767b74-8jwpc\" (UID: \"2d53b7a0-4041-437e-8ec2-91013bed7135\") " pod="openstack/heat-engine-76f767b74-8jwpc" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.452647 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d53b7a0-4041-437e-8ec2-91013bed7135-config-data\") pod \"heat-engine-76f767b74-8jwpc\" (UID: \"2d53b7a0-4041-437e-8ec2-91013bed7135\") " pod="openstack/heat-engine-76f767b74-8jwpc" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.513184 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-76f767b74-8jwpc" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.522070 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1a99224-b58e-48c3-aad8-bfa2c175965e-config-data-custom\") pod \"heat-api-77dfb8c95d-ng2gw\" (UID: \"e1a99224-b58e-48c3-aad8-bfa2c175965e\") " pod="openstack/heat-api-77dfb8c95d-ng2gw" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.522109 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a99224-b58e-48c3-aad8-bfa2c175965e-combined-ca-bundle\") pod \"heat-api-77dfb8c95d-ng2gw\" (UID: \"e1a99224-b58e-48c3-aad8-bfa2c175965e\") " pod="openstack/heat-api-77dfb8c95d-ng2gw" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.522131 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1a99224-b58e-48c3-aad8-bfa2c175965e-config-data\") pod \"heat-api-77dfb8c95d-ng2gw\" (UID: \"e1a99224-b58e-48c3-aad8-bfa2c175965e\") " pod="openstack/heat-api-77dfb8c95d-ng2gw" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.522197 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-combined-ca-bundle\") pod \"heat-cfnapi-76bfcb8cc7-vl56f\" (UID: \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\") " pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.522229 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-config-data\") pod \"heat-cfnapi-76bfcb8cc7-vl56f\" (UID: \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\") " pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 
13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.522323 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qlkw\" (UniqueName: \"kubernetes.io/projected/e1a99224-b58e-48c3-aad8-bfa2c175965e-kube-api-access-9qlkw\") pod \"heat-api-77dfb8c95d-ng2gw\" (UID: \"e1a99224-b58e-48c3-aad8-bfa2c175965e\") " pod="openstack/heat-api-77dfb8c95d-ng2gw" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.522384 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-config-data-custom\") pod \"heat-cfnapi-76bfcb8cc7-vl56f\" (UID: \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\") " pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.522421 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk6hx\" (UniqueName: \"kubernetes.io/projected/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-kube-api-access-lk6hx\") pod \"heat-cfnapi-76bfcb8cc7-vl56f\" (UID: \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\") " pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.536174 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a99224-b58e-48c3-aad8-bfa2c175965e-combined-ca-bundle\") pod \"heat-api-77dfb8c95d-ng2gw\" (UID: \"e1a99224-b58e-48c3-aad8-bfa2c175965e\") " pod="openstack/heat-api-77dfb8c95d-ng2gw" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.538466 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-config-data\") pod \"heat-cfnapi-76bfcb8cc7-vl56f\" (UID: \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\") " pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 
13:47:06.540487 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1a99224-b58e-48c3-aad8-bfa2c175965e-config-data\") pod \"heat-api-77dfb8c95d-ng2gw\" (UID: \"e1a99224-b58e-48c3-aad8-bfa2c175965e\") " pod="openstack/heat-api-77dfb8c95d-ng2gw" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.541443 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-combined-ca-bundle\") pod \"heat-cfnapi-76bfcb8cc7-vl56f\" (UID: \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\") " pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.554305 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-config-data-custom\") pod \"heat-cfnapi-76bfcb8cc7-vl56f\" (UID: \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\") " pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.581428 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1a99224-b58e-48c3-aad8-bfa2c175965e-config-data-custom\") pod \"heat-api-77dfb8c95d-ng2gw\" (UID: \"e1a99224-b58e-48c3-aad8-bfa2c175965e\") " pod="openstack/heat-api-77dfb8c95d-ng2gw" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.590280 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qlkw\" (UniqueName: \"kubernetes.io/projected/e1a99224-b58e-48c3-aad8-bfa2c175965e-kube-api-access-9qlkw\") pod \"heat-api-77dfb8c95d-ng2gw\" (UID: \"e1a99224-b58e-48c3-aad8-bfa2c175965e\") " pod="openstack/heat-api-77dfb8c95d-ng2gw" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.605356 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lk6hx\" (UniqueName: \"kubernetes.io/projected/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-kube-api-access-lk6hx\") pod \"heat-cfnapi-76bfcb8cc7-vl56f\" (UID: \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\") " pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.618637 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cl4pf"] Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.625729 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cl4pf" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.626842 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.632301 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.632613 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.632796 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qgmk5" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.633652 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/066916b9-4270-42bd-bcd1-3fd26bd65a9e-scripts\") pod \"nova-cell0-conductor-db-sync-cl4pf\" (UID: \"066916b9-4270-42bd-bcd1-3fd26bd65a9e\") " pod="openstack/nova-cell0-conductor-db-sync-cl4pf" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.633728 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/066916b9-4270-42bd-bcd1-3fd26bd65a9e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cl4pf\" (UID: \"066916b9-4270-42bd-bcd1-3fd26bd65a9e\") " pod="openstack/nova-cell0-conductor-db-sync-cl4pf" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.634287 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkv9n\" (UniqueName: \"kubernetes.io/projected/066916b9-4270-42bd-bcd1-3fd26bd65a9e-kube-api-access-hkv9n\") pod \"nova-cell0-conductor-db-sync-cl4pf\" (UID: \"066916b9-4270-42bd-bcd1-3fd26bd65a9e\") " pod="openstack/nova-cell0-conductor-db-sync-cl4pf" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.634363 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/066916b9-4270-42bd-bcd1-3fd26bd65a9e-config-data\") pod \"nova-cell0-conductor-db-sync-cl4pf\" (UID: \"066916b9-4270-42bd-bcd1-3fd26bd65a9e\") " pod="openstack/nova-cell0-conductor-db-sync-cl4pf" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.688382 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cl4pf"] Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.741527 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/066916b9-4270-42bd-bcd1-3fd26bd65a9e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cl4pf\" (UID: \"066916b9-4270-42bd-bcd1-3fd26bd65a9e\") " pod="openstack/nova-cell0-conductor-db-sync-cl4pf" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.741981 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkv9n\" (UniqueName: \"kubernetes.io/projected/066916b9-4270-42bd-bcd1-3fd26bd65a9e-kube-api-access-hkv9n\") pod \"nova-cell0-conductor-db-sync-cl4pf\" (UID: 
\"066916b9-4270-42bd-bcd1-3fd26bd65a9e\") " pod="openstack/nova-cell0-conductor-db-sync-cl4pf" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.742064 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/066916b9-4270-42bd-bcd1-3fd26bd65a9e-config-data\") pod \"nova-cell0-conductor-db-sync-cl4pf\" (UID: \"066916b9-4270-42bd-bcd1-3fd26bd65a9e\") " pod="openstack/nova-cell0-conductor-db-sync-cl4pf" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.742298 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/066916b9-4270-42bd-bcd1-3fd26bd65a9e-scripts\") pod \"nova-cell0-conductor-db-sync-cl4pf\" (UID: \"066916b9-4270-42bd-bcd1-3fd26bd65a9e\") " pod="openstack/nova-cell0-conductor-db-sync-cl4pf" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.759615 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/066916b9-4270-42bd-bcd1-3fd26bd65a9e-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-cl4pf\" (UID: \"066916b9-4270-42bd-bcd1-3fd26bd65a9e\") " pod="openstack/nova-cell0-conductor-db-sync-cl4pf" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.760240 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/066916b9-4270-42bd-bcd1-3fd26bd65a9e-config-data\") pod \"nova-cell0-conductor-db-sync-cl4pf\" (UID: \"066916b9-4270-42bd-bcd1-3fd26bd65a9e\") " pod="openstack/nova-cell0-conductor-db-sync-cl4pf" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.764262 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/066916b9-4270-42bd-bcd1-3fd26bd65a9e-scripts\") pod \"nova-cell0-conductor-db-sync-cl4pf\" (UID: \"066916b9-4270-42bd-bcd1-3fd26bd65a9e\") " 
pod="openstack/nova-cell0-conductor-db-sync-cl4pf" Mar 20 13:47:06 crc kubenswrapper[4973]: I0320 13:47:06.796983 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkv9n\" (UniqueName: \"kubernetes.io/projected/066916b9-4270-42bd-bcd1-3fd26bd65a9e-kube-api-access-hkv9n\") pod \"nova-cell0-conductor-db-sync-cl4pf\" (UID: \"066916b9-4270-42bd-bcd1-3fd26bd65a9e\") " pod="openstack/nova-cell0-conductor-db-sync-cl4pf" Mar 20 13:47:08 crc kubenswrapper[4973]: I0320 13:47:06.862964 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-77dfb8c95d-ng2gw" Mar 20 13:47:08 crc kubenswrapper[4973]: I0320 13:47:07.003811 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cl4pf" Mar 20 13:47:08 crc kubenswrapper[4973]: I0320 13:47:08.045197 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71549865-d1f9-44d8-bcf3-0040dcbeda6d","Type":"ContainerStarted","Data":"2ab4f15458aa59237ac21bb96ce22d56800a13fc4be116660c5a8212e6c53074"} Mar 20 13:47:08 crc kubenswrapper[4973]: I0320 13:47:08.138497 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:47:08 crc kubenswrapper[4973]: I0320 13:47:08.240036 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-jq7fk"] Mar 20 13:47:08 crc kubenswrapper[4973]: I0320 13:47:08.240309 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" podUID="9321b726-0c2c-4c9c-a40a-73a387dfb215" containerName="dnsmasq-dns" containerID="cri-o://1ed5313e377ebf010922dc90d59e59d3a8d03018a545e32201c9e941fbefd522" gracePeriod=10 Mar 20 13:47:08 crc kubenswrapper[4973]: I0320 13:47:08.754737 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/swift-proxy-849bfc9c67-jsf5j" Mar 20 13:47:08 crc kubenswrapper[4973]: I0320 13:47:08.756853 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-849bfc9c67-jsf5j" Mar 20 13:47:08 crc kubenswrapper[4973]: I0320 13:47:08.843296 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-76f767b74-8jwpc"] Mar 20 13:47:08 crc kubenswrapper[4973]: I0320 13:47:08.967671 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-77dfb8c95d-ng2gw"] Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.094118 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-76bfcb8cc7-vl56f"] Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.127468 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cl4pf"] Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.133747 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-77dfb8c95d-ng2gw" event={"ID":"e1a99224-b58e-48c3-aad8-bfa2c175965e","Type":"ContainerStarted","Data":"5c92abde5ce097b40aa1daf84507586d026e34b5df8698ea522deef53916e18f"} Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.149887 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" event={"ID":"d852eeb5-beb4-4410-beaa-34cb9d4da7e0","Type":"ContainerStarted","Data":"21fa65abd650c55d9471b7c25cc4387b10d90a37f09c82a8f09e3f216ed37ab9"} Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.177203 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cl4pf" event={"ID":"066916b9-4270-42bd-bcd1-3fd26bd65a9e","Type":"ContainerStarted","Data":"13ee6dec8cabcd30928c6973a70b5dce9b0c8a93b142bb7c5c882b98e361d50b"} Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.189387 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-76f767b74-8jwpc" 
event={"ID":"2d53b7a0-4041-437e-8ec2-91013bed7135","Type":"ContainerStarted","Data":"604bce4c7a9ef7fe4e0aaf7871ab146874e65a0659e6558fdabc7e719800e2bb"} Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.203995 4973 generic.go:334] "Generic (PLEG): container finished" podID="9321b726-0c2c-4c9c-a40a-73a387dfb215" containerID="1ed5313e377ebf010922dc90d59e59d3a8d03018a545e32201c9e941fbefd522" exitCode=0 Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.204058 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" event={"ID":"9321b726-0c2c-4c9c-a40a-73a387dfb215","Type":"ContainerDied","Data":"1ed5313e377ebf010922dc90d59e59d3a8d03018a545e32201c9e941fbefd522"} Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.291604 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.345009 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-config\") pod \"9321b726-0c2c-4c9c-a40a-73a387dfb215\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.345513 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-ovsdbserver-nb\") pod \"9321b726-0c2c-4c9c-a40a-73a387dfb215\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.345566 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-dns-svc\") pod \"9321b726-0c2c-4c9c-a40a-73a387dfb215\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 
13:47:09.345610 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-dns-swift-storage-0\") pod \"9321b726-0c2c-4c9c-a40a-73a387dfb215\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.345685 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqvgc\" (UniqueName: \"kubernetes.io/projected/9321b726-0c2c-4c9c-a40a-73a387dfb215-kube-api-access-sqvgc\") pod \"9321b726-0c2c-4c9c-a40a-73a387dfb215\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.345858 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-ovsdbserver-sb\") pod \"9321b726-0c2c-4c9c-a40a-73a387dfb215\" (UID: \"9321b726-0c2c-4c9c-a40a-73a387dfb215\") " Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.386690 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9321b726-0c2c-4c9c-a40a-73a387dfb215-kube-api-access-sqvgc" (OuterVolumeSpecName: "kube-api-access-sqvgc") pod "9321b726-0c2c-4c9c-a40a-73a387dfb215" (UID: "9321b726-0c2c-4c9c-a40a-73a387dfb215"). InnerVolumeSpecName "kube-api-access-sqvgc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.450964 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sqvgc\" (UniqueName: \"kubernetes.io/projected/9321b726-0c2c-4c9c-a40a-73a387dfb215-kube-api-access-sqvgc\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.567211 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-config" (OuterVolumeSpecName: "config") pod "9321b726-0c2c-4c9c-a40a-73a387dfb215" (UID: "9321b726-0c2c-4c9c-a40a-73a387dfb215"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.657413 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.850087 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9321b726-0c2c-4c9c-a40a-73a387dfb215" (UID: "9321b726-0c2c-4c9c-a40a-73a387dfb215"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.862832 4973 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.897279 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9321b726-0c2c-4c9c-a40a-73a387dfb215" (UID: "9321b726-0c2c-4c9c-a40a-73a387dfb215"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.965104 4973 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:09 crc kubenswrapper[4973]: I0320 13:47:09.972754 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9321b726-0c2c-4c9c-a40a-73a387dfb215" (UID: "9321b726-0c2c-4c9c-a40a-73a387dfb215"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:10 crc kubenswrapper[4973]: I0320 13:47:10.025383 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9321b726-0c2c-4c9c-a40a-73a387dfb215" (UID: "9321b726-0c2c-4c9c-a40a-73a387dfb215"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:47:10 crc kubenswrapper[4973]: I0320 13:47:10.071316 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:10 crc kubenswrapper[4973]: I0320 13:47:10.071359 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9321b726-0c2c-4c9c-a40a-73a387dfb215-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:10 crc kubenswrapper[4973]: I0320 13:47:10.227745 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" event={"ID":"9321b726-0c2c-4c9c-a40a-73a387dfb215","Type":"ContainerDied","Data":"6df5d421a858e6250e1988e8e3f9ed867042f5344c40fd5e20efe26bb22f1f1d"} Mar 20 13:47:10 crc kubenswrapper[4973]: I0320 13:47:10.227855 4973 scope.go:117] "RemoveContainer" containerID="1ed5313e377ebf010922dc90d59e59d3a8d03018a545e32201c9e941fbefd522" Mar 20 13:47:10 crc kubenswrapper[4973]: I0320 13:47:10.228060 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5784cf869f-jq7fk" Mar 20 13:47:10 crc kubenswrapper[4973]: I0320 13:47:10.236378 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-77dfb8c95d-ng2gw" event={"ID":"e1a99224-b58e-48c3-aad8-bfa2c175965e","Type":"ContainerStarted","Data":"ee863932cdea29554c74f414667cf771bb7c45379f8e648e646d769cd4a1cc90"} Mar 20 13:47:10 crc kubenswrapper[4973]: I0320 13:47:10.238150 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-77dfb8c95d-ng2gw" Mar 20 13:47:10 crc kubenswrapper[4973]: I0320 13:47:10.275294 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-77dfb8c95d-ng2gw" podStartSLOduration=4.275272556 podStartE2EDuration="4.275272556s" podCreationTimestamp="2026-03-20 13:47:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:10.268372658 +0000 UTC m=+1551.012042402" watchObservedRunningTime="2026-03-20 13:47:10.275272556 +0000 UTC m=+1551.018942300" Mar 20 13:47:10 crc kubenswrapper[4973]: I0320 13:47:10.343987 4973 scope.go:117] "RemoveContainer" containerID="dbc121b549781d39bb93faa395feca0058a205d4f236dd0ec771e4838ca1e94a" Mar 20 13:47:10 crc kubenswrapper[4973]: I0320 13:47:10.371157 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-jq7fk"] Mar 20 13:47:10 crc kubenswrapper[4973]: I0320 13:47:10.386756 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5784cf869f-jq7fk"] Mar 20 13:47:11 crc kubenswrapper[4973]: I0320 13:47:11.259802 4973 generic.go:334] "Generic (PLEG): container finished" podID="d852eeb5-beb4-4410-beaa-34cb9d4da7e0" containerID="157f3de86d2659078ad01d703508c646351718d13026d29dfc70c67ebd8a96f2" exitCode=1 Mar 20 13:47:11 crc kubenswrapper[4973]: I0320 13:47:11.259925 4973 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" event={"ID":"d852eeb5-beb4-4410-beaa-34cb9d4da7e0","Type":"ContainerDied","Data":"157f3de86d2659078ad01d703508c646351718d13026d29dfc70c67ebd8a96f2"} Mar 20 13:47:11 crc kubenswrapper[4973]: I0320 13:47:11.260652 4973 scope.go:117] "RemoveContainer" containerID="157f3de86d2659078ad01d703508c646351718d13026d29dfc70c67ebd8a96f2" Mar 20 13:47:11 crc kubenswrapper[4973]: I0320 13:47:11.268274 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-76f767b74-8jwpc" event={"ID":"2d53b7a0-4041-437e-8ec2-91013bed7135","Type":"ContainerStarted","Data":"a9f9968fdfaa6970d67f09742b94586bb90e47569f37a3792ef583b10bca3ec9"} Mar 20 13:47:11 crc kubenswrapper[4973]: I0320 13:47:11.269050 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-76f767b74-8jwpc" Mar 20 13:47:11 crc kubenswrapper[4973]: I0320 13:47:11.294284 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71549865-d1f9-44d8-bcf3-0040dcbeda6d","Type":"ContainerStarted","Data":"f770a706d70ed325607eb9c0fe5be1d7078a64625b059747da55740b786c66fb"} Mar 20 13:47:11 crc kubenswrapper[4973]: I0320 13:47:11.295569 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:47:11 crc kubenswrapper[4973]: I0320 13:47:11.301390 4973 generic.go:334] "Generic (PLEG): container finished" podID="e1a99224-b58e-48c3-aad8-bfa2c175965e" containerID="ee863932cdea29554c74f414667cf771bb7c45379f8e648e646d769cd4a1cc90" exitCode=1 Mar 20 13:47:11 crc kubenswrapper[4973]: I0320 13:47:11.301434 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-77dfb8c95d-ng2gw" event={"ID":"e1a99224-b58e-48c3-aad8-bfa2c175965e","Type":"ContainerDied","Data":"ee863932cdea29554c74f414667cf771bb7c45379f8e648e646d769cd4a1cc90"} Mar 20 13:47:11 crc kubenswrapper[4973]: I0320 13:47:11.302525 4973 scope.go:117] "RemoveContainer" 
containerID="ee863932cdea29554c74f414667cf771bb7c45379f8e648e646d769cd4a1cc90" Mar 20 13:47:11 crc kubenswrapper[4973]: I0320 13:47:11.335499 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-76f767b74-8jwpc" podStartSLOduration=5.335474414 podStartE2EDuration="5.335474414s" podCreationTimestamp="2026-03-20 13:47:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:11.296892482 +0000 UTC m=+1552.040562226" watchObservedRunningTime="2026-03-20 13:47:11.335474414 +0000 UTC m=+1552.079144158" Mar 20 13:47:11 crc kubenswrapper[4973]: I0320 13:47:11.338686 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.251021608 podStartE2EDuration="13.338678522s" podCreationTimestamp="2026-03-20 13:46:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:00.571794224 +0000 UTC m=+1541.315463968" lastFinishedPulling="2026-03-20 13:47:09.659451138 +0000 UTC m=+1550.403120882" observedRunningTime="2026-03-20 13:47:11.330936331 +0000 UTC m=+1552.074606075" watchObservedRunningTime="2026-03-20 13:47:11.338678522 +0000 UTC m=+1552.082348266" Mar 20 13:47:11 crc kubenswrapper[4973]: I0320 13:47:11.628185 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 13:47:11 crc kubenswrapper[4973]: I0320 13:47:11.628234 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 13:47:11 crc kubenswrapper[4973]: I0320 13:47:11.863568 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-77dfb8c95d-ng2gw" Mar 20 13:47:11 crc kubenswrapper[4973]: I0320 13:47:11.974116 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9321b726-0c2c-4c9c-a40a-73a387dfb215" 
path="/var/lib/kubelet/pods/9321b726-0c2c-4c9c-a40a-73a387dfb215/volumes" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.077238 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-b6774478c-cwcjf"] Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.077572 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-b6774478c-cwcjf" podUID="7b7eb71d-d134-4d3b-84a0-f14776f4410a" containerName="heat-cfnapi" containerID="cri-o://a87d5793ecd115be5927094ad7abc26581635b4f6c204b6b783553664548e395" gracePeriod=60 Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.106183 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-667c56bb7d-c5695"] Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.106505 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-667c56bb7d-c5695" podUID="07fe89fe-2971-4d79-afd3-4688be48e51c" containerName="heat-api" containerID="cri-o://67763a411d219ed52e22b2f329b5a6bb85e80d401299e127254bf4f709190947" gracePeriod=60 Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.126406 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-b78ff9797-ms9rv"] Mar 20 13:47:12 crc kubenswrapper[4973]: E0320 13:47:12.127019 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9321b726-0c2c-4c9c-a40a-73a387dfb215" containerName="dnsmasq-dns" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.127042 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="9321b726-0c2c-4c9c-a40a-73a387dfb215" containerName="dnsmasq-dns" Mar 20 13:47:12 crc kubenswrapper[4973]: E0320 13:47:12.127076 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9321b726-0c2c-4c9c-a40a-73a387dfb215" containerName="init" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.127086 4973 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9321b726-0c2c-4c9c-a40a-73a387dfb215" containerName="init" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.127332 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="9321b726-0c2c-4c9c-a40a-73a387dfb215" containerName="dnsmasq-dns" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.128208 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.138647 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-public-svc" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.138998 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-cfnapi-internal-svc" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.140413 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-6558cfd5cd-bptc4"] Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.140987 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-public-tls-certs\") pod \"heat-cfnapi-b78ff9797-ms9rv\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.141026 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-config-data-custom\") pod \"heat-cfnapi-b78ff9797-ms9rv\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.141203 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9s6k\" (UniqueName: 
\"kubernetes.io/projected/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-kube-api-access-j9s6k\") pod \"heat-cfnapi-b78ff9797-ms9rv\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.141236 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-combined-ca-bundle\") pod \"heat-cfnapi-b78ff9797-ms9rv\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.141325 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-config-data\") pod \"heat-cfnapi-b78ff9797-ms9rv\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.141512 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-internal-tls-certs\") pod \"heat-cfnapi-b78ff9797-ms9rv\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.142070 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.161621 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-b6774478c-cwcjf" podUID="7b7eb71d-d134-4d3b-84a0-f14776f4410a" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.227:8000/healthcheck\": EOF" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.163557 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-public-svc" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.163656 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-heat-api-internal-svc" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.183833 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-b78ff9797-ms9rv"] Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.251039 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-config-data-custom\") pod \"heat-api-6558cfd5cd-bptc4\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.251101 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-config-data\") pod \"heat-cfnapi-b78ff9797-ms9rv\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.251320 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-public-tls-certs\") pod \"heat-api-6558cfd5cd-bptc4\" (UID: 
\"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.251436 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-internal-tls-certs\") pod \"heat-cfnapi-b78ff9797-ms9rv\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.251468 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-public-tls-certs\") pod \"heat-cfnapi-b78ff9797-ms9rv\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.251488 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-config-data-custom\") pod \"heat-cfnapi-b78ff9797-ms9rv\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.251521 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-config-data\") pod \"heat-api-6558cfd5cd-bptc4\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.251784 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkslb\" (UniqueName: \"kubernetes.io/projected/3fd39b11-95a1-493b-afd5-6469bd8ee321-kube-api-access-qkslb\") pod \"heat-api-6558cfd5cd-bptc4\" (UID: 
\"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.252035 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9s6k\" (UniqueName: \"kubernetes.io/projected/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-kube-api-access-j9s6k\") pod \"heat-cfnapi-b78ff9797-ms9rv\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.252144 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-combined-ca-bundle\") pod \"heat-cfnapi-b78ff9797-ms9rv\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.252291 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-combined-ca-bundle\") pod \"heat-api-6558cfd5cd-bptc4\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.252426 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-internal-tls-certs\") pod \"heat-api-6558cfd5cd-bptc4\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.270677 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6558cfd5cd-bptc4"] Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.279149 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-public-tls-certs\") pod \"heat-cfnapi-b78ff9797-ms9rv\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.282652 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-config-data\") pod \"heat-cfnapi-b78ff9797-ms9rv\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.285216 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-internal-tls-certs\") pod \"heat-cfnapi-b78ff9797-ms9rv\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.302186 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-config-data-custom\") pod \"heat-cfnapi-b78ff9797-ms9rv\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.304045 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-combined-ca-bundle\") pod \"heat-cfnapi-b78ff9797-ms9rv\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.304704 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9s6k\" (UniqueName: 
\"kubernetes.io/projected/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-kube-api-access-j9s6k\") pod \"heat-cfnapi-b78ff9797-ms9rv\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.362703 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-config-data\") pod \"heat-api-6558cfd5cd-bptc4\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.366127 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkslb\" (UniqueName: \"kubernetes.io/projected/3fd39b11-95a1-493b-afd5-6469bd8ee321-kube-api-access-qkslb\") pod \"heat-api-6558cfd5cd-bptc4\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.366531 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-combined-ca-bundle\") pod \"heat-api-6558cfd5cd-bptc4\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.366579 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-internal-tls-certs\") pod \"heat-api-6558cfd5cd-bptc4\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.368871 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-config-data-custom\") pod \"heat-api-6558cfd5cd-bptc4\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.369191 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-public-tls-certs\") pod \"heat-api-6558cfd5cd-bptc4\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.379876 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-config-data-custom\") pod \"heat-api-6558cfd5cd-bptc4\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.380386 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-public-tls-certs\") pod \"heat-api-6558cfd5cd-bptc4\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.397681 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-internal-tls-certs\") pod \"heat-api-6558cfd5cd-bptc4\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.397657 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-combined-ca-bundle\") pod 
\"heat-api-6558cfd5cd-bptc4\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.400362 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkslb\" (UniqueName: \"kubernetes.io/projected/3fd39b11-95a1-493b-afd5-6469bd8ee321-kube-api-access-qkslb\") pod \"heat-api-6558cfd5cd-bptc4\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.403848 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-config-data\") pod \"heat-api-6558cfd5cd-bptc4\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.542012 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.542137 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:12 crc kubenswrapper[4973]: I0320 13:47:12.606844 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 20 13:47:13 crc kubenswrapper[4973]: I0320 13:47:13.239614 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-b78ff9797-ms9rv"] Mar 20 13:47:13 crc kubenswrapper[4973]: I0320 13:47:13.321165 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:47:13 crc kubenswrapper[4973]: I0320 13:47:13.321289 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:47:13 crc kubenswrapper[4973]: I0320 13:47:13.342058 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" event={"ID":"d852eeb5-beb4-4410-beaa-34cb9d4da7e0","Type":"ContainerStarted","Data":"d7d46a1ce93c90ca0cb917da6f1d57303662cf449344a7406994d65b1bfcac8b"} Mar 20 13:47:13 crc kubenswrapper[4973]: I0320 13:47:13.345107 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 13:47:13 crc kubenswrapper[4973]: I0320 13:47:13.350606 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-b78ff9797-ms9rv" event={"ID":"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c","Type":"ContainerStarted","Data":"cebe68b56c8bca1a398cdad49ccb297f8e772fa7a220528d21272387b928173d"} Mar 20 13:47:13 crc kubenswrapper[4973]: I0320 
13:47:13.382570 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-77dfb8c95d-ng2gw" event={"ID":"e1a99224-b58e-48c3-aad8-bfa2c175965e","Type":"ContainerStarted","Data":"36af5976952feb7b2ad97b2689fa020238de3f6e61ca838c28cb0cba67ad7132"} Mar 20 13:47:13 crc kubenswrapper[4973]: I0320 13:47:13.383359 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-77dfb8c95d-ng2gw" Mar 20 13:47:13 crc kubenswrapper[4973]: I0320 13:47:13.383945 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" podStartSLOduration=7.383912045 podStartE2EDuration="7.383912045s" podCreationTimestamp="2026-03-20 13:47:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:13.372243967 +0000 UTC m=+1554.115913731" watchObservedRunningTime="2026-03-20 13:47:13.383912045 +0000 UTC m=+1554.127581859" Mar 20 13:47:13 crc kubenswrapper[4973]: I0320 13:47:13.485591 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-6558cfd5cd-bptc4"] Mar 20 13:47:14 crc kubenswrapper[4973]: I0320 13:47:14.418768 4973 generic.go:334] "Generic (PLEG): container finished" podID="e1a99224-b58e-48c3-aad8-bfa2c175965e" containerID="36af5976952feb7b2ad97b2689fa020238de3f6e61ca838c28cb0cba67ad7132" exitCode=1 Mar 20 13:47:14 crc kubenswrapper[4973]: I0320 13:47:14.419042 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-77dfb8c95d-ng2gw" event={"ID":"e1a99224-b58e-48c3-aad8-bfa2c175965e","Type":"ContainerDied","Data":"36af5976952feb7b2ad97b2689fa020238de3f6e61ca838c28cb0cba67ad7132"} Mar 20 13:47:14 crc kubenswrapper[4973]: I0320 13:47:14.419359 4973 scope.go:117] "RemoveContainer" containerID="ee863932cdea29554c74f414667cf771bb7c45379f8e648e646d769cd4a1cc90" Mar 20 13:47:14 crc kubenswrapper[4973]: I0320 13:47:14.420681 4973 scope.go:117] 
"RemoveContainer" containerID="36af5976952feb7b2ad97b2689fa020238de3f6e61ca838c28cb0cba67ad7132" Mar 20 13:47:14 crc kubenswrapper[4973]: E0320 13:47:14.421140 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-77dfb8c95d-ng2gw_openstack(e1a99224-b58e-48c3-aad8-bfa2c175965e)\"" pod="openstack/heat-api-77dfb8c95d-ng2gw" podUID="e1a99224-b58e-48c3-aad8-bfa2c175965e" Mar 20 13:47:14 crc kubenswrapper[4973]: I0320 13:47:14.422717 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6558cfd5cd-bptc4" event={"ID":"3fd39b11-95a1-493b-afd5-6469bd8ee321","Type":"ContainerStarted","Data":"5fd3a24fa902cdc7db2d0123e6cc1a28435afafe8ff66817e12cfb41d15950f8"} Mar 20 13:47:14 crc kubenswrapper[4973]: I0320 13:47:14.422751 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6558cfd5cd-bptc4" event={"ID":"3fd39b11-95a1-493b-afd5-6469bd8ee321","Type":"ContainerStarted","Data":"8acf94ebd28196f244271c82a490d19fbf92f2d30b9bf5fd1b0186b18421d224"} Mar 20 13:47:14 crc kubenswrapper[4973]: I0320 13:47:14.424203 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:47:14 crc kubenswrapper[4973]: I0320 13:47:14.467048 4973 generic.go:334] "Generic (PLEG): container finished" podID="d852eeb5-beb4-4410-beaa-34cb9d4da7e0" containerID="d7d46a1ce93c90ca0cb917da6f1d57303662cf449344a7406994d65b1bfcac8b" exitCode=1 Mar 20 13:47:14 crc kubenswrapper[4973]: I0320 13:47:14.467153 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" event={"ID":"d852eeb5-beb4-4410-beaa-34cb9d4da7e0","Type":"ContainerDied","Data":"d7d46a1ce93c90ca0cb917da6f1d57303662cf449344a7406994d65b1bfcac8b"} Mar 20 13:47:14 crc kubenswrapper[4973]: I0320 13:47:14.468062 4973 scope.go:117] "RemoveContainer" 
containerID="d7d46a1ce93c90ca0cb917da6f1d57303662cf449344a7406994d65b1bfcac8b" Mar 20 13:47:14 crc kubenswrapper[4973]: E0320 13:47:14.468420 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-76bfcb8cc7-vl56f_openstack(d852eeb5-beb4-4410-beaa-34cb9d4da7e0)\"" pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" podUID="d852eeb5-beb4-4410-beaa-34cb9d4da7e0" Mar 20 13:47:14 crc kubenswrapper[4973]: I0320 13:47:14.479316 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-b78ff9797-ms9rv" event={"ID":"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c","Type":"ContainerStarted","Data":"d3995aa4396af1f1e32565ada9c9f146370fb46167efc8a4bdbda0a96f007ede"} Mar 20 13:47:14 crc kubenswrapper[4973]: I0320 13:47:14.479560 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:14 crc kubenswrapper[4973]: I0320 13:47:14.515470 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-6558cfd5cd-bptc4" podStartSLOduration=2.51544505 podStartE2EDuration="2.51544505s" podCreationTimestamp="2026-03-20 13:47:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:14.487796496 +0000 UTC m=+1555.231466240" watchObservedRunningTime="2026-03-20 13:47:14.51544505 +0000 UTC m=+1555.259114784" Mar 20 13:47:14 crc kubenswrapper[4973]: I0320 13:47:14.553173 4973 scope.go:117] "RemoveContainer" containerID="157f3de86d2659078ad01d703508c646351718d13026d29dfc70c67ebd8a96f2" Mar 20 13:47:14 crc kubenswrapper[4973]: I0320 13:47:14.573286 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-b78ff9797-ms9rv" podStartSLOduration=2.573259569 podStartE2EDuration="2.573259569s" 
podCreationTimestamp="2026-03-20 13:47:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:14.545945083 +0000 UTC m=+1555.289614827" watchObservedRunningTime="2026-03-20 13:47:14.573259569 +0000 UTC m=+1555.316929313" Mar 20 13:47:15 crc kubenswrapper[4973]: I0320 13:47:15.499690 4973 scope.go:117] "RemoveContainer" containerID="36af5976952feb7b2ad97b2689fa020238de3f6e61ca838c28cb0cba67ad7132" Mar 20 13:47:15 crc kubenswrapper[4973]: E0320 13:47:15.500454 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-77dfb8c95d-ng2gw_openstack(e1a99224-b58e-48c3-aad8-bfa2c175965e)\"" pod="openstack/heat-api-77dfb8c95d-ng2gw" podUID="e1a99224-b58e-48c3-aad8-bfa2c175965e" Mar 20 13:47:15 crc kubenswrapper[4973]: I0320 13:47:15.507115 4973 scope.go:117] "RemoveContainer" containerID="d7d46a1ce93c90ca0cb917da6f1d57303662cf449344a7406994d65b1bfcac8b" Mar 20 13:47:15 crc kubenswrapper[4973]: E0320 13:47:15.507399 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-76bfcb8cc7-vl56f_openstack(d852eeb5-beb4-4410-beaa-34cb9d4da7e0)\"" pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" podUID="d852eeb5-beb4-4410-beaa-34cb9d4da7e0" Mar 20 13:47:15 crc kubenswrapper[4973]: I0320 13:47:15.514253 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-667c56bb7d-c5695" Mar 20 13:47:16 crc kubenswrapper[4973]: I0320 13:47:16.627406 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 13:47:16 crc kubenswrapper[4973]: I0320 13:47:16.628577 4973 scope.go:117] "RemoveContainer" 
containerID="d7d46a1ce93c90ca0cb917da6f1d57303662cf449344a7406994d65b1bfcac8b" Mar 20 13:47:16 crc kubenswrapper[4973]: E0320 13:47:16.628815 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-cfnapi\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-cfnapi pod=heat-cfnapi-76bfcb8cc7-vl56f_openstack(d852eeb5-beb4-4410-beaa-34cb9d4da7e0)\"" pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" podUID="d852eeb5-beb4-4410-beaa-34cb9d4da7e0" Mar 20 13:47:16 crc kubenswrapper[4973]: I0320 13:47:16.863692 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/heat-api-77dfb8c95d-ng2gw" Mar 20 13:47:16 crc kubenswrapper[4973]: I0320 13:47:16.864777 4973 scope.go:117] "RemoveContainer" containerID="36af5976952feb7b2ad97b2689fa020238de3f6e61ca838c28cb0cba67ad7132" Mar 20 13:47:16 crc kubenswrapper[4973]: E0320 13:47:16.865087 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=heat-api pod=heat-api-77dfb8c95d-ng2gw_openstack(e1a99224-b58e-48c3-aad8-bfa2c175965e)\"" pod="openstack/heat-api-77dfb8c95d-ng2gw" podUID="e1a99224-b58e-48c3-aad8-bfa2c175965e" Mar 20 13:47:17 crc kubenswrapper[4973]: I0320 13:47:17.661284 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:47:17 crc kubenswrapper[4973]: I0320 13:47:17.661613 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" containerName="ceilometer-central-agent" containerID="cri-o://7a4c18d5e3bed2bd1ec2cf7ccdbcbe4f2fe7b32f78b93a9524867ab2eb2c4944" gracePeriod=30 Mar 20 13:47:17 crc kubenswrapper[4973]: I0320 13:47:17.661757 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" 
containerName="proxy-httpd" containerID="cri-o://f770a706d70ed325607eb9c0fe5be1d7078a64625b059747da55740b786c66fb" gracePeriod=30
Mar 20 13:47:17 crc kubenswrapper[4973]: I0320 13:47:17.661792 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" containerName="sg-core" containerID="cri-o://2ab4f15458aa59237ac21bb96ce22d56800a13fc4be116660c5a8212e6c53074" gracePeriod=30
Mar 20 13:47:17 crc kubenswrapper[4973]: I0320 13:47:17.661781 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" containerName="ceilometer-notification-agent" containerID="cri-o://6390404d783256c13d7d05cba687f7acfc0b5746e5ef4db063fa1c80cc1c123d" gracePeriod=30
Mar 20 13:47:18 crc kubenswrapper[4973]: I0320 13:47:18.126785 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6cc7bcd8df-hpkc4"
Mar 20 13:47:18 crc kubenswrapper[4973]: I0320 13:47:18.583784 4973 generic.go:334] "Generic (PLEG): container finished" podID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" containerID="f770a706d70ed325607eb9c0fe5be1d7078a64625b059747da55740b786c66fb" exitCode=0
Mar 20 13:47:18 crc kubenswrapper[4973]: I0320 13:47:18.583840 4973 generic.go:334] "Generic (PLEG): container finished" podID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" containerID="2ab4f15458aa59237ac21bb96ce22d56800a13fc4be116660c5a8212e6c53074" exitCode=2
Mar 20 13:47:18 crc kubenswrapper[4973]: I0320 13:47:18.583852 4973 generic.go:334] "Generic (PLEG): container finished" podID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" containerID="6390404d783256c13d7d05cba687f7acfc0b5746e5ef4db063fa1c80cc1c123d" exitCode=0
Mar 20 13:47:18 crc kubenswrapper[4973]: I0320 13:47:18.583889 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71549865-d1f9-44d8-bcf3-0040dcbeda6d","Type":"ContainerDied","Data":"f770a706d70ed325607eb9c0fe5be1d7078a64625b059747da55740b786c66fb"}
Mar 20 13:47:18 crc kubenswrapper[4973]: I0320 13:47:18.583956 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71549865-d1f9-44d8-bcf3-0040dcbeda6d","Type":"ContainerDied","Data":"2ab4f15458aa59237ac21bb96ce22d56800a13fc4be116660c5a8212e6c53074"}
Mar 20 13:47:18 crc kubenswrapper[4973]: I0320 13:47:18.583971 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71549865-d1f9-44d8-bcf3-0040dcbeda6d","Type":"ContainerDied","Data":"6390404d783256c13d7d05cba687f7acfc0b5746e5ef4db063fa1c80cc1c123d"}
Mar 20 13:47:19 crc kubenswrapper[4973]: I0320 13:47:19.554195 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-667c56bb7d-c5695" podUID="07fe89fe-2971-4d79-afd3-4688be48e51c" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.228:8004/healthcheck\": read tcp 10.217.0.2:60648->10.217.0.228:8004: read: connection reset by peer"
Mar 20 13:47:19 crc kubenswrapper[4973]: I0320 13:47:19.555236 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-667c56bb7d-c5695" podUID="07fe89fe-2971-4d79-afd3-4688be48e51c" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.228:8004/healthcheck\": dial tcp 10.217.0.228:8004: connect: connection refused"
Mar 20 13:47:19 crc kubenswrapper[4973]: I0320 13:47:19.696410 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-b6774478c-cwcjf" podUID="7b7eb71d-d134-4d3b-84a0-f14776f4410a" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.227:8000/healthcheck\": read tcp 10.217.0.2:52732->10.217.0.227:8000: read: connection reset by peer"
Mar 20 13:47:19 crc kubenswrapper[4973]: I0320 13:47:19.700173 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-b6774478c-cwcjf" podUID="7b7eb71d-d134-4d3b-84a0-f14776f4410a" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.227:8000/healthcheck\": dial tcp 10.217.0.227:8000: connect: connection refused"
Mar 20 13:47:20 crc kubenswrapper[4973]: I0320 13:47:20.617069 4973 generic.go:334] "Generic (PLEG): container finished" podID="07fe89fe-2971-4d79-afd3-4688be48e51c" containerID="67763a411d219ed52e22b2f329b5a6bb85e80d401299e127254bf4f709190947" exitCode=0
Mar 20 13:47:20 crc kubenswrapper[4973]: I0320 13:47:20.617293 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-667c56bb7d-c5695" event={"ID":"07fe89fe-2971-4d79-afd3-4688be48e51c","Type":"ContainerDied","Data":"67763a411d219ed52e22b2f329b5a6bb85e80d401299e127254bf4f709190947"}
Mar 20 13:47:20 crc kubenswrapper[4973]: I0320 13:47:20.622023 4973 generic.go:334] "Generic (PLEG): container finished" podID="7b7eb71d-d134-4d3b-84a0-f14776f4410a" containerID="a87d5793ecd115be5927094ad7abc26581635b4f6c204b6b783553664548e395" exitCode=0
Mar 20 13:47:20 crc kubenswrapper[4973]: I0320 13:47:20.622082 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-b6774478c-cwcjf" event={"ID":"7b7eb71d-d134-4d3b-84a0-f14776f4410a","Type":"ContainerDied","Data":"a87d5793ecd115be5927094ad7abc26581635b4f6c204b6b783553664548e395"}
Mar 20 13:47:20 crc kubenswrapper[4973]: I0320 13:47:20.635073 4973 generic.go:334] "Generic (PLEG): container finished" podID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" containerID="7a4c18d5e3bed2bd1ec2cf7ccdbcbe4f2fe7b32f78b93a9524867ab2eb2c4944" exitCode=0
Mar 20 13:47:20 crc kubenswrapper[4973]: I0320 13:47:20.635134 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71549865-d1f9-44d8-bcf3-0040dcbeda6d","Type":"ContainerDied","Data":"7a4c18d5e3bed2bd1ec2cf7ccdbcbe4f2fe7b32f78b93a9524867ab2eb2c4944"}
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.654978 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-b6774478c-cwcjf"
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.677390 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-667c56bb7d-c5695"
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.724613 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-667c56bb7d-c5695" event={"ID":"07fe89fe-2971-4d79-afd3-4688be48e51c","Type":"ContainerDied","Data":"882c71bc20b25f789677fc7a06fd093602e54bd95f9575661b353cbb5a4d9d45"}
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.724667 4973 scope.go:117] "RemoveContainer" containerID="67763a411d219ed52e22b2f329b5a6bb85e80d401299e127254bf4f709190947"
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.724821 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-667c56bb7d-c5695"
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.754195 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-b6774478c-cwcjf" event={"ID":"7b7eb71d-d134-4d3b-84a0-f14776f4410a","Type":"ContainerDied","Data":"c5fc82857565d1788375c28c1cc2a87f26c31a6e67530fcd1dac2f249b1086a5"}
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.754659 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-b6774478c-cwcjf"
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.758288 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b7eb71d-d134-4d3b-84a0-f14776f4410a-config-data\") pod \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\" (UID: \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\") "
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.758325 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07fe89fe-2971-4d79-afd3-4688be48e51c-combined-ca-bundle\") pod \"07fe89fe-2971-4d79-afd3-4688be48e51c\" (UID: \"07fe89fe-2971-4d79-afd3-4688be48e51c\") "
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.758511 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07fe89fe-2971-4d79-afd3-4688be48e51c-config-data-custom\") pod \"07fe89fe-2971-4d79-afd3-4688be48e51c\" (UID: \"07fe89fe-2971-4d79-afd3-4688be48e51c\") "
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.758646 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b7eb71d-d134-4d3b-84a0-f14776f4410a-combined-ca-bundle\") pod \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\" (UID: \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\") "
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.758690 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07fe89fe-2971-4d79-afd3-4688be48e51c-config-data\") pod \"07fe89fe-2971-4d79-afd3-4688be48e51c\" (UID: \"07fe89fe-2971-4d79-afd3-4688be48e51c\") "
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.758719 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdkgw\" (UniqueName: \"kubernetes.io/projected/7b7eb71d-d134-4d3b-84a0-f14776f4410a-kube-api-access-zdkgw\") pod \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\" (UID: \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\") "
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.758793 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snz56\" (UniqueName: \"kubernetes.io/projected/07fe89fe-2971-4d79-afd3-4688be48e51c-kube-api-access-snz56\") pod \"07fe89fe-2971-4d79-afd3-4688be48e51c\" (UID: \"07fe89fe-2971-4d79-afd3-4688be48e51c\") "
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.758904 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b7eb71d-d134-4d3b-84a0-f14776f4410a-config-data-custom\") pod \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\" (UID: \"7b7eb71d-d134-4d3b-84a0-f14776f4410a\") "
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.773833 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b7eb71d-d134-4d3b-84a0-f14776f4410a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "7b7eb71d-d134-4d3b-84a0-f14776f4410a" (UID: "7b7eb71d-d134-4d3b-84a0-f14776f4410a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.777989 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07fe89fe-2971-4d79-afd3-4688be48e51c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "07fe89fe-2971-4d79-afd3-4688be48e51c" (UID: "07fe89fe-2971-4d79-afd3-4688be48e51c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.780588 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b7eb71d-d134-4d3b-84a0-f14776f4410a-kube-api-access-zdkgw" (OuterVolumeSpecName: "kube-api-access-zdkgw") pod "7b7eb71d-d134-4d3b-84a0-f14776f4410a" (UID: "7b7eb71d-d134-4d3b-84a0-f14776f4410a"). InnerVolumeSpecName "kube-api-access-zdkgw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.785677 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07fe89fe-2971-4d79-afd3-4688be48e51c-kube-api-access-snz56" (OuterVolumeSpecName: "kube-api-access-snz56") pod "07fe89fe-2971-4d79-afd3-4688be48e51c" (UID: "07fe89fe-2971-4d79-afd3-4688be48e51c"). InnerVolumeSpecName "kube-api-access-snz56". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.872662 4973 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/7b7eb71d-d134-4d3b-84a0-f14776f4410a-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.872728 4973 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/07fe89fe-2971-4d79-afd3-4688be48e51c-config-data-custom\") on node \"crc\" DevicePath \"\""
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.872742 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdkgw\" (UniqueName: \"kubernetes.io/projected/7b7eb71d-d134-4d3b-84a0-f14776f4410a-kube-api-access-zdkgw\") on node \"crc\" DevicePath \"\""
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.872757 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snz56\" (UniqueName: \"kubernetes.io/projected/07fe89fe-2971-4d79-afd3-4688be48e51c-kube-api-access-snz56\") on node \"crc\" DevicePath \"\""
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.877568 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07fe89fe-2971-4d79-afd3-4688be48e51c-config-data" (OuterVolumeSpecName: "config-data") pod "07fe89fe-2971-4d79-afd3-4688be48e51c" (UID: "07fe89fe-2971-4d79-afd3-4688be48e51c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.911011 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07fe89fe-2971-4d79-afd3-4688be48e51c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "07fe89fe-2971-4d79-afd3-4688be48e51c" (UID: "07fe89fe-2971-4d79-afd3-4688be48e51c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.915951 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b7eb71d-d134-4d3b-84a0-f14776f4410a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7b7eb71d-d134-4d3b-84a0-f14776f4410a" (UID: "7b7eb71d-d134-4d3b-84a0-f14776f4410a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.957190 4973 scope.go:117] "RemoveContainer" containerID="a87d5793ecd115be5927094ad7abc26581635b4f6c204b6b783553664548e395"
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.965020 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b7eb71d-d134-4d3b-84a0-f14776f4410a-config-data" (OuterVolumeSpecName: "config-data") pod "7b7eb71d-d134-4d3b-84a0-f14776f4410a" (UID: "7b7eb71d-d134-4d3b-84a0-f14776f4410a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.975254 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7b7eb71d-d134-4d3b-84a0-f14776f4410a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.975291 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/07fe89fe-2971-4d79-afd3-4688be48e51c-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.975305 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7b7eb71d-d134-4d3b-84a0-f14776f4410a-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:47:23 crc kubenswrapper[4973]: I0320 13:47:23.975316 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/07fe89fe-2971-4d79-afd3-4688be48e51c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.106884 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.118281 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-667c56bb7d-c5695"]
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.128502 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-667c56bb7d-c5695"]
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.144310 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-b6774478c-cwcjf"]
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.165421 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-b6774478c-cwcjf"]
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.178208 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71549865-d1f9-44d8-bcf3-0040dcbeda6d-log-httpd\") pod \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") "
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.178276 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-config-data\") pod \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") "
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.178413 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-sg-core-conf-yaml\") pod \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") "
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.178493 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6lm\" (UniqueName: \"kubernetes.io/projected/71549865-d1f9-44d8-bcf3-0040dcbeda6d-kube-api-access-sb6lm\") pod \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") "
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.178540 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71549865-d1f9-44d8-bcf3-0040dcbeda6d-run-httpd\") pod \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") "
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.178701 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-combined-ca-bundle\") pod \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") "
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.178792 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-scripts\") pod \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\" (UID: \"71549865-d1f9-44d8-bcf3-0040dcbeda6d\") "
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.181299 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71549865-d1f9-44d8-bcf3-0040dcbeda6d-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "71549865-d1f9-44d8-bcf3-0040dcbeda6d" (UID: "71549865-d1f9-44d8-bcf3-0040dcbeda6d"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.184634 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/71549865-d1f9-44d8-bcf3-0040dcbeda6d-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "71549865-d1f9-44d8-bcf3-0040dcbeda6d" (UID: "71549865-d1f9-44d8-bcf3-0040dcbeda6d"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.188201 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-scripts" (OuterVolumeSpecName: "scripts") pod "71549865-d1f9-44d8-bcf3-0040dcbeda6d" (UID: "71549865-d1f9-44d8-bcf3-0040dcbeda6d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.190811 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71549865-d1f9-44d8-bcf3-0040dcbeda6d-kube-api-access-sb6lm" (OuterVolumeSpecName: "kube-api-access-sb6lm") pod "71549865-d1f9-44d8-bcf3-0040dcbeda6d" (UID: "71549865-d1f9-44d8-bcf3-0040dcbeda6d"). InnerVolumeSpecName "kube-api-access-sb6lm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.235495 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "71549865-d1f9-44d8-bcf3-0040dcbeda6d" (UID: "71549865-d1f9-44d8-bcf3-0040dcbeda6d"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.281957 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.281997 4973 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71549865-d1f9-44d8-bcf3-0040dcbeda6d-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.282007 4973 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.282019 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6lm\" (UniqueName: \"kubernetes.io/projected/71549865-d1f9-44d8-bcf3-0040dcbeda6d-kube-api-access-sb6lm\") on node \"crc\" DevicePath \"\""
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.282029 4973 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/71549865-d1f9-44d8-bcf3-0040dcbeda6d-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.322438 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "71549865-d1f9-44d8-bcf3-0040dcbeda6d" (UID: "71549865-d1f9-44d8-bcf3-0040dcbeda6d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.353435 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-config-data" (OuterVolumeSpecName: "config-data") pod "71549865-d1f9-44d8-bcf3-0040dcbeda6d" (UID: "71549865-d1f9-44d8-bcf3-0040dcbeda6d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.384011 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.384053 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/71549865-d1f9-44d8-bcf3-0040dcbeda6d-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.628124 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-6558cfd5cd-bptc4"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.707405 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-77dfb8c95d-ng2gw"]
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.822943 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"71549865-d1f9-44d8-bcf3-0040dcbeda6d","Type":"ContainerDied","Data":"db99524c410004ebceab83a58731bf85b433b846209006831521d45a9f4bd1a4"}
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.823261 4973 scope.go:117] "RemoveContainer" containerID="f770a706d70ed325607eb9c0fe5be1d7078a64625b059747da55740b786c66fb"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.823713 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.841128 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cl4pf" event={"ID":"066916b9-4270-42bd-bcd1-3fd26bd65a9e","Type":"ContainerStarted","Data":"210d5c181fb4f04e97bfc605675b4d3f7ca4c477d78dfbb351f46f194c573ca4"}
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.899624 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-cl4pf" podStartSLOduration=4.158147929 podStartE2EDuration="18.899599233s" podCreationTimestamp="2026-03-20 13:47:06 +0000 UTC" firstStartedPulling="2026-03-20 13:47:08.938608122 +0000 UTC m=+1549.682277866" lastFinishedPulling="2026-03-20 13:47:23.680059426 +0000 UTC m=+1564.423729170" observedRunningTime="2026-03-20 13:47:24.877360246 +0000 UTC m=+1565.621030000" watchObservedRunningTime="2026-03-20 13:47:24.899599233 +0000 UTC m=+1565.643268977"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.900682 4973 scope.go:117] "RemoveContainer" containerID="2ab4f15458aa59237ac21bb96ce22d56800a13fc4be116660c5a8212e6c53074"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.928993 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.949617 4973 scope.go:117] "RemoveContainer" containerID="6390404d783256c13d7d05cba687f7acfc0b5746e5ef4db063fa1c80cc1c123d"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.951833 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.971542 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:47:24 crc kubenswrapper[4973]: E0320 13:47:24.973087 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" containerName="ceilometer-central-agent"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.973204 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" containerName="ceilometer-central-agent"
Mar 20 13:47:24 crc kubenswrapper[4973]: E0320 13:47:24.973261 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07fe89fe-2971-4d79-afd3-4688be48e51c" containerName="heat-api"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.973327 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="07fe89fe-2971-4d79-afd3-4688be48e51c" containerName="heat-api"
Mar 20 13:47:24 crc kubenswrapper[4973]: E0320 13:47:24.973437 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" containerName="proxy-httpd"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.973488 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" containerName="proxy-httpd"
Mar 20 13:47:24 crc kubenswrapper[4973]: E0320 13:47:24.973602 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" containerName="ceilometer-notification-agent"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.973674 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" containerName="ceilometer-notification-agent"
Mar 20 13:47:24 crc kubenswrapper[4973]: E0320 13:47:24.973731 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b7eb71d-d134-4d3b-84a0-f14776f4410a" containerName="heat-cfnapi"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.973855 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b7eb71d-d134-4d3b-84a0-f14776f4410a" containerName="heat-cfnapi"
Mar 20 13:47:24 crc kubenswrapper[4973]: E0320 13:47:24.973941 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" containerName="sg-core"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.974014 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" containerName="sg-core"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.974383 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" containerName="sg-core"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.974463 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="07fe89fe-2971-4d79-afd3-4688be48e51c" containerName="heat-api"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.974526 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" containerName="ceilometer-notification-agent"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.974580 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" containerName="proxy-httpd"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.974644 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b7eb71d-d134-4d3b-84a0-f14776f4410a" containerName="heat-cfnapi"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.974693 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" containerName="ceilometer-central-agent"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.977233 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.988856 4973 scope.go:117] "RemoveContainer" containerID="7a4c18d5e3bed2bd1ec2cf7ccdbcbe4f2fe7b32f78b93a9524867ab2eb2c4944"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.989673 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 13:47:24 crc kubenswrapper[4973]: I0320 13:47:24.989894 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.009188 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.128769 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gggm2\" (UniqueName: \"kubernetes.io/projected/d59a7c46-99e2-41de-a1b2-f98bd416a668-kube-api-access-gggm2\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.128918 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.128962 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d59a7c46-99e2-41de-a1b2-f98bd416a668-run-httpd\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.128986 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-config-data\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.129015 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-scripts\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.129290 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d59a7c46-99e2-41de-a1b2-f98bd416a668-log-httpd\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.129771 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.234806 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d59a7c46-99e2-41de-a1b2-f98bd416a668-log-httpd\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.234889 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.234954 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gggm2\" (UniqueName: \"kubernetes.io/projected/d59a7c46-99e2-41de-a1b2-f98bd416a668-kube-api-access-gggm2\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.235035 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.235071 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d59a7c46-99e2-41de-a1b2-f98bd416a668-run-httpd\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.235088 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-config-data\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.235112 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-scripts\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.238184 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d59a7c46-99e2-41de-a1b2-f98bd416a668-run-httpd\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.239354 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d59a7c46-99e2-41de-a1b2-f98bd416a668-log-httpd\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.241467 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-scripts\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.242797 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.243688 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.255252 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-config-data\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.260069 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gggm2\" (UniqueName: \"kubernetes.io/projected/d59a7c46-99e2-41de-a1b2-f98bd416a668-kube-api-access-gggm2\") pod \"ceilometer-0\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.320350 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.443750 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-77dfb8c95d-ng2gw"
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.544259 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1a99224-b58e-48c3-aad8-bfa2c175965e-config-data-custom\") pod \"e1a99224-b58e-48c3-aad8-bfa2c175965e\" (UID: \"e1a99224-b58e-48c3-aad8-bfa2c175965e\") "
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.544420 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qlkw\" (UniqueName: \"kubernetes.io/projected/e1a99224-b58e-48c3-aad8-bfa2c175965e-kube-api-access-9qlkw\") pod \"e1a99224-b58e-48c3-aad8-bfa2c175965e\" (UID: \"e1a99224-b58e-48c3-aad8-bfa2c175965e\") "
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.544786 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1a99224-b58e-48c3-aad8-bfa2c175965e-config-data\") pod \"e1a99224-b58e-48c3-aad8-bfa2c175965e\" (UID: \"e1a99224-b58e-48c3-aad8-bfa2c175965e\") "
Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.544865 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a99224-b58e-48c3-aad8-bfa2c175965e-combined-ca-bundle\") pod
\"e1a99224-b58e-48c3-aad8-bfa2c175965e\" (UID: \"e1a99224-b58e-48c3-aad8-bfa2c175965e\") " Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.569723 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a99224-b58e-48c3-aad8-bfa2c175965e-kube-api-access-9qlkw" (OuterVolumeSpecName: "kube-api-access-9qlkw") pod "e1a99224-b58e-48c3-aad8-bfa2c175965e" (UID: "e1a99224-b58e-48c3-aad8-bfa2c175965e"). InnerVolumeSpecName "kube-api-access-9qlkw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.579378 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a99224-b58e-48c3-aad8-bfa2c175965e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e1a99224-b58e-48c3-aad8-bfa2c175965e" (UID: "e1a99224-b58e-48c3-aad8-bfa2c175965e"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.584326 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.605490 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a99224-b58e-48c3-aad8-bfa2c175965e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1a99224-b58e-48c3-aad8-bfa2c175965e" (UID: "e1a99224-b58e-48c3-aad8-bfa2c175965e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.647653 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1a99224-b58e-48c3-aad8-bfa2c175965e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.653410 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-76bfcb8cc7-vl56f"] Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.667104 4973 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1a99224-b58e-48c3-aad8-bfa2c175965e-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.667556 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qlkw\" (UniqueName: \"kubernetes.io/projected/e1a99224-b58e-48c3-aad8-bfa2c175965e-kube-api-access-9qlkw\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.769395 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1a99224-b58e-48c3-aad8-bfa2c175965e-config-data" (OuterVolumeSpecName: "config-data") pod "e1a99224-b58e-48c3-aad8-bfa2c175965e" (UID: "e1a99224-b58e-48c3-aad8-bfa2c175965e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.770671 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1a99224-b58e-48c3-aad8-bfa2c175965e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.867810 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-77dfb8c95d-ng2gw" event={"ID":"e1a99224-b58e-48c3-aad8-bfa2c175965e","Type":"ContainerDied","Data":"5c92abde5ce097b40aa1daf84507586d026e34b5df8698ea522deef53916e18f"} Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.867893 4973 scope.go:117] "RemoveContainer" containerID="36af5976952feb7b2ad97b2689fa020238de3f6e61ca838c28cb0cba67ad7132" Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.868035 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-77dfb8c95d-ng2gw" Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.972698 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07fe89fe-2971-4d79-afd3-4688be48e51c" path="/var/lib/kubelet/pods/07fe89fe-2971-4d79-afd3-4688be48e51c/volumes" Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.973307 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71549865-d1f9-44d8-bcf3-0040dcbeda6d" path="/var/lib/kubelet/pods/71549865-d1f9-44d8-bcf3-0040dcbeda6d/volumes" Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.979584 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b7eb71d-d134-4d3b-84a0-f14776f4410a" path="/var/lib/kubelet/pods/7b7eb71d-d134-4d3b-84a0-f14776f4410a/volumes" Mar 20 13:47:25 crc kubenswrapper[4973]: I0320 13:47:25.987682 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-77dfb8c95d-ng2gw"] Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.030821 4973 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/heat-api-77dfb8c95d-ng2gw"] Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.306010 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.369431 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.494700 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-config-data\") pod \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\" (UID: \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\") " Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.494819 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-config-data-custom\") pod \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\" (UID: \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\") " Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.494865 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk6hx\" (UniqueName: \"kubernetes.io/projected/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-kube-api-access-lk6hx\") pod \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\" (UID: \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\") " Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.494921 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-combined-ca-bundle\") pod \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\" (UID: \"d852eeb5-beb4-4410-beaa-34cb9d4da7e0\") " Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.502088 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d852eeb5-beb4-4410-beaa-34cb9d4da7e0" (UID: "d852eeb5-beb4-4410-beaa-34cb9d4da7e0"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.502623 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-kube-api-access-lk6hx" (OuterVolumeSpecName: "kube-api-access-lk6hx") pod "d852eeb5-beb4-4410-beaa-34cb9d4da7e0" (UID: "d852eeb5-beb4-4410-beaa-34cb9d4da7e0"). InnerVolumeSpecName "kube-api-access-lk6hx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.544678 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d852eeb5-beb4-4410-beaa-34cb9d4da7e0" (UID: "d852eeb5-beb4-4410-beaa-34cb9d4da7e0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.567607 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-76f767b74-8jwpc" Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.567593 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-config-data" (OuterVolumeSpecName: "config-data") pod "d852eeb5-beb4-4410-beaa-34cb9d4da7e0" (UID: "d852eeb5-beb4-4410-beaa-34cb9d4da7e0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.597947 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.597985 4973 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.597996 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk6hx\" (UniqueName: \"kubernetes.io/projected/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-kube-api-access-lk6hx\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.598005 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d852eeb5-beb4-4410-beaa-34cb9d4da7e0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.635324 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6cc7bcd8df-hpkc4"] Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.653603 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-6cc7bcd8df-hpkc4" podUID="ff523993-52d2-46af-8f52-f3c9ca447fcb" containerName="heat-engine" containerID="cri-o://1c89394a0cf6ae6b38ee6b0afcc1bdc31b6ef05b59de63cbb26f65fce265d33e" gracePeriod=60 Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.894784 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" event={"ID":"d852eeb5-beb4-4410-beaa-34cb9d4da7e0","Type":"ContainerDied","Data":"21fa65abd650c55d9471b7c25cc4387b10d90a37f09c82a8f09e3f216ed37ab9"} Mar 20 13:47:26 crc 
kubenswrapper[4973]: I0320 13:47:26.895981 4973 scope.go:117] "RemoveContainer" containerID="d7d46a1ce93c90ca0cb917da6f1d57303662cf449344a7406994d65b1bfcac8b" Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.894802 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-76bfcb8cc7-vl56f" Mar 20 13:47:26 crc kubenswrapper[4973]: I0320 13:47:26.901303 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d59a7c46-99e2-41de-a1b2-f98bd416a668","Type":"ContainerStarted","Data":"84ec7a34a77b8cd76820d73a4940cd2322492f9f40fbb2924d1c4bd1a21661a6"} Mar 20 13:47:27 crc kubenswrapper[4973]: I0320 13:47:27.009631 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-76bfcb8cc7-vl56f"] Mar 20 13:47:27 crc kubenswrapper[4973]: I0320 13:47:27.032998 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-76bfcb8cc7-vl56f"] Mar 20 13:47:27 crc kubenswrapper[4973]: I0320 13:47:27.932577 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d59a7c46-99e2-41de-a1b2-f98bd416a668","Type":"ContainerStarted","Data":"cbf79b320cb43bf295304afaad87ebe612d8f281f2e5b846bf4146daf0487ba1"} Mar 20 13:47:27 crc kubenswrapper[4973]: I0320 13:47:27.932899 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d59a7c46-99e2-41de-a1b2-f98bd416a668","Type":"ContainerStarted","Data":"985a6ae4df76f010d9052bf276f825c8125e68bfe9519603fb60bc3f0a7b9f68"} Mar 20 13:47:27 crc kubenswrapper[4973]: I0320 13:47:27.975330 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d852eeb5-beb4-4410-beaa-34cb9d4da7e0" path="/var/lib/kubelet/pods/d852eeb5-beb4-4410-beaa-34cb9d4da7e0/volumes" Mar 20 13:47:27 crc kubenswrapper[4973]: I0320 13:47:27.976192 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1a99224-b58e-48c3-aad8-bfa2c175965e" 
path="/var/lib/kubelet/pods/e1a99224-b58e-48c3-aad8-bfa2c175965e/volumes" Mar 20 13:47:28 crc kubenswrapper[4973]: E0320 13:47:28.074687 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c89394a0cf6ae6b38ee6b0afcc1bdc31b6ef05b59de63cbb26f65fce265d33e" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 20 13:47:28 crc kubenswrapper[4973]: E0320 13:47:28.076305 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c89394a0cf6ae6b38ee6b0afcc1bdc31b6ef05b59de63cbb26f65fce265d33e" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 20 13:47:28 crc kubenswrapper[4973]: E0320 13:47:28.077476 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c89394a0cf6ae6b38ee6b0afcc1bdc31b6ef05b59de63cbb26f65fce265d33e" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 20 13:47:28 crc kubenswrapper[4973]: E0320 13:47:28.077543 4973 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-6cc7bcd8df-hpkc4" podUID="ff523993-52d2-46af-8f52-f3c9ca447fcb" containerName="heat-engine" Mar 20 13:47:29 crc kubenswrapper[4973]: I0320 13:47:29.975140 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d59a7c46-99e2-41de-a1b2-f98bd416a668","Type":"ContainerStarted","Data":"b75a03c8412029f442c76915c9ecbe2b1ba0bbf5cb6bd7cc2bedd8cbbf956e2f"} Mar 20 13:47:32 crc kubenswrapper[4973]: I0320 13:47:32.012549 4973 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d59a7c46-99e2-41de-a1b2-f98bd416a668","Type":"ContainerStarted","Data":"00ac3b77fa912a22b3b6a7b1c61d8f189943314b0e8df286178338d03aff6f50"} Mar 20 13:47:32 crc kubenswrapper[4973]: I0320 13:47:32.013188 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:47:32 crc kubenswrapper[4973]: I0320 13:47:32.048681 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.930890076 podStartE2EDuration="8.048659203s" podCreationTimestamp="2026-03-20 13:47:24 +0000 UTC" firstStartedPulling="2026-03-20 13:47:26.310529654 +0000 UTC m=+1567.054199398" lastFinishedPulling="2026-03-20 13:47:31.428298781 +0000 UTC m=+1572.171968525" observedRunningTime="2026-03-20 13:47:32.043519403 +0000 UTC m=+1572.787189147" watchObservedRunningTime="2026-03-20 13:47:32.048659203 +0000 UTC m=+1572.792328947" Mar 20 13:47:33 crc kubenswrapper[4973]: I0320 13:47:33.083961 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:47:33 crc kubenswrapper[4973]: I0320 13:47:33.445101 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-b6774478c-cwcjf" podUID="7b7eb71d-d134-4d3b-84a0-f14776f4410a" containerName="heat-cfnapi" probeResult="failure" output="Get \"http://10.217.0.227:8000/healthcheck\": dial tcp 10.217.0.227:8000: i/o timeout (Client.Timeout exceeded while awaiting headers)" Mar 20 13:47:33 crc kubenswrapper[4973]: I0320 13:47:33.457947 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-667c56bb7d-c5695" podUID="07fe89fe-2971-4d79-afd3-4688be48e51c" containerName="heat-api" probeResult="failure" output="Get \"http://10.217.0.228:8004/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:47:34 crc kubenswrapper[4973]: I0320 
13:47:34.040158 4973 generic.go:334] "Generic (PLEG): container finished" podID="ff523993-52d2-46af-8f52-f3c9ca447fcb" containerID="1c89394a0cf6ae6b38ee6b0afcc1bdc31b6ef05b59de63cbb26f65fce265d33e" exitCode=0 Mar 20 13:47:34 crc kubenswrapper[4973]: I0320 13:47:34.041236 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerName="ceilometer-central-agent" containerID="cri-o://985a6ae4df76f010d9052bf276f825c8125e68bfe9519603fb60bc3f0a7b9f68" gracePeriod=30 Mar 20 13:47:34 crc kubenswrapper[4973]: I0320 13:47:34.041462 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6cc7bcd8df-hpkc4" event={"ID":"ff523993-52d2-46af-8f52-f3c9ca447fcb","Type":"ContainerDied","Data":"1c89394a0cf6ae6b38ee6b0afcc1bdc31b6ef05b59de63cbb26f65fce265d33e"} Mar 20 13:47:34 crc kubenswrapper[4973]: I0320 13:47:34.041593 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerName="sg-core" containerID="cri-o://b75a03c8412029f442c76915c9ecbe2b1ba0bbf5cb6bd7cc2bedd8cbbf956e2f" gracePeriod=30 Mar 20 13:47:34 crc kubenswrapper[4973]: I0320 13:47:34.041723 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerName="proxy-httpd" containerID="cri-o://00ac3b77fa912a22b3b6a7b1c61d8f189943314b0e8df286178338d03aff6f50" gracePeriod=30 Mar 20 13:47:34 crc kubenswrapper[4973]: I0320 13:47:34.041773 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerName="ceilometer-notification-agent" containerID="cri-o://cbf79b320cb43bf295304afaad87ebe612d8f281f2e5b846bf4146daf0487ba1" gracePeriod=30 Mar 20 13:47:34 crc kubenswrapper[4973]: I0320 13:47:34.321532 4973 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6cc7bcd8df-hpkc4" Mar 20 13:47:34 crc kubenswrapper[4973]: I0320 13:47:34.505136 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff523993-52d2-46af-8f52-f3c9ca447fcb-config-data-custom\") pod \"ff523993-52d2-46af-8f52-f3c9ca447fcb\" (UID: \"ff523993-52d2-46af-8f52-f3c9ca447fcb\") " Mar 20 13:47:34 crc kubenswrapper[4973]: I0320 13:47:34.505260 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff523993-52d2-46af-8f52-f3c9ca447fcb-combined-ca-bundle\") pod \"ff523993-52d2-46af-8f52-f3c9ca447fcb\" (UID: \"ff523993-52d2-46af-8f52-f3c9ca447fcb\") " Mar 20 13:47:34 crc kubenswrapper[4973]: I0320 13:47:34.505297 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff523993-52d2-46af-8f52-f3c9ca447fcb-config-data\") pod \"ff523993-52d2-46af-8f52-f3c9ca447fcb\" (UID: \"ff523993-52d2-46af-8f52-f3c9ca447fcb\") " Mar 20 13:47:34 crc kubenswrapper[4973]: I0320 13:47:34.505318 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xjcz\" (UniqueName: \"kubernetes.io/projected/ff523993-52d2-46af-8f52-f3c9ca447fcb-kube-api-access-7xjcz\") pod \"ff523993-52d2-46af-8f52-f3c9ca447fcb\" (UID: \"ff523993-52d2-46af-8f52-f3c9ca447fcb\") " Mar 20 13:47:34 crc kubenswrapper[4973]: I0320 13:47:34.513271 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff523993-52d2-46af-8f52-f3c9ca447fcb-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "ff523993-52d2-46af-8f52-f3c9ca447fcb" (UID: "ff523993-52d2-46af-8f52-f3c9ca447fcb"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:34 crc kubenswrapper[4973]: I0320 13:47:34.513399 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff523993-52d2-46af-8f52-f3c9ca447fcb-kube-api-access-7xjcz" (OuterVolumeSpecName: "kube-api-access-7xjcz") pod "ff523993-52d2-46af-8f52-f3c9ca447fcb" (UID: "ff523993-52d2-46af-8f52-f3c9ca447fcb"). InnerVolumeSpecName "kube-api-access-7xjcz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:34 crc kubenswrapper[4973]: I0320 13:47:34.544254 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff523993-52d2-46af-8f52-f3c9ca447fcb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff523993-52d2-46af-8f52-f3c9ca447fcb" (UID: "ff523993-52d2-46af-8f52-f3c9ca447fcb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:34 crc kubenswrapper[4973]: I0320 13:47:34.576859 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff523993-52d2-46af-8f52-f3c9ca447fcb-config-data" (OuterVolumeSpecName: "config-data") pod "ff523993-52d2-46af-8f52-f3c9ca447fcb" (UID: "ff523993-52d2-46af-8f52-f3c9ca447fcb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:34 crc kubenswrapper[4973]: I0320 13:47:34.608552 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff523993-52d2-46af-8f52-f3c9ca447fcb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:34 crc kubenswrapper[4973]: I0320 13:47:34.608587 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff523993-52d2-46af-8f52-f3c9ca447fcb-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:34 crc kubenswrapper[4973]: I0320 13:47:34.608598 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xjcz\" (UniqueName: \"kubernetes.io/projected/ff523993-52d2-46af-8f52-f3c9ca447fcb-kube-api-access-7xjcz\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:34 crc kubenswrapper[4973]: I0320 13:47:34.608610 4973 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ff523993-52d2-46af-8f52-f3c9ca447fcb-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:35 crc kubenswrapper[4973]: I0320 13:47:35.054428 4973 generic.go:334] "Generic (PLEG): container finished" podID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerID="00ac3b77fa912a22b3b6a7b1c61d8f189943314b0e8df286178338d03aff6f50" exitCode=0 Mar 20 13:47:35 crc kubenswrapper[4973]: I0320 13:47:35.054753 4973 generic.go:334] "Generic (PLEG): container finished" podID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerID="b75a03c8412029f442c76915c9ecbe2b1ba0bbf5cb6bd7cc2bedd8cbbf956e2f" exitCode=2 Mar 20 13:47:35 crc kubenswrapper[4973]: I0320 13:47:35.054762 4973 generic.go:334] "Generic (PLEG): container finished" podID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerID="cbf79b320cb43bf295304afaad87ebe612d8f281f2e5b846bf4146daf0487ba1" exitCode=0 Mar 20 13:47:35 crc kubenswrapper[4973]: I0320 13:47:35.054493 4973 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d59a7c46-99e2-41de-a1b2-f98bd416a668","Type":"ContainerDied","Data":"00ac3b77fa912a22b3b6a7b1c61d8f189943314b0e8df286178338d03aff6f50"} Mar 20 13:47:35 crc kubenswrapper[4973]: I0320 13:47:35.054826 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d59a7c46-99e2-41de-a1b2-f98bd416a668","Type":"ContainerDied","Data":"b75a03c8412029f442c76915c9ecbe2b1ba0bbf5cb6bd7cc2bedd8cbbf956e2f"} Mar 20 13:47:35 crc kubenswrapper[4973]: I0320 13:47:35.054840 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d59a7c46-99e2-41de-a1b2-f98bd416a668","Type":"ContainerDied","Data":"cbf79b320cb43bf295304afaad87ebe612d8f281f2e5b846bf4146daf0487ba1"} Mar 20 13:47:35 crc kubenswrapper[4973]: I0320 13:47:35.057294 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6cc7bcd8df-hpkc4" event={"ID":"ff523993-52d2-46af-8f52-f3c9ca447fcb","Type":"ContainerDied","Data":"19a10bee73d7d9127fa52b47b461dfd7e0c9964373537fc9d69c5ca1ed033c19"} Mar 20 13:47:35 crc kubenswrapper[4973]: I0320 13:47:35.057340 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-6cc7bcd8df-hpkc4" Mar 20 13:47:35 crc kubenswrapper[4973]: I0320 13:47:35.057379 4973 scope.go:117] "RemoveContainer" containerID="1c89394a0cf6ae6b38ee6b0afcc1bdc31b6ef05b59de63cbb26f65fce265d33e" Mar 20 13:47:35 crc kubenswrapper[4973]: I0320 13:47:35.099230 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-6cc7bcd8df-hpkc4"] Mar 20 13:47:35 crc kubenswrapper[4973]: I0320 13:47:35.114743 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-6cc7bcd8df-hpkc4"] Mar 20 13:47:35 crc kubenswrapper[4973]: I0320 13:47:35.965918 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff523993-52d2-46af-8f52-f3c9ca447fcb" path="/var/lib/kubelet/pods/ff523993-52d2-46af-8f52-f3c9ca447fcb/volumes" Mar 20 13:47:43 crc kubenswrapper[4973]: I0320 13:47:43.320663 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:47:43 crc kubenswrapper[4973]: I0320 13:47:43.321602 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:47:44 crc kubenswrapper[4973]: I0320 13:47:44.912378 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.056329 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-combined-ca-bundle\") pod \"d59a7c46-99e2-41de-a1b2-f98bd416a668\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.056423 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-config-data\") pod \"d59a7c46-99e2-41de-a1b2-f98bd416a668\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.057114 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d59a7c46-99e2-41de-a1b2-f98bd416a668-log-httpd\") pod \"d59a7c46-99e2-41de-a1b2-f98bd416a668\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.057197 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-scripts\") pod \"d59a7c46-99e2-41de-a1b2-f98bd416a668\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.057227 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gggm2\" (UniqueName: \"kubernetes.io/projected/d59a7c46-99e2-41de-a1b2-f98bd416a668-kube-api-access-gggm2\") pod \"d59a7c46-99e2-41de-a1b2-f98bd416a668\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.057250 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-sg-core-conf-yaml\") pod \"d59a7c46-99e2-41de-a1b2-f98bd416a668\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.057297 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d59a7c46-99e2-41de-a1b2-f98bd416a668-run-httpd\") pod \"d59a7c46-99e2-41de-a1b2-f98bd416a668\" (UID: \"d59a7c46-99e2-41de-a1b2-f98bd416a668\") " Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.057664 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d59a7c46-99e2-41de-a1b2-f98bd416a668-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "d59a7c46-99e2-41de-a1b2-f98bd416a668" (UID: "d59a7c46-99e2-41de-a1b2-f98bd416a668"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.058527 4973 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d59a7c46-99e2-41de-a1b2-f98bd416a668-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.059274 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d59a7c46-99e2-41de-a1b2-f98bd416a668-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "d59a7c46-99e2-41de-a1b2-f98bd416a668" (UID: "d59a7c46-99e2-41de-a1b2-f98bd416a668"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.064199 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d59a7c46-99e2-41de-a1b2-f98bd416a668-kube-api-access-gggm2" (OuterVolumeSpecName: "kube-api-access-gggm2") pod "d59a7c46-99e2-41de-a1b2-f98bd416a668" (UID: "d59a7c46-99e2-41de-a1b2-f98bd416a668"). 
InnerVolumeSpecName "kube-api-access-gggm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.066579 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-scripts" (OuterVolumeSpecName: "scripts") pod "d59a7c46-99e2-41de-a1b2-f98bd416a668" (UID: "d59a7c46-99e2-41de-a1b2-f98bd416a668"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.106325 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "d59a7c46-99e2-41de-a1b2-f98bd416a668" (UID: "d59a7c46-99e2-41de-a1b2-f98bd416a668"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.161157 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.161491 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gggm2\" (UniqueName: \"kubernetes.io/projected/d59a7c46-99e2-41de-a1b2-f98bd416a668-kube-api-access-gggm2\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.161505 4973 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.161517 4973 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/d59a7c46-99e2-41de-a1b2-f98bd416a668-run-httpd\") on node 
\"crc\" DevicePath \"\"" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.165254 4973 generic.go:334] "Generic (PLEG): container finished" podID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerID="985a6ae4df76f010d9052bf276f825c8125e68bfe9519603fb60bc3f0a7b9f68" exitCode=0 Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.165302 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d59a7c46-99e2-41de-a1b2-f98bd416a668","Type":"ContainerDied","Data":"985a6ae4df76f010d9052bf276f825c8125e68bfe9519603fb60bc3f0a7b9f68"} Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.165331 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"d59a7c46-99e2-41de-a1b2-f98bd416a668","Type":"ContainerDied","Data":"84ec7a34a77b8cd76820d73a4940cd2322492f9f40fbb2924d1c4bd1a21661a6"} Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.165383 4973 scope.go:117] "RemoveContainer" containerID="00ac3b77fa912a22b3b6a7b1c61d8f189943314b0e8df286178338d03aff6f50" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.165529 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.177587 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d59a7c46-99e2-41de-a1b2-f98bd416a668" (UID: "d59a7c46-99e2-41de-a1b2-f98bd416a668"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.193394 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-config-data" (OuterVolumeSpecName: "config-data") pod "d59a7c46-99e2-41de-a1b2-f98bd416a668" (UID: "d59a7c46-99e2-41de-a1b2-f98bd416a668"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.220917 4973 scope.go:117] "RemoveContainer" containerID="b75a03c8412029f442c76915c9ecbe2b1ba0bbf5cb6bd7cc2bedd8cbbf956e2f" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.243937 4973 scope.go:117] "RemoveContainer" containerID="cbf79b320cb43bf295304afaad87ebe612d8f281f2e5b846bf4146daf0487ba1" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.264066 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.264102 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d59a7c46-99e2-41de-a1b2-f98bd416a668-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.265926 4973 scope.go:117] "RemoveContainer" containerID="985a6ae4df76f010d9052bf276f825c8125e68bfe9519603fb60bc3f0a7b9f68" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.286520 4973 scope.go:117] "RemoveContainer" containerID="00ac3b77fa912a22b3b6a7b1c61d8f189943314b0e8df286178338d03aff6f50" Mar 20 13:47:45 crc kubenswrapper[4973]: E0320 13:47:45.287145 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00ac3b77fa912a22b3b6a7b1c61d8f189943314b0e8df286178338d03aff6f50\": container 
with ID starting with 00ac3b77fa912a22b3b6a7b1c61d8f189943314b0e8df286178338d03aff6f50 not found: ID does not exist" containerID="00ac3b77fa912a22b3b6a7b1c61d8f189943314b0e8df286178338d03aff6f50" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.287197 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00ac3b77fa912a22b3b6a7b1c61d8f189943314b0e8df286178338d03aff6f50"} err="failed to get container status \"00ac3b77fa912a22b3b6a7b1c61d8f189943314b0e8df286178338d03aff6f50\": rpc error: code = NotFound desc = could not find container \"00ac3b77fa912a22b3b6a7b1c61d8f189943314b0e8df286178338d03aff6f50\": container with ID starting with 00ac3b77fa912a22b3b6a7b1c61d8f189943314b0e8df286178338d03aff6f50 not found: ID does not exist" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.287231 4973 scope.go:117] "RemoveContainer" containerID="b75a03c8412029f442c76915c9ecbe2b1ba0bbf5cb6bd7cc2bedd8cbbf956e2f" Mar 20 13:47:45 crc kubenswrapper[4973]: E0320 13:47:45.287693 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b75a03c8412029f442c76915c9ecbe2b1ba0bbf5cb6bd7cc2bedd8cbbf956e2f\": container with ID starting with b75a03c8412029f442c76915c9ecbe2b1ba0bbf5cb6bd7cc2bedd8cbbf956e2f not found: ID does not exist" containerID="b75a03c8412029f442c76915c9ecbe2b1ba0bbf5cb6bd7cc2bedd8cbbf956e2f" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.287719 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b75a03c8412029f442c76915c9ecbe2b1ba0bbf5cb6bd7cc2bedd8cbbf956e2f"} err="failed to get container status \"b75a03c8412029f442c76915c9ecbe2b1ba0bbf5cb6bd7cc2bedd8cbbf956e2f\": rpc error: code = NotFound desc = could not find container \"b75a03c8412029f442c76915c9ecbe2b1ba0bbf5cb6bd7cc2bedd8cbbf956e2f\": container with ID starting with b75a03c8412029f442c76915c9ecbe2b1ba0bbf5cb6bd7cc2bedd8cbbf956e2f not 
found: ID does not exist" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.287757 4973 scope.go:117] "RemoveContainer" containerID="cbf79b320cb43bf295304afaad87ebe612d8f281f2e5b846bf4146daf0487ba1" Mar 20 13:47:45 crc kubenswrapper[4973]: E0320 13:47:45.288113 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbf79b320cb43bf295304afaad87ebe612d8f281f2e5b846bf4146daf0487ba1\": container with ID starting with cbf79b320cb43bf295304afaad87ebe612d8f281f2e5b846bf4146daf0487ba1 not found: ID does not exist" containerID="cbf79b320cb43bf295304afaad87ebe612d8f281f2e5b846bf4146daf0487ba1" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.288133 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbf79b320cb43bf295304afaad87ebe612d8f281f2e5b846bf4146daf0487ba1"} err="failed to get container status \"cbf79b320cb43bf295304afaad87ebe612d8f281f2e5b846bf4146daf0487ba1\": rpc error: code = NotFound desc = could not find container \"cbf79b320cb43bf295304afaad87ebe612d8f281f2e5b846bf4146daf0487ba1\": container with ID starting with cbf79b320cb43bf295304afaad87ebe612d8f281f2e5b846bf4146daf0487ba1 not found: ID does not exist" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.288152 4973 scope.go:117] "RemoveContainer" containerID="985a6ae4df76f010d9052bf276f825c8125e68bfe9519603fb60bc3f0a7b9f68" Mar 20 13:47:45 crc kubenswrapper[4973]: E0320 13:47:45.288712 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"985a6ae4df76f010d9052bf276f825c8125e68bfe9519603fb60bc3f0a7b9f68\": container with ID starting with 985a6ae4df76f010d9052bf276f825c8125e68bfe9519603fb60bc3f0a7b9f68 not found: ID does not exist" containerID="985a6ae4df76f010d9052bf276f825c8125e68bfe9519603fb60bc3f0a7b9f68" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.288739 4973 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"985a6ae4df76f010d9052bf276f825c8125e68bfe9519603fb60bc3f0a7b9f68"} err="failed to get container status \"985a6ae4df76f010d9052bf276f825c8125e68bfe9519603fb60bc3f0a7b9f68\": rpc error: code = NotFound desc = could not find container \"985a6ae4df76f010d9052bf276f825c8125e68bfe9519603fb60bc3f0a7b9f68\": container with ID starting with 985a6ae4df76f010d9052bf276f825c8125e68bfe9519603fb60bc3f0a7b9f68 not found: ID does not exist" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.515647 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.541638 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.570714 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:47:45 crc kubenswrapper[4973]: E0320 13:47:45.571414 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerName="ceilometer-central-agent" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.571438 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerName="ceilometer-central-agent" Mar 20 13:47:45 crc kubenswrapper[4973]: E0320 13:47:45.571469 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a99224-b58e-48c3-aad8-bfa2c175965e" containerName="heat-api" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.571478 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a99224-b58e-48c3-aad8-bfa2c175965e" containerName="heat-api" Mar 20 13:47:45 crc kubenswrapper[4973]: E0320 13:47:45.571495 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d852eeb5-beb4-4410-beaa-34cb9d4da7e0" containerName="heat-cfnapi" Mar 20 13:47:45 crc 
kubenswrapper[4973]: I0320 13:47:45.571503 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="d852eeb5-beb4-4410-beaa-34cb9d4da7e0" containerName="heat-cfnapi" Mar 20 13:47:45 crc kubenswrapper[4973]: E0320 13:47:45.571514 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerName="proxy-httpd" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.571522 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerName="proxy-httpd" Mar 20 13:47:45 crc kubenswrapper[4973]: E0320 13:47:45.571551 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff523993-52d2-46af-8f52-f3c9ca447fcb" containerName="heat-engine" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.571601 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff523993-52d2-46af-8f52-f3c9ca447fcb" containerName="heat-engine" Mar 20 13:47:45 crc kubenswrapper[4973]: E0320 13:47:45.571619 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d852eeb5-beb4-4410-beaa-34cb9d4da7e0" containerName="heat-cfnapi" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.571629 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="d852eeb5-beb4-4410-beaa-34cb9d4da7e0" containerName="heat-cfnapi" Mar 20 13:47:45 crc kubenswrapper[4973]: E0320 13:47:45.571647 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerName="ceilometer-notification-agent" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.571656 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerName="ceilometer-notification-agent" Mar 20 13:47:45 crc kubenswrapper[4973]: E0320 13:47:45.571676 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerName="sg-core" Mar 20 13:47:45 crc kubenswrapper[4973]: 
I0320 13:47:45.571684 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerName="sg-core" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.572236 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerName="ceilometer-central-agent" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.572268 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="d852eeb5-beb4-4410-beaa-34cb9d4da7e0" containerName="heat-cfnapi" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.572285 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="d852eeb5-beb4-4410-beaa-34cb9d4da7e0" containerName="heat-cfnapi" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.572299 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a99224-b58e-48c3-aad8-bfa2c175965e" containerName="heat-api" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.572320 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerName="ceilometer-notification-agent" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.572359 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff523993-52d2-46af-8f52-f3c9ca447fcb" containerName="heat-engine" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.572378 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a99224-b58e-48c3-aad8-bfa2c175965e" containerName="heat-api" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.572400 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerName="sg-core" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.572413 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="d59a7c46-99e2-41de-a1b2-f98bd416a668" containerName="proxy-httpd" Mar 20 13:47:45 crc kubenswrapper[4973]: 
E0320 13:47:45.572690 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a99224-b58e-48c3-aad8-bfa2c175965e" containerName="heat-api" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.572815 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a99224-b58e-48c3-aad8-bfa2c175965e" containerName="heat-api" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.575208 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.577877 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.578000 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.582032 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.672539 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1e6f898-edd9-4fd4-97ad-80bb06786c23-log-httpd\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.672617 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-config-data\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.672655 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9fwz\" (UniqueName: 
\"kubernetes.io/projected/a1e6f898-edd9-4fd4-97ad-80bb06786c23-kube-api-access-z9fwz\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.673088 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.673136 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-scripts\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.673249 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.673285 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1e6f898-edd9-4fd4-97ad-80bb06786c23-run-httpd\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.782385 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-config-data\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " 
pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.782459 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9fwz\" (UniqueName: \"kubernetes.io/projected/a1e6f898-edd9-4fd4-97ad-80bb06786c23-kube-api-access-z9fwz\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.782615 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.782644 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-scripts\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.782682 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.782700 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1e6f898-edd9-4fd4-97ad-80bb06786c23-run-httpd\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.782782 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a1e6f898-edd9-4fd4-97ad-80bb06786c23-log-httpd\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.783727 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1e6f898-edd9-4fd4-97ad-80bb06786c23-run-httpd\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.783932 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1e6f898-edd9-4fd4-97ad-80bb06786c23-log-httpd\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.790448 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-scripts\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.791033 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-config-data\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.792263 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.797474 4973 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.802077 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9fwz\" (UniqueName: \"kubernetes.io/projected/a1e6f898-edd9-4fd4-97ad-80bb06786c23-kube-api-access-z9fwz\") pod \"ceilometer-0\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.919190 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:47:45 crc kubenswrapper[4973]: I0320 13:47:45.971687 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d59a7c46-99e2-41de-a1b2-f98bd416a668" path="/var/lib/kubelet/pods/d59a7c46-99e2-41de-a1b2-f98bd416a668/volumes" Mar 20 13:47:46 crc kubenswrapper[4973]: I0320 13:47:46.424765 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:47:47 crc kubenswrapper[4973]: I0320 13:47:47.196559 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1e6f898-edd9-4fd4-97ad-80bb06786c23","Type":"ContainerStarted","Data":"96524c281ec7ab0aa8d518697d9c609a4172960bae40cd605b3ea7e8c07ecec5"} Mar 20 13:47:47 crc kubenswrapper[4973]: I0320 13:47:47.196611 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1e6f898-edd9-4fd4-97ad-80bb06786c23","Type":"ContainerStarted","Data":"b97fbe14eb392bc16d9df3d0eaca9984eb4400e20493398d4eadcad7fd0d897c"} Mar 20 13:47:49 crc kubenswrapper[4973]: I0320 13:47:49.224912 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"a1e6f898-edd9-4fd4-97ad-80bb06786c23","Type":"ContainerStarted","Data":"6f485e1193eb59eec8cac3a577afb5d4c6d8aace6dcfe887c53bb781ae504f05"} Mar 20 13:47:49 crc kubenswrapper[4973]: I0320 13:47:49.225455 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1e6f898-edd9-4fd4-97ad-80bb06786c23","Type":"ContainerStarted","Data":"0bb262cc478a46cb3b27dce3073603af02b5cda3d5ba8a2d7a6d8cb1c6c5186d"} Mar 20 13:47:50 crc kubenswrapper[4973]: I0320 13:47:50.237593 4973 generic.go:334] "Generic (PLEG): container finished" podID="066916b9-4270-42bd-bcd1-3fd26bd65a9e" containerID="210d5c181fb4f04e97bfc605675b4d3f7ca4c477d78dfbb351f46f194c573ca4" exitCode=0 Mar 20 13:47:50 crc kubenswrapper[4973]: I0320 13:47:50.237770 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cl4pf" event={"ID":"066916b9-4270-42bd-bcd1-3fd26bd65a9e","Type":"ContainerDied","Data":"210d5c181fb4f04e97bfc605675b4d3f7ca4c477d78dfbb351f46f194c573ca4"} Mar 20 13:47:51 crc kubenswrapper[4973]: I0320 13:47:51.252375 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1e6f898-edd9-4fd4-97ad-80bb06786c23","Type":"ContainerStarted","Data":"50560a38936136aea96c0c108b2294547f3dec7b9f529037a8d3fdc030739533"} Mar 20 13:47:51 crc kubenswrapper[4973]: I0320 13:47:51.252904 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:47:51 crc kubenswrapper[4973]: I0320 13:47:51.263814 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:47:51 crc kubenswrapper[4973]: I0320 13:47:51.264291 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="446ab543-b039-4065-964f-945824ddec63" containerName="glance-log" containerID="cri-o://250bc14e923b459bc4688d4c37932bb175e6fd2bd1c85de12c3664484335bf8e" 
gracePeriod=30 Mar 20 13:47:51 crc kubenswrapper[4973]: I0320 13:47:51.264399 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="446ab543-b039-4065-964f-945824ddec63" containerName="glance-httpd" containerID="cri-o://d089e46bbfe0c52652e56f493ae3b60a798e861828b80405b16d6c6a7ae423ae" gracePeriod=30 Mar 20 13:47:51 crc kubenswrapper[4973]: I0320 13:47:51.287659 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.353073283 podStartE2EDuration="6.287634865s" podCreationTimestamp="2026-03-20 13:47:45 +0000 UTC" firstStartedPulling="2026-03-20 13:47:46.416452818 +0000 UTC m=+1587.160122562" lastFinishedPulling="2026-03-20 13:47:50.3510144 +0000 UTC m=+1591.094684144" observedRunningTime="2026-03-20 13:47:51.277528749 +0000 UTC m=+1592.021198493" watchObservedRunningTime="2026-03-20 13:47:51.287634865 +0000 UTC m=+1592.031304609" Mar 20 13:47:51 crc kubenswrapper[4973]: I0320 13:47:51.878769 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cl4pf" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.063762 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/066916b9-4270-42bd-bcd1-3fd26bd65a9e-config-data\") pod \"066916b9-4270-42bd-bcd1-3fd26bd65a9e\" (UID: \"066916b9-4270-42bd-bcd1-3fd26bd65a9e\") " Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.063951 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/066916b9-4270-42bd-bcd1-3fd26bd65a9e-scripts\") pod \"066916b9-4270-42bd-bcd1-3fd26bd65a9e\" (UID: \"066916b9-4270-42bd-bcd1-3fd26bd65a9e\") " Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.063974 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/066916b9-4270-42bd-bcd1-3fd26bd65a9e-combined-ca-bundle\") pod \"066916b9-4270-42bd-bcd1-3fd26bd65a9e\" (UID: \"066916b9-4270-42bd-bcd1-3fd26bd65a9e\") " Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.063993 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkv9n\" (UniqueName: \"kubernetes.io/projected/066916b9-4270-42bd-bcd1-3fd26bd65a9e-kube-api-access-hkv9n\") pod \"066916b9-4270-42bd-bcd1-3fd26bd65a9e\" (UID: \"066916b9-4270-42bd-bcd1-3fd26bd65a9e\") " Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.084700 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/066916b9-4270-42bd-bcd1-3fd26bd65a9e-scripts" (OuterVolumeSpecName: "scripts") pod "066916b9-4270-42bd-bcd1-3fd26bd65a9e" (UID: "066916b9-4270-42bd-bcd1-3fd26bd65a9e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.095635 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/066916b9-4270-42bd-bcd1-3fd26bd65a9e-kube-api-access-hkv9n" (OuterVolumeSpecName: "kube-api-access-hkv9n") pod "066916b9-4270-42bd-bcd1-3fd26bd65a9e" (UID: "066916b9-4270-42bd-bcd1-3fd26bd65a9e"). InnerVolumeSpecName "kube-api-access-hkv9n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.099413 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/066916b9-4270-42bd-bcd1-3fd26bd65a9e-config-data" (OuterVolumeSpecName: "config-data") pod "066916b9-4270-42bd-bcd1-3fd26bd65a9e" (UID: "066916b9-4270-42bd-bcd1-3fd26bd65a9e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.105648 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/066916b9-4270-42bd-bcd1-3fd26bd65a9e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "066916b9-4270-42bd-bcd1-3fd26bd65a9e" (UID: "066916b9-4270-42bd-bcd1-3fd26bd65a9e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.172537 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/066916b9-4270-42bd-bcd1-3fd26bd65a9e-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.172828 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkv9n\" (UniqueName: \"kubernetes.io/projected/066916b9-4270-42bd-bcd1-3fd26bd65a9e-kube-api-access-hkv9n\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.172970 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/066916b9-4270-42bd-bcd1-3fd26bd65a9e-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.173050 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/066916b9-4270-42bd-bcd1-3fd26bd65a9e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.265473 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-cl4pf" event={"ID":"066916b9-4270-42bd-bcd1-3fd26bd65a9e","Type":"ContainerDied","Data":"13ee6dec8cabcd30928c6973a70b5dce9b0c8a93b142bb7c5c882b98e361d50b"} Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.265494 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-cl4pf" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.265736 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13ee6dec8cabcd30928c6973a70b5dce9b0c8a93b142bb7c5c882b98e361d50b" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.267795 4973 generic.go:334] "Generic (PLEG): container finished" podID="446ab543-b039-4065-964f-945824ddec63" containerID="250bc14e923b459bc4688d4c37932bb175e6fd2bd1c85de12c3664484335bf8e" exitCode=143 Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.267938 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"446ab543-b039-4065-964f-945824ddec63","Type":"ContainerDied","Data":"250bc14e923b459bc4688d4c37932bb175e6fd2bd1c85de12c3664484335bf8e"} Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.359899 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:47:52 crc kubenswrapper[4973]: E0320 13:47:52.360525 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="066916b9-4270-42bd-bcd1-3fd26bd65a9e" containerName="nova-cell0-conductor-db-sync" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.360673 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="066916b9-4270-42bd-bcd1-3fd26bd65a9e" containerName="nova-cell0-conductor-db-sync" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.360912 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="066916b9-4270-42bd-bcd1-3fd26bd65a9e" containerName="nova-cell0-conductor-db-sync" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.362824 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.367043 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.367058 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qgmk5" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.375128 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.378259 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzmh5\" (UniqueName: \"kubernetes.io/projected/c31768ff-7740-4ba0-a355-02b5ae3b75f0-kube-api-access-lzmh5\") pod \"nova-cell0-conductor-0\" (UID: \"c31768ff-7740-4ba0-a355-02b5ae3b75f0\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.378505 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31768ff-7740-4ba0-a355-02b5ae3b75f0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c31768ff-7740-4ba0-a355-02b5ae3b75f0\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.378642 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31768ff-7740-4ba0-a355-02b5ae3b75f0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c31768ff-7740-4ba0-a355-02b5ae3b75f0\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.481245 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzmh5\" (UniqueName: 
\"kubernetes.io/projected/c31768ff-7740-4ba0-a355-02b5ae3b75f0-kube-api-access-lzmh5\") pod \"nova-cell0-conductor-0\" (UID: \"c31768ff-7740-4ba0-a355-02b5ae3b75f0\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.481352 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31768ff-7740-4ba0-a355-02b5ae3b75f0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c31768ff-7740-4ba0-a355-02b5ae3b75f0\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.481375 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31768ff-7740-4ba0-a355-02b5ae3b75f0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c31768ff-7740-4ba0-a355-02b5ae3b75f0\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.485413 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31768ff-7740-4ba0-a355-02b5ae3b75f0-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"c31768ff-7740-4ba0-a355-02b5ae3b75f0\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.486896 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31768ff-7740-4ba0-a355-02b5ae3b75f0-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"c31768ff-7740-4ba0-a355-02b5ae3b75f0\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.499976 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzmh5\" (UniqueName: \"kubernetes.io/projected/c31768ff-7740-4ba0-a355-02b5ae3b75f0-kube-api-access-lzmh5\") pod \"nova-cell0-conductor-0\" (UID: 
\"c31768ff-7740-4ba0-a355-02b5ae3b75f0\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:52 crc kubenswrapper[4973]: I0320 13:47:52.689660 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:53 crc kubenswrapper[4973]: I0320 13:47:53.141152 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:47:53 crc kubenswrapper[4973]: I0320 13:47:53.155583 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:47:53 crc kubenswrapper[4973]: I0320 13:47:53.155939 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9befea21-7c31-4cd9-b5a2-2f86a1d32b28" containerName="glance-log" containerID="cri-o://29f1058286c4135a0040c93a2b493cec6fb2e8530a48378dd6d0e669e6e7f8c2" gracePeriod=30 Mar 20 13:47:53 crc kubenswrapper[4973]: I0320 13:47:53.156435 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="9befea21-7c31-4cd9-b5a2-2f86a1d32b28" containerName="glance-httpd" containerID="cri-o://80e73f3dbaee92c1970189eab012b3d029931f9afa64eb568818b4b37843f08c" gracePeriod=30 Mar 20 13:47:53 crc kubenswrapper[4973]: I0320 13:47:53.187695 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:47:53 crc kubenswrapper[4973]: W0320 13:47:53.188113 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc31768ff_7740_4ba0_a355_02b5ae3b75f0.slice/crio-8f42022327d63df885b4b9894518ca73fc71869eb86f996920617d1aa0290fa8 WatchSource:0}: Error finding container 8f42022327d63df885b4b9894518ca73fc71869eb86f996920617d1aa0290fa8: Status 404 returned error can't find the container with id 
8f42022327d63df885b4b9894518ca73fc71869eb86f996920617d1aa0290fa8 Mar 20 13:47:53 crc kubenswrapper[4973]: I0320 13:47:53.285900 4973 generic.go:334] "Generic (PLEG): container finished" podID="9befea21-7c31-4cd9-b5a2-2f86a1d32b28" containerID="29f1058286c4135a0040c93a2b493cec6fb2e8530a48378dd6d0e669e6e7f8c2" exitCode=143 Mar 20 13:47:53 crc kubenswrapper[4973]: I0320 13:47:53.286174 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9befea21-7c31-4cd9-b5a2-2f86a1d32b28","Type":"ContainerDied","Data":"29f1058286c4135a0040c93a2b493cec6fb2e8530a48378dd6d0e669e6e7f8c2"} Mar 20 13:47:53 crc kubenswrapper[4973]: I0320 13:47:53.289752 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c31768ff-7740-4ba0-a355-02b5ae3b75f0","Type":"ContainerStarted","Data":"8f42022327d63df885b4b9894518ca73fc71869eb86f996920617d1aa0290fa8"} Mar 20 13:47:54 crc kubenswrapper[4973]: I0320 13:47:54.075119 4973 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod07fe89fe-2971-4d79-afd3-4688be48e51c"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod07fe89fe-2971-4d79-afd3-4688be48e51c] : Timed out while waiting for systemd to remove kubepods-besteffort-pod07fe89fe_2971_4d79_afd3_4688be48e51c.slice" Mar 20 13:47:54 crc kubenswrapper[4973]: I0320 13:47:54.303092 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c31768ff-7740-4ba0-a355-02b5ae3b75f0","Type":"ContainerStarted","Data":"cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9"} Mar 20 13:47:54 crc kubenswrapper[4973]: I0320 13:47:54.303204 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="c31768ff-7740-4ba0-a355-02b5ae3b75f0" containerName="nova-cell0-conductor-conductor" 
containerID="cri-o://cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9" gracePeriod=30 Mar 20 13:47:54 crc kubenswrapper[4973]: I0320 13:47:54.303612 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 20 13:47:54 crc kubenswrapper[4973]: I0320 13:47:54.325160 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.325137232 podStartE2EDuration="2.325137232s" podCreationTimestamp="2026-03-20 13:47:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:54.32139774 +0000 UTC m=+1595.065067474" watchObservedRunningTime="2026-03-20 13:47:54.325137232 +0000 UTC m=+1595.068806976" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.140744 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.158694 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.159207 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerName="ceilometer-central-agent" containerID="cri-o://96524c281ec7ab0aa8d518697d9c609a4172960bae40cd605b3ea7e8c07ecec5" gracePeriod=30 Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.159281 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerName="ceilometer-notification-agent" containerID="cri-o://0bb262cc478a46cb3b27dce3073603af02b5cda3d5ba8a2d7a6d8cb1c6c5186d" gracePeriod=30 Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.159268 4973 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerName="sg-core" containerID="cri-o://6f485e1193eb59eec8cac3a577afb5d4c6d8aace6dcfe887c53bb781ae504f05" gracePeriod=30 Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.159309 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerName="proxy-httpd" containerID="cri-o://50560a38936136aea96c0c108b2294547f3dec7b9f529037a8d3fdc030739533" gracePeriod=30 Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.160512 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\") pod \"446ab543-b039-4065-964f-945824ddec63\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.160675 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/446ab543-b039-4065-964f-945824ddec63-logs\") pod \"446ab543-b039-4065-964f-945824ddec63\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.160716 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssctt\" (UniqueName: \"kubernetes.io/projected/446ab543-b039-4065-964f-945824ddec63-kube-api-access-ssctt\") pod \"446ab543-b039-4065-964f-945824ddec63\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.160772 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/446ab543-b039-4065-964f-945824ddec63-httpd-run\") pod \"446ab543-b039-4065-964f-945824ddec63\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " Mar 20 13:47:55 
crc kubenswrapper[4973]: I0320 13:47:55.160831 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-scripts\") pod \"446ab543-b039-4065-964f-945824ddec63\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.160882 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-combined-ca-bundle\") pod \"446ab543-b039-4065-964f-945824ddec63\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.162127 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/446ab543-b039-4065-964f-945824ddec63-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "446ab543-b039-4065-964f-945824ddec63" (UID: "446ab543-b039-4065-964f-945824ddec63"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.162436 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/446ab543-b039-4065-964f-945824ddec63-logs" (OuterVolumeSpecName: "logs") pod "446ab543-b039-4065-964f-945824ddec63" (UID: "446ab543-b039-4065-964f-945824ddec63"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.162846 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-config-data\") pod \"446ab543-b039-4065-964f-945824ddec63\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.163116 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-public-tls-certs\") pod \"446ab543-b039-4065-964f-945824ddec63\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.164038 4973 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/446ab543-b039-4065-964f-945824ddec63-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.164059 4973 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/446ab543-b039-4065-964f-945824ddec63-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.167020 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-scripts" (OuterVolumeSpecName: "scripts") pod "446ab543-b039-4065-964f-945824ddec63" (UID: "446ab543-b039-4065-964f-945824ddec63"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.204379 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/446ab543-b039-4065-964f-945824ddec63-kube-api-access-ssctt" (OuterVolumeSpecName: "kube-api-access-ssctt") pod "446ab543-b039-4065-964f-945824ddec63" (UID: "446ab543-b039-4065-964f-945824ddec63"). InnerVolumeSpecName "kube-api-access-ssctt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:55 crc kubenswrapper[4973]: E0320 13:47:55.262569 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564 podName:446ab543-b039-4065-964f-945824ddec63 nodeName:}" failed. No retries permitted until 2026-03-20 13:47:55.762540409 +0000 UTC m=+1596.506210153 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "glance" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564") pod "446ab543-b039-4065-964f-945824ddec63" (UID: "446ab543-b039-4065-964f-945824ddec63") : kubernetes.io/csi: Unmounter.TearDownAt failed: rpc error: code = Unknown desc = check target path: could not get consistent content of /proc/mounts after 3 attempts Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.266885 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssctt\" (UniqueName: \"kubernetes.io/projected/446ab543-b039-4065-964f-945824ddec63-kube-api-access-ssctt\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.266914 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.277539 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded 
for volume "kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-config-data" (OuterVolumeSpecName: "config-data") pod "446ab543-b039-4065-964f-945824ddec63" (UID: "446ab543-b039-4065-964f-945824ddec63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.286782 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "446ab543-b039-4065-964f-945824ddec63" (UID: "446ab543-b039-4065-964f-945824ddec63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.304182 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "446ab543-b039-4065-964f-945824ddec63" (UID: "446ab543-b039-4065-964f-945824ddec63"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.319565 4973 generic.go:334] "Generic (PLEG): container finished" podID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerID="6f485e1193eb59eec8cac3a577afb5d4c6d8aace6dcfe887c53bb781ae504f05" exitCode=2 Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.319647 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1e6f898-edd9-4fd4-97ad-80bb06786c23","Type":"ContainerDied","Data":"6f485e1193eb59eec8cac3a577afb5d4c6d8aace6dcfe887c53bb781ae504f05"} Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.323328 4973 generic.go:334] "Generic (PLEG): container finished" podID="446ab543-b039-4065-964f-945824ddec63" containerID="d089e46bbfe0c52652e56f493ae3b60a798e861828b80405b16d6c6a7ae423ae" exitCode=0 Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.323391 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"446ab543-b039-4065-964f-945824ddec63","Type":"ContainerDied","Data":"d089e46bbfe0c52652e56f493ae3b60a798e861828b80405b16d6c6a7ae423ae"} Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.323419 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"446ab543-b039-4065-964f-945824ddec63","Type":"ContainerDied","Data":"087378d9e75af5932d6ffb6f2b09ab926ea359a7002e69608ba724c3dbcc2e78"} Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.323435 4973 scope.go:117] "RemoveContainer" containerID="d089e46bbfe0c52652e56f493ae3b60a798e861828b80405b16d6c6a7ae423ae" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.323608 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.376359 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.376400 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.376418 4973 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/446ab543-b039-4065-964f-945824ddec63-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.404103 4973 scope.go:117] "RemoveContainer" containerID="250bc14e923b459bc4688d4c37932bb175e6fd2bd1c85de12c3664484335bf8e" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.454689 4973 scope.go:117] "RemoveContainer" containerID="d089e46bbfe0c52652e56f493ae3b60a798e861828b80405b16d6c6a7ae423ae" Mar 20 13:47:55 crc kubenswrapper[4973]: E0320 13:47:55.456811 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d089e46bbfe0c52652e56f493ae3b60a798e861828b80405b16d6c6a7ae423ae\": container with ID starting with d089e46bbfe0c52652e56f493ae3b60a798e861828b80405b16d6c6a7ae423ae not found: ID does not exist" containerID="d089e46bbfe0c52652e56f493ae3b60a798e861828b80405b16d6c6a7ae423ae" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.456855 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d089e46bbfe0c52652e56f493ae3b60a798e861828b80405b16d6c6a7ae423ae"} err="failed to get container status 
\"d089e46bbfe0c52652e56f493ae3b60a798e861828b80405b16d6c6a7ae423ae\": rpc error: code = NotFound desc = could not find container \"d089e46bbfe0c52652e56f493ae3b60a798e861828b80405b16d6c6a7ae423ae\": container with ID starting with d089e46bbfe0c52652e56f493ae3b60a798e861828b80405b16d6c6a7ae423ae not found: ID does not exist" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.456890 4973 scope.go:117] "RemoveContainer" containerID="250bc14e923b459bc4688d4c37932bb175e6fd2bd1c85de12c3664484335bf8e" Mar 20 13:47:55 crc kubenswrapper[4973]: E0320 13:47:55.460654 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"250bc14e923b459bc4688d4c37932bb175e6fd2bd1c85de12c3664484335bf8e\": container with ID starting with 250bc14e923b459bc4688d4c37932bb175e6fd2bd1c85de12c3664484335bf8e not found: ID does not exist" containerID="250bc14e923b459bc4688d4c37932bb175e6fd2bd1c85de12c3664484335bf8e" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.460695 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"250bc14e923b459bc4688d4c37932bb175e6fd2bd1c85de12c3664484335bf8e"} err="failed to get container status \"250bc14e923b459bc4688d4c37932bb175e6fd2bd1c85de12c3664484335bf8e\": rpc error: code = NotFound desc = could not find container \"250bc14e923b459bc4688d4c37932bb175e6fd2bd1c85de12c3664484335bf8e\": container with ID starting with 250bc14e923b459bc4688d4c37932bb175e6fd2bd1c85de12c3664484335bf8e not found: ID does not exist" Mar 20 13:47:55 crc kubenswrapper[4973]: E0320 13:47:55.742227 4973 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1e6f898_edd9_4fd4_97ad_80bb06786c23.slice/crio-conmon-50560a38936136aea96c0c108b2294547f3dec7b9f529037a8d3fdc030739533.scope\": RecentStats: unable to find data in memory cache]" Mar 20 
13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.784391 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\") pod \"446ab543-b039-4065-964f-945824ddec63\" (UID: \"446ab543-b039-4065-964f-945824ddec63\") " Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.806936 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564" (OuterVolumeSpecName: "glance") pod "446ab543-b039-4065-964f-945824ddec63" (UID: "446ab543-b039-4065-964f-945824ddec63"). InnerVolumeSpecName "pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.891518 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.891985 4973 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\") on node \"crc\" " Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.957405 4973 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.957914 4973 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564") on node "crc" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.984314 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.994605 4973 reconciler_common.go:293] "Volume detached for volume \"pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:55 crc kubenswrapper[4973]: I0320 13:47:55.999455 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:47:56 crc kubenswrapper[4973]: E0320 13:47:56.001068 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="446ab543-b039-4065-964f-945824ddec63" containerName="glance-log" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.001095 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="446ab543-b039-4065-964f-945824ddec63" containerName="glance-log" Mar 20 13:47:56 crc kubenswrapper[4973]: E0320 13:47:56.001168 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="446ab543-b039-4065-964f-945824ddec63" containerName="glance-httpd" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.001227 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="446ab543-b039-4065-964f-945824ddec63" containerName="glance-httpd" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.001665 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="446ab543-b039-4065-964f-945824ddec63" containerName="glance-log" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.001743 4973 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="446ab543-b039-4065-964f-945824ddec63" containerName="glance-httpd" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.004545 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.007781 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.008104 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.021907 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.199843 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154777eb-43ed-484d-9e6e-f2bd09fecf57-scripts\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.199912 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/154777eb-43ed-484d-9e6e-f2bd09fecf57-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.199971 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154777eb-43ed-484d-9e6e-f2bd09fecf57-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " 
pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.200286 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154777eb-43ed-484d-9e6e-f2bd09fecf57-logs\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.200353 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/154777eb-43ed-484d-9e6e-f2bd09fecf57-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.200589 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7rdp\" (UniqueName: \"kubernetes.io/projected/154777eb-43ed-484d-9e6e-f2bd09fecf57-kube-api-access-f7rdp\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.200762 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.200856 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154777eb-43ed-484d-9e6e-f2bd09fecf57-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.302369 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7rdp\" (UniqueName: \"kubernetes.io/projected/154777eb-43ed-484d-9e6e-f2bd09fecf57-kube-api-access-f7rdp\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.302445 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.302484 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154777eb-43ed-484d-9e6e-f2bd09fecf57-config-data\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.302526 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154777eb-43ed-484d-9e6e-f2bd09fecf57-scripts\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.302560 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/154777eb-43ed-484d-9e6e-f2bd09fecf57-httpd-run\") pod 
\"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.302585 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154777eb-43ed-484d-9e6e-f2bd09fecf57-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.302652 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154777eb-43ed-484d-9e6e-f2bd09fecf57-logs\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.302669 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/154777eb-43ed-484d-9e6e-f2bd09fecf57-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.304593 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/154777eb-43ed-484d-9e6e-f2bd09fecf57-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.305721 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/154777eb-43ed-484d-9e6e-f2bd09fecf57-logs\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " 
pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.308865 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/154777eb-43ed-484d-9e6e-f2bd09fecf57-scripts\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.308998 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/154777eb-43ed-484d-9e6e-f2bd09fecf57-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.310406 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/154777eb-43ed-484d-9e6e-f2bd09fecf57-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.314104 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.314152 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/da3d9df538268a2045279e6c35b80a9f9f98a48c1b51b6173b49a5c36304d8af/globalmount\"" pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.317587 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/154777eb-43ed-484d-9e6e-f2bd09fecf57-config-data\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.333442 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7rdp\" (UniqueName: \"kubernetes.io/projected/154777eb-43ed-484d-9e6e-f2bd09fecf57-kube-api-access-f7rdp\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.338934 4973 generic.go:334] "Generic (PLEG): container finished" podID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerID="50560a38936136aea96c0c108b2294547f3dec7b9f529037a8d3fdc030739533" exitCode=0 Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.339149 4973 generic.go:334] "Generic (PLEG): container finished" podID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerID="0bb262cc478a46cb3b27dce3073603af02b5cda3d5ba8a2d7a6d8cb1c6c5186d" exitCode=0 Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.339011 4973 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"a1e6f898-edd9-4fd4-97ad-80bb06786c23","Type":"ContainerDied","Data":"50560a38936136aea96c0c108b2294547f3dec7b9f529037a8d3fdc030739533"} Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.339402 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1e6f898-edd9-4fd4-97ad-80bb06786c23","Type":"ContainerDied","Data":"0bb262cc478a46cb3b27dce3073603af02b5cda3d5ba8a2d7a6d8cb1c6c5186d"} Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.368208 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-925d9a86-425f-4e0f-b70d-8ee7d0bf9564\") pod \"glance-default-external-api-0\" (UID: \"154777eb-43ed-484d-9e6e-f2bd09fecf57\") " pod="openstack/glance-default-external-api-0" Mar 20 13:47:56 crc kubenswrapper[4973]: I0320 13:47:56.628198 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.041007 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.132747 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-scripts\") pod \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.132862 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-httpd-run\") pod \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.132907 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-combined-ca-bundle\") pod \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.133561 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7\") pod \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.133617 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmdlq\" (UniqueName: \"kubernetes.io/projected/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-kube-api-access-xmdlq\") pod \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.133651 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-logs\") pod \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.133725 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-config-data\") pod \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.133825 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-internal-tls-certs\") pod \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\" (UID: \"9befea21-7c31-4cd9-b5a2-2f86a1d32b28\") " Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.135676 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9befea21-7c31-4cd9-b5a2-2f86a1d32b28" (UID: "9befea21-7c31-4cd9-b5a2-2f86a1d32b28"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.139826 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-logs" (OuterVolumeSpecName: "logs") pod "9befea21-7c31-4cd9-b5a2-2f86a1d32b28" (UID: "9befea21-7c31-4cd9-b5a2-2f86a1d32b28"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.146069 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-kube-api-access-xmdlq" (OuterVolumeSpecName: "kube-api-access-xmdlq") pod "9befea21-7c31-4cd9-b5a2-2f86a1d32b28" (UID: "9befea21-7c31-4cd9-b5a2-2f86a1d32b28"). InnerVolumeSpecName "kube-api-access-xmdlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.147953 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-scripts" (OuterVolumeSpecName: "scripts") pod "9befea21-7c31-4cd9-b5a2-2f86a1d32b28" (UID: "9befea21-7c31-4cd9-b5a2-2f86a1d32b28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.190177 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7" (OuterVolumeSpecName: "glance") pod "9befea21-7c31-4cd9-b5a2-2f86a1d32b28" (UID: "9befea21-7c31-4cd9-b5a2-2f86a1d32b28"). InnerVolumeSpecName "pvc-4da075d1-9614-421d-9161-1d84cabf00c7". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.216519 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9befea21-7c31-4cd9-b5a2-2f86a1d32b28" (UID: "9befea21-7c31-4cd9-b5a2-2f86a1d32b28"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.240527 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9befea21-7c31-4cd9-b5a2-2f86a1d32b28" (UID: "9befea21-7c31-4cd9-b5a2-2f86a1d32b28"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.242093 4973 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.242122 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.242132 4973 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.242144 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.242169 4973 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4da075d1-9614-421d-9161-1d84cabf00c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7\") on node \"crc\" " Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.242183 4973 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-xmdlq\" (UniqueName: \"kubernetes.io/projected/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-kube-api-access-xmdlq\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.242196 4973 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.251925 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-config-data" (OuterVolumeSpecName: "config-data") pod "9befea21-7c31-4cd9-b5a2-2f86a1d32b28" (UID: "9befea21-7c31-4cd9-b5a2-2f86a1d32b28"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.304659 4973 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.305061 4973 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4da075d1-9614-421d-9161-1d84cabf00c7" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7") on node "crc" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.344265 4973 reconciler_common.go:293] "Volume detached for volume \"pvc-4da075d1-9614-421d-9161-1d84cabf00c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.347993 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9befea21-7c31-4cd9-b5a2-2f86a1d32b28-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.358612 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.365712 4973 generic.go:334] "Generic (PLEG): container finished" podID="9befea21-7c31-4cd9-b5a2-2f86a1d32b28" containerID="80e73f3dbaee92c1970189eab012b3d029931f9afa64eb568818b4b37843f08c" exitCode=0 Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.365763 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9befea21-7c31-4cd9-b5a2-2f86a1d32b28","Type":"ContainerDied","Data":"80e73f3dbaee92c1970189eab012b3d029931f9afa64eb568818b4b37843f08c"} Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.365789 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"9befea21-7c31-4cd9-b5a2-2f86a1d32b28","Type":"ContainerDied","Data":"437e3854f9943662a434b350517f6830327bba98666c5a03a7d1b3db2ecd96b7"} Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.365806 4973 scope.go:117] 
"RemoveContainer" containerID="80e73f3dbaee92c1970189eab012b3d029931f9afa64eb568818b4b37843f08c" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.365923 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.375411 4973 generic.go:334] "Generic (PLEG): container finished" podID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerID="96524c281ec7ab0aa8d518697d9c609a4172960bae40cd605b3ea7e8c07ecec5" exitCode=0 Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.375454 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1e6f898-edd9-4fd4-97ad-80bb06786c23","Type":"ContainerDied","Data":"96524c281ec7ab0aa8d518697d9c609a4172960bae40cd605b3ea7e8c07ecec5"} Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.411069 4973 scope.go:117] "RemoveContainer" containerID="29f1058286c4135a0040c93a2b493cec6fb2e8530a48378dd6d0e669e6e7f8c2" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.416444 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.426596 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.443745 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.449938 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-config-data\") pod \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.449997 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-scripts\") pod \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.450092 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1e6f898-edd9-4fd4-97ad-80bb06786c23-run-httpd\") pod \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.450256 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1e6f898-edd9-4fd4-97ad-80bb06786c23-log-httpd\") pod \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.450612 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-sg-core-conf-yaml\") pod \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 
13:47:57.450667 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9fwz\" (UniqueName: \"kubernetes.io/projected/a1e6f898-edd9-4fd4-97ad-80bb06786c23-kube-api-access-z9fwz\") pod \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.450716 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-combined-ca-bundle\") pod \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\" (UID: \"a1e6f898-edd9-4fd4-97ad-80bb06786c23\") " Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.455038 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1e6f898-edd9-4fd4-97ad-80bb06786c23-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a1e6f898-edd9-4fd4-97ad-80bb06786c23" (UID: "a1e6f898-edd9-4fd4-97ad-80bb06786c23"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.457478 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.457727 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a1e6f898-edd9-4fd4-97ad-80bb06786c23-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a1e6f898-edd9-4fd4-97ad-80bb06786c23" (UID: "a1e6f898-edd9-4fd4-97ad-80bb06786c23"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:47:57 crc kubenswrapper[4973]: E0320 13:47:57.458010 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerName="ceilometer-central-agent" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.458023 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerName="ceilometer-central-agent" Mar 20 13:47:57 crc kubenswrapper[4973]: E0320 13:47:57.458056 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerName="sg-core" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.458062 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerName="sg-core" Mar 20 13:47:57 crc kubenswrapper[4973]: E0320 13:47:57.458070 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerName="ceilometer-notification-agent" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.458077 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerName="ceilometer-notification-agent" Mar 20 13:47:57 crc kubenswrapper[4973]: E0320 13:47:57.458092 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9befea21-7c31-4cd9-b5a2-2f86a1d32b28" containerName="glance-log" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.458098 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="9befea21-7c31-4cd9-b5a2-2f86a1d32b28" containerName="glance-log" Mar 20 13:47:57 crc kubenswrapper[4973]: E0320 13:47:57.458112 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerName="proxy-httpd" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.458118 4973 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerName="proxy-httpd" Mar 20 13:47:57 crc kubenswrapper[4973]: E0320 13:47:57.458130 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9befea21-7c31-4cd9-b5a2-2f86a1d32b28" containerName="glance-httpd" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.458135 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="9befea21-7c31-4cd9-b5a2-2f86a1d32b28" containerName="glance-httpd" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.458368 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="9befea21-7c31-4cd9-b5a2-2f86a1d32b28" containerName="glance-log" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.458387 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerName="proxy-httpd" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.458400 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="9befea21-7c31-4cd9-b5a2-2f86a1d32b28" containerName="glance-httpd" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.458406 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerName="sg-core" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.458417 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerName="ceilometer-notification-agent" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.458429 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" containerName="ceilometer-central-agent" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.460002 4973 scope.go:117] "RemoveContainer" containerID="80e73f3dbaee92c1970189eab012b3d029931f9afa64eb568818b4b37843f08c" Mar 20 13:47:57 crc kubenswrapper[4973]: E0320 13:47:57.465913 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"80e73f3dbaee92c1970189eab012b3d029931f9afa64eb568818b4b37843f08c\": container with ID starting with 80e73f3dbaee92c1970189eab012b3d029931f9afa64eb568818b4b37843f08c not found: ID does not exist" containerID="80e73f3dbaee92c1970189eab012b3d029931f9afa64eb568818b4b37843f08c" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.465967 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80e73f3dbaee92c1970189eab012b3d029931f9afa64eb568818b4b37843f08c"} err="failed to get container status \"80e73f3dbaee92c1970189eab012b3d029931f9afa64eb568818b4b37843f08c\": rpc error: code = NotFound desc = could not find container \"80e73f3dbaee92c1970189eab012b3d029931f9afa64eb568818b4b37843f08c\": container with ID starting with 80e73f3dbaee92c1970189eab012b3d029931f9afa64eb568818b4b37843f08c not found: ID does not exist" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.465994 4973 scope.go:117] "RemoveContainer" containerID="29f1058286c4135a0040c93a2b493cec6fb2e8530a48378dd6d0e669e6e7f8c2" Mar 20 13:47:57 crc kubenswrapper[4973]: E0320 13:47:57.466273 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29f1058286c4135a0040c93a2b493cec6fb2e8530a48378dd6d0e669e6e7f8c2\": container with ID starting with 29f1058286c4135a0040c93a2b493cec6fb2e8530a48378dd6d0e669e6e7f8c2 not found: ID does not exist" containerID="29f1058286c4135a0040c93a2b493cec6fb2e8530a48378dd6d0e669e6e7f8c2" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.466298 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29f1058286c4135a0040c93a2b493cec6fb2e8530a48378dd6d0e669e6e7f8c2"} err="failed to get container status \"29f1058286c4135a0040c93a2b493cec6fb2e8530a48378dd6d0e669e6e7f8c2\": rpc error: code = NotFound desc = could not find container 
\"29f1058286c4135a0040c93a2b493cec6fb2e8530a48378dd6d0e669e6e7f8c2\": container with ID starting with 29f1058286c4135a0040c93a2b493cec6fb2e8530a48378dd6d0e669e6e7f8c2 not found: ID does not exist" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.469186 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.472307 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.472142 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.485506 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1e6f898-edd9-4fd4-97ad-80bb06786c23-kube-api-access-z9fwz" (OuterVolumeSpecName: "kube-api-access-z9fwz") pod "a1e6f898-edd9-4fd4-97ad-80bb06786c23" (UID: "a1e6f898-edd9-4fd4-97ad-80bb06786c23"). InnerVolumeSpecName "kube-api-access-z9fwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.485610 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-scripts" (OuterVolumeSpecName: "scripts") pod "a1e6f898-edd9-4fd4-97ad-80bb06786c23" (UID: "a1e6f898-edd9-4fd4-97ad-80bb06786c23"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.492814 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.522711 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a1e6f898-edd9-4fd4-97ad-80bb06786c23" (UID: "a1e6f898-edd9-4fd4-97ad-80bb06786c23"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.553619 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4da075d1-9614-421d-9161-1d84cabf00c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.553672 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c476652-ce6b-4652-8f8a-9415b0be7465-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.553692 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c476652-ce6b-4652-8f8a-9415b0be7465-logs\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.553736 4973 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c476652-ce6b-4652-8f8a-9415b0be7465-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.553765 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c476652-ce6b-4652-8f8a-9415b0be7465-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.553858 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdnr8\" (UniqueName: \"kubernetes.io/projected/8c476652-ce6b-4652-8f8a-9415b0be7465-kube-api-access-jdnr8\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.553895 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c476652-ce6b-4652-8f8a-9415b0be7465-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.553939 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c476652-ce6b-4652-8f8a-9415b0be7465-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc 
kubenswrapper[4973]: I0320 13:47:57.554003 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.554015 4973 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1e6f898-edd9-4fd4-97ad-80bb06786c23-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.554024 4973 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a1e6f898-edd9-4fd4-97ad-80bb06786c23-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.554033 4973 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.554043 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9fwz\" (UniqueName: \"kubernetes.io/projected/a1e6f898-edd9-4fd4-97ad-80bb06786c23-kube-api-access-z9fwz\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.596820 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a1e6f898-edd9-4fd4-97ad-80bb06786c23" (UID: "a1e6f898-edd9-4fd4-97ad-80bb06786c23"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.628006 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-config-data" (OuterVolumeSpecName: "config-data") pod "a1e6f898-edd9-4fd4-97ad-80bb06786c23" (UID: "a1e6f898-edd9-4fd4-97ad-80bb06786c23"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.655796 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c476652-ce6b-4652-8f8a-9415b0be7465-config-data\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.655852 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c476652-ce6b-4652-8f8a-9415b0be7465-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.655974 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdnr8\" (UniqueName: \"kubernetes.io/projected/8c476652-ce6b-4652-8f8a-9415b0be7465-kube-api-access-jdnr8\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.656022 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c476652-ce6b-4652-8f8a-9415b0be7465-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " 
pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.656085 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c476652-ce6b-4652-8f8a-9415b0be7465-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.656126 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4da075d1-9614-421d-9161-1d84cabf00c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.656165 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c476652-ce6b-4652-8f8a-9415b0be7465-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.656187 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c476652-ce6b-4652-8f8a-9415b0be7465-logs\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.656293 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.656309 4973 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a1e6f898-edd9-4fd4-97ad-80bb06786c23-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.656673 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8c476652-ce6b-4652-8f8a-9415b0be7465-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.657288 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8c476652-ce6b-4652-8f8a-9415b0be7465-logs\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.661682 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c476652-ce6b-4652-8f8a-9415b0be7465-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.661997 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8c476652-ce6b-4652-8f8a-9415b0be7465-scripts\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.668001 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8c476652-ce6b-4652-8f8a-9415b0be7465-config-data\") pod \"glance-default-internal-api-0\" (UID: 
\"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.668587 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c476652-ce6b-4652-8f8a-9415b0be7465-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.669316 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.669420 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4da075d1-9614-421d-9161-1d84cabf00c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/5c263bf6e6632ecbb6820e7ea731ec6b961228752a610e87fa7ed7c4de533cfd/globalmount\"" pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.681271 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdnr8\" (UniqueName: \"kubernetes.io/projected/8c476652-ce6b-4652-8f8a-9415b0be7465-kube-api-access-jdnr8\") pod \"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.719438 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4da075d1-9614-421d-9161-1d84cabf00c7\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-4da075d1-9614-421d-9161-1d84cabf00c7\") pod 
\"glance-default-internal-api-0\" (UID: \"8c476652-ce6b-4652-8f8a-9415b0be7465\") " pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.799850 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.992096 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="446ab543-b039-4065-964f-945824ddec63" path="/var/lib/kubelet/pods/446ab543-b039-4065-964f-945824ddec63/volumes" Mar 20 13:47:57 crc kubenswrapper[4973]: I0320 13:47:57.994289 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9befea21-7c31-4cd9-b5a2-2f86a1d32b28" path="/var/lib/kubelet/pods/9befea21-7c31-4cd9-b5a2-2f86a1d32b28/volumes" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.417962 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"154777eb-43ed-484d-9e6e-f2bd09fecf57","Type":"ContainerStarted","Data":"2de67b1e65c90965f31a16fd520daae088bec543ceb7dc93d236691b4afd989a"} Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.418315 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"154777eb-43ed-484d-9e6e-f2bd09fecf57","Type":"ContainerStarted","Data":"9699decafb181af2023e5f8c42c860ff263b4766b499bf2e347a54291bcbfcb5"} Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.431131 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a1e6f898-edd9-4fd4-97ad-80bb06786c23","Type":"ContainerDied","Data":"b97fbe14eb392bc16d9df3d0eaca9984eb4400e20493398d4eadcad7fd0d897c"} Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.431186 4973 scope.go:117] "RemoveContainer" containerID="50560a38936136aea96c0c108b2294547f3dec7b9f529037a8d3fdc030739533" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.431395 4973 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.483404 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.496229 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.527504 4973 scope.go:117] "RemoveContainer" containerID="6f485e1193eb59eec8cac3a577afb5d4c6d8aace6dcfe887c53bb781ae504f05" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.530507 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.533499 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.548795 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.548979 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.552398 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.593288 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4stw\" (UniqueName: \"kubernetes.io/projected/0239e23b-f2f5-438e-9bff-c80a19672a3c-kube-api-access-n4stw\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.593533 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-config-data\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.593729 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0239e23b-f2f5-438e-9bff-c80a19672a3c-run-httpd\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.593823 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.593905 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0239e23b-f2f5-438e-9bff-c80a19672a3c-log-httpd\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.593972 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-scripts\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.594119 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " 
pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.615715 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.663444 4973 scope.go:117] "RemoveContainer" containerID="0bb262cc478a46cb3b27dce3073603af02b5cda3d5ba8a2d7a6d8cb1c6c5186d" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.695778 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.695841 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4stw\" (UniqueName: \"kubernetes.io/projected/0239e23b-f2f5-438e-9bff-c80a19672a3c-kube-api-access-n4stw\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.695872 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-config-data\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.695954 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0239e23b-f2f5-438e-9bff-c80a19672a3c-run-httpd\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.695989 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.696018 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0239e23b-f2f5-438e-9bff-c80a19672a3c-log-httpd\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.696036 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-scripts\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.696749 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0239e23b-f2f5-438e-9bff-c80a19672a3c-run-httpd\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.697197 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0239e23b-f2f5-438e-9bff-c80a19672a3c-log-httpd\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.702197 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-scripts\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.704589 4973 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-config-data\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.715709 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4stw\" (UniqueName: \"kubernetes.io/projected/0239e23b-f2f5-438e-9bff-c80a19672a3c-kube-api-access-n4stw\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.716031 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.722239 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " pod="openstack/ceilometer-0" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.726831 4973 scope.go:117] "RemoveContainer" containerID="96524c281ec7ab0aa8d518697d9c609a4172960bae40cd605b3ea7e8c07ecec5" Mar 20 13:47:58 crc kubenswrapper[4973]: I0320 13:47:58.862224 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:47:59 crc kubenswrapper[4973]: I0320 13:47:59.433975 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:47:59 crc kubenswrapper[4973]: W0320 13:47:59.445431 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0239e23b_f2f5_438e_9bff_c80a19672a3c.slice/crio-1357545a0f42184d0c3ae902b76924a0574fa4a8b647495d58ee62228c8711d9 WatchSource:0}: Error finding container 1357545a0f42184d0c3ae902b76924a0574fa4a8b647495d58ee62228c8711d9: Status 404 returned error can't find the container with id 1357545a0f42184d0c3ae902b76924a0574fa4a8b647495d58ee62228c8711d9 Mar 20 13:47:59 crc kubenswrapper[4973]: I0320 13:47:59.471108 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"154777eb-43ed-484d-9e6e-f2bd09fecf57","Type":"ContainerStarted","Data":"5a316bc67636414426cc08be2e3aa226f20ee6f6721e4790a20281982c8060b5"} Mar 20 13:47:59 crc kubenswrapper[4973]: I0320 13:47:59.477902 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0239e23b-f2f5-438e-9bff-c80a19672a3c","Type":"ContainerStarted","Data":"1357545a0f42184d0c3ae902b76924a0574fa4a8b647495d58ee62228c8711d9"} Mar 20 13:47:59 crc kubenswrapper[4973]: I0320 13:47:59.480250 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c476652-ce6b-4652-8f8a-9415b0be7465","Type":"ContainerStarted","Data":"583b6d506f92a52ae1f7a1ef2b39b600b626b97bce0b8635444094ed6a79c25c"} Mar 20 13:47:59 crc kubenswrapper[4973]: I0320 13:47:59.480297 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c476652-ce6b-4652-8f8a-9415b0be7465","Type":"ContainerStarted","Data":"536ce153744d10b8ea58bc20551905d5e034c657d34b9db2a64de2d27f965879"} Mar 20 
13:47:59 crc kubenswrapper[4973]: I0320 13:47:59.498091 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.498069766 podStartE2EDuration="4.498069766s" podCreationTimestamp="2026-03-20 13:47:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:47:59.492084072 +0000 UTC m=+1600.235753816" watchObservedRunningTime="2026-03-20 13:47:59.498069766 +0000 UTC m=+1600.241739510" Mar 20 13:47:59 crc kubenswrapper[4973]: I0320 13:47:59.972050 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1e6f898-edd9-4fd4-97ad-80bb06786c23" path="/var/lib/kubelet/pods/a1e6f898-edd9-4fd4-97ad-80bb06786c23/volumes" Mar 20 13:48:00 crc kubenswrapper[4973]: I0320 13:48:00.161716 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566908-9w8fc"] Mar 20 13:48:00 crc kubenswrapper[4973]: I0320 13:48:00.163727 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-9w8fc" Mar 20 13:48:00 crc kubenswrapper[4973]: I0320 13:48:00.168947 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:48:00 crc kubenswrapper[4973]: I0320 13:48:00.169196 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:48:00 crc kubenswrapper[4973]: I0320 13:48:00.169319 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 13:48:00 crc kubenswrapper[4973]: I0320 13:48:00.208786 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-9w8fc"] Mar 20 13:48:00 crc kubenswrapper[4973]: I0320 13:48:00.313174 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67v8w\" (UniqueName: \"kubernetes.io/projected/7ed9fca6-1a1b-4489-9ca6-dab93e0615f3-kube-api-access-67v8w\") pod \"auto-csr-approver-29566908-9w8fc\" (UID: \"7ed9fca6-1a1b-4489-9ca6-dab93e0615f3\") " pod="openshift-infra/auto-csr-approver-29566908-9w8fc" Mar 20 13:48:00 crc kubenswrapper[4973]: I0320 13:48:00.415735 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67v8w\" (UniqueName: \"kubernetes.io/projected/7ed9fca6-1a1b-4489-9ca6-dab93e0615f3-kube-api-access-67v8w\") pod \"auto-csr-approver-29566908-9w8fc\" (UID: \"7ed9fca6-1a1b-4489-9ca6-dab93e0615f3\") " pod="openshift-infra/auto-csr-approver-29566908-9w8fc" Mar 20 13:48:00 crc kubenswrapper[4973]: I0320 13:48:00.424615 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:48:00 crc kubenswrapper[4973]: I0320 13:48:00.437978 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67v8w\" (UniqueName: 
\"kubernetes.io/projected/7ed9fca6-1a1b-4489-9ca6-dab93e0615f3-kube-api-access-67v8w\") pod \"auto-csr-approver-29566908-9w8fc\" (UID: \"7ed9fca6-1a1b-4489-9ca6-dab93e0615f3\") " pod="openshift-infra/auto-csr-approver-29566908-9w8fc" Mar 20 13:48:00 crc kubenswrapper[4973]: I0320 13:48:00.493264 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0239e23b-f2f5-438e-9bff-c80a19672a3c","Type":"ContainerStarted","Data":"2e9cc9a0c6cd5e26c62540e2498de8bc2223f2b38545008a6a1e17927bf6dd3c"} Mar 20 13:48:00 crc kubenswrapper[4973]: I0320 13:48:00.495408 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"8c476652-ce6b-4652-8f8a-9415b0be7465","Type":"ContainerStarted","Data":"991ede999068285f0654b06bd57947379fc8be8a77209571fdd00065504ff9ea"} Mar 20 13:48:00 crc kubenswrapper[4973]: I0320 13:48:00.525300 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.5252816129999998 podStartE2EDuration="3.525281613s" podCreationTimestamp="2026-03-20 13:47:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:00.5192964 +0000 UTC m=+1601.262966164" watchObservedRunningTime="2026-03-20 13:48:00.525281613 +0000 UTC m=+1601.268951357" Mar 20 13:48:00 crc kubenswrapper[4973]: I0320 13:48:00.607201 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-9w8fc" Mar 20 13:48:01 crc kubenswrapper[4973]: W0320 13:48:01.074649 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ed9fca6_1a1b_4489_9ca6_dab93e0615f3.slice/crio-69225fa4d05a27124ae5b72da157e08bb092b3c0b3ec47e970c51a58365b92cb WatchSource:0}: Error finding container 69225fa4d05a27124ae5b72da157e08bb092b3c0b3ec47e970c51a58365b92cb: Status 404 returned error can't find the container with id 69225fa4d05a27124ae5b72da157e08bb092b3c0b3ec47e970c51a58365b92cb Mar 20 13:48:01 crc kubenswrapper[4973]: I0320 13:48:01.075791 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-9w8fc"] Mar 20 13:48:01 crc kubenswrapper[4973]: I0320 13:48:01.506727 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0239e23b-f2f5-438e-9bff-c80a19672a3c","Type":"ContainerStarted","Data":"985f59da16877efa33d99869212240838eba6bb0b457e4e8e8403a0bb6f5059a"} Mar 20 13:48:01 crc kubenswrapper[4973]: I0320 13:48:01.509504 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566908-9w8fc" event={"ID":"7ed9fca6-1a1b-4489-9ca6-dab93e0615f3","Type":"ContainerStarted","Data":"69225fa4d05a27124ae5b72da157e08bb092b3c0b3ec47e970c51a58365b92cb"} Mar 20 13:48:02 crc kubenswrapper[4973]: I0320 13:48:02.525570 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0239e23b-f2f5-438e-9bff-c80a19672a3c","Type":"ContainerStarted","Data":"5f1bea883dab4eed5d5fe3072df4600bb68714d46771f0e312fbb7c13750e603"} Mar 20 13:48:02 crc kubenswrapper[4973]: E0320 13:48:02.694996 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
containerID="cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:48:02 crc kubenswrapper[4973]: E0320 13:48:02.697055 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:48:02 crc kubenswrapper[4973]: E0320 13:48:02.699336 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:48:02 crc kubenswrapper[4973]: E0320 13:48:02.699434 4973 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="c31768ff-7740-4ba0-a355-02b5ae3b75f0" containerName="nova-cell0-conductor-conductor" Mar 20 13:48:03 crc kubenswrapper[4973]: I0320 13:48:03.542570 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566908-9w8fc" event={"ID":"7ed9fca6-1a1b-4489-9ca6-dab93e0615f3","Type":"ContainerStarted","Data":"815cf8addeea77972c3c230d8a330e18ff0fb4a62a7f968fe769530bf6effc27"} Mar 20 13:48:03 crc kubenswrapper[4973]: I0320 13:48:03.558485 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566908-9w8fc" podStartSLOduration=1.700312877 podStartE2EDuration="3.558465914s" podCreationTimestamp="2026-03-20 13:48:00 +0000 UTC" firstStartedPulling="2026-03-20 13:48:01.077971819 +0000 
UTC m=+1601.821641563" lastFinishedPulling="2026-03-20 13:48:02.936124856 +0000 UTC m=+1603.679794600" observedRunningTime="2026-03-20 13:48:03.554171945 +0000 UTC m=+1604.297841689" watchObservedRunningTime="2026-03-20 13:48:03.558465914 +0000 UTC m=+1604.302135658" Mar 20 13:48:04 crc kubenswrapper[4973]: I0320 13:48:04.557954 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerName="ceilometer-central-agent" containerID="cri-o://2e9cc9a0c6cd5e26c62540e2498de8bc2223f2b38545008a6a1e17927bf6dd3c" gracePeriod=30 Mar 20 13:48:04 crc kubenswrapper[4973]: I0320 13:48:04.559462 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerName="ceilometer-notification-agent" containerID="cri-o://985f59da16877efa33d99869212240838eba6bb0b457e4e8e8403a0bb6f5059a" gracePeriod=30 Mar 20 13:48:04 crc kubenswrapper[4973]: I0320 13:48:04.558896 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0239e23b-f2f5-438e-9bff-c80a19672a3c","Type":"ContainerStarted","Data":"e300dd5f33c0854523a2fa05d1579f6d387707e68fa3b7eeb7899189ba5c131f"} Mar 20 13:48:04 crc kubenswrapper[4973]: I0320 13:48:04.559387 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerName="proxy-httpd" containerID="cri-o://e300dd5f33c0854523a2fa05d1579f6d387707e68fa3b7eeb7899189ba5c131f" gracePeriod=30 Mar 20 13:48:04 crc kubenswrapper[4973]: I0320 13:48:04.559607 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:48:04 crc kubenswrapper[4973]: I0320 13:48:04.559421 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerName="sg-core" containerID="cri-o://5f1bea883dab4eed5d5fe3072df4600bb68714d46771f0e312fbb7c13750e603" gracePeriod=30 Mar 20 13:48:04 crc kubenswrapper[4973]: I0320 13:48:04.586299 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.205783653 podStartE2EDuration="6.586275417s" podCreationTimestamp="2026-03-20 13:47:58 +0000 UTC" firstStartedPulling="2026-03-20 13:47:59.448572595 +0000 UTC m=+1600.192242339" lastFinishedPulling="2026-03-20 13:48:03.829064349 +0000 UTC m=+1604.572734103" observedRunningTime="2026-03-20 13:48:04.583218083 +0000 UTC m=+1605.326887847" watchObservedRunningTime="2026-03-20 13:48:04.586275417 +0000 UTC m=+1605.329945161" Mar 20 13:48:05 crc kubenswrapper[4973]: I0320 13:48:05.570533 4973 generic.go:334] "Generic (PLEG): container finished" podID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerID="e300dd5f33c0854523a2fa05d1579f6d387707e68fa3b7eeb7899189ba5c131f" exitCode=0 Mar 20 13:48:05 crc kubenswrapper[4973]: I0320 13:48:05.570823 4973 generic.go:334] "Generic (PLEG): container finished" podID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerID="5f1bea883dab4eed5d5fe3072df4600bb68714d46771f0e312fbb7c13750e603" exitCode=2 Mar 20 13:48:05 crc kubenswrapper[4973]: I0320 13:48:05.570834 4973 generic.go:334] "Generic (PLEG): container finished" podID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerID="985f59da16877efa33d99869212240838eba6bb0b457e4e8e8403a0bb6f5059a" exitCode=0 Mar 20 13:48:05 crc kubenswrapper[4973]: I0320 13:48:05.570605 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0239e23b-f2f5-438e-9bff-c80a19672a3c","Type":"ContainerDied","Data":"e300dd5f33c0854523a2fa05d1579f6d387707e68fa3b7eeb7899189ba5c131f"} Mar 20 13:48:05 crc kubenswrapper[4973]: I0320 13:48:05.570888 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0239e23b-f2f5-438e-9bff-c80a19672a3c","Type":"ContainerDied","Data":"5f1bea883dab4eed5d5fe3072df4600bb68714d46771f0e312fbb7c13750e603"} Mar 20 13:48:05 crc kubenswrapper[4973]: I0320 13:48:05.570905 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0239e23b-f2f5-438e-9bff-c80a19672a3c","Type":"ContainerDied","Data":"985f59da16877efa33d99869212240838eba6bb0b457e4e8e8403a0bb6f5059a"} Mar 20 13:48:05 crc kubenswrapper[4973]: I0320 13:48:05.573496 4973 generic.go:334] "Generic (PLEG): container finished" podID="7ed9fca6-1a1b-4489-9ca6-dab93e0615f3" containerID="815cf8addeea77972c3c230d8a330e18ff0fb4a62a7f968fe769530bf6effc27" exitCode=0 Mar 20 13:48:05 crc kubenswrapper[4973]: I0320 13:48:05.573825 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566908-9w8fc" event={"ID":"7ed9fca6-1a1b-4489-9ca6-dab93e0615f3","Type":"ContainerDied","Data":"815cf8addeea77972c3c230d8a330e18ff0fb4a62a7f968fe769530bf6effc27"} Mar 20 13:48:06 crc kubenswrapper[4973]: I0320 13:48:06.605178 4973 generic.go:334] "Generic (PLEG): container finished" podID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerID="2e9cc9a0c6cd5e26c62540e2498de8bc2223f2b38545008a6a1e17927bf6dd3c" exitCode=0 Mar 20 13:48:06 crc kubenswrapper[4973]: I0320 13:48:06.605390 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0239e23b-f2f5-438e-9bff-c80a19672a3c","Type":"ContainerDied","Data":"2e9cc9a0c6cd5e26c62540e2498de8bc2223f2b38545008a6a1e17927bf6dd3c"} Mar 20 13:48:06 crc kubenswrapper[4973]: I0320 13:48:06.629678 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:48:06 crc kubenswrapper[4973]: I0320 13:48:06.629727 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 20 13:48:06 crc kubenswrapper[4973]: I0320 
13:48:06.667128 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 13:48:06 crc kubenswrapper[4973]: I0320 13:48:06.680997 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 20 13:48:06 crc kubenswrapper[4973]: I0320 13:48:06.856865 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.021482 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-combined-ca-bundle\") pod \"0239e23b-f2f5-438e-9bff-c80a19672a3c\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.021563 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-config-data\") pod \"0239e23b-f2f5-438e-9bff-c80a19672a3c\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.021660 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0239e23b-f2f5-438e-9bff-c80a19672a3c-run-httpd\") pod \"0239e23b-f2f5-438e-9bff-c80a19672a3c\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.021712 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-scripts\") pod \"0239e23b-f2f5-438e-9bff-c80a19672a3c\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.021751 4973 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-n4stw\" (UniqueName: \"kubernetes.io/projected/0239e23b-f2f5-438e-9bff-c80a19672a3c-kube-api-access-n4stw\") pod \"0239e23b-f2f5-438e-9bff-c80a19672a3c\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.021802 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0239e23b-f2f5-438e-9bff-c80a19672a3c-log-httpd\") pod \"0239e23b-f2f5-438e-9bff-c80a19672a3c\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.021936 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-sg-core-conf-yaml\") pod \"0239e23b-f2f5-438e-9bff-c80a19672a3c\" (UID: \"0239e23b-f2f5-438e-9bff-c80a19672a3c\") " Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.023899 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0239e23b-f2f5-438e-9bff-c80a19672a3c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0239e23b-f2f5-438e-9bff-c80a19672a3c" (UID: "0239e23b-f2f5-438e-9bff-c80a19672a3c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.023919 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0239e23b-f2f5-438e-9bff-c80a19672a3c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0239e23b-f2f5-438e-9bff-c80a19672a3c" (UID: "0239e23b-f2f5-438e-9bff-c80a19672a3c"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.028529 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-scripts" (OuterVolumeSpecName: "scripts") pod "0239e23b-f2f5-438e-9bff-c80a19672a3c" (UID: "0239e23b-f2f5-438e-9bff-c80a19672a3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.029565 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-9w8fc" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.033704 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0239e23b-f2f5-438e-9bff-c80a19672a3c-kube-api-access-n4stw" (OuterVolumeSpecName: "kube-api-access-n4stw") pod "0239e23b-f2f5-438e-9bff-c80a19672a3c" (UID: "0239e23b-f2f5-438e-9bff-c80a19672a3c"). InnerVolumeSpecName "kube-api-access-n4stw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.066656 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0239e23b-f2f5-438e-9bff-c80a19672a3c" (UID: "0239e23b-f2f5-438e-9bff-c80a19672a3c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.124130 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-67v8w\" (UniqueName: \"kubernetes.io/projected/7ed9fca6-1a1b-4489-9ca6-dab93e0615f3-kube-api-access-67v8w\") pod \"7ed9fca6-1a1b-4489-9ca6-dab93e0615f3\" (UID: \"7ed9fca6-1a1b-4489-9ca6-dab93e0615f3\") " Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.124731 4973 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.124752 4973 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0239e23b-f2f5-438e-9bff-c80a19672a3c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.124776 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.124791 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4stw\" (UniqueName: \"kubernetes.io/projected/0239e23b-f2f5-438e-9bff-c80a19672a3c-kube-api-access-n4stw\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.124804 4973 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0239e23b-f2f5-438e-9bff-c80a19672a3c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.128900 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7ed9fca6-1a1b-4489-9ca6-dab93e0615f3-kube-api-access-67v8w" (OuterVolumeSpecName: 
"kube-api-access-67v8w") pod "7ed9fca6-1a1b-4489-9ca6-dab93e0615f3" (UID: "7ed9fca6-1a1b-4489-9ca6-dab93e0615f3"). InnerVolumeSpecName "kube-api-access-67v8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.153234 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0239e23b-f2f5-438e-9bff-c80a19672a3c" (UID: "0239e23b-f2f5-438e-9bff-c80a19672a3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.188556 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-config-data" (OuterVolumeSpecName: "config-data") pod "0239e23b-f2f5-438e-9bff-c80a19672a3c" (UID: "0239e23b-f2f5-438e-9bff-c80a19672a3c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.226606 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.226891 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0239e23b-f2f5-438e-9bff-c80a19672a3c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.226989 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-67v8w\" (UniqueName: \"kubernetes.io/projected/7ed9fca6-1a1b-4489-9ca6-dab93e0615f3-kube-api-access-67v8w\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.625871 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0239e23b-f2f5-438e-9bff-c80a19672a3c","Type":"ContainerDied","Data":"1357545a0f42184d0c3ae902b76924a0574fa4a8b647495d58ee62228c8711d9"} Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.626256 4973 scope.go:117] "RemoveContainer" containerID="e300dd5f33c0854523a2fa05d1579f6d387707e68fa3b7eeb7899189ba5c131f" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.625909 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.629375 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566908-9w8fc" event={"ID":"7ed9fca6-1a1b-4489-9ca6-dab93e0615f3","Type":"ContainerDied","Data":"69225fa4d05a27124ae5b72da157e08bb092b3c0b3ec47e970c51a58365b92cb"} Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.629414 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="69225fa4d05a27124ae5b72da157e08bb092b3c0b3ec47e970c51a58365b92cb" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.629487 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566908-9w8fc" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.630487 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.630538 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.660873 4973 scope.go:117] "RemoveContainer" containerID="5f1bea883dab4eed5d5fe3072df4600bb68714d46771f0e312fbb7c13750e603" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.683021 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:48:07 crc kubenswrapper[4973]: E0320 13:48:07.692672 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.693067 4973 scope.go:117] "RemoveContainer" 
containerID="985f59da16877efa33d99869212240838eba6bb0b457e4e8e8403a0bb6f5059a" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.696601 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:48:07 crc kubenswrapper[4973]: E0320 13:48:07.698524 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:48:07 crc kubenswrapper[4973]: E0320 13:48:07.703735 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:48:07 crc kubenswrapper[4973]: E0320 13:48:07.703821 4973 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="c31768ff-7740-4ba0-a355-02b5ae3b75f0" containerName="nova-cell0-conductor-conductor" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.717192 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-mjw5k"] Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.728149 4973 scope.go:117] "RemoveContainer" containerID="2e9cc9a0c6cd5e26c62540e2498de8bc2223f2b38545008a6a1e17927bf6dd3c" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.730645 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566902-mjw5k"] Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.739731 4973 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:48:07 crc kubenswrapper[4973]: E0320 13:48:07.740243 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerName="ceilometer-notification-agent" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.740258 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerName="ceilometer-notification-agent" Mar 20 13:48:07 crc kubenswrapper[4973]: E0320 13:48:07.740287 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7ed9fca6-1a1b-4489-9ca6-dab93e0615f3" containerName="oc" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.740293 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="7ed9fca6-1a1b-4489-9ca6-dab93e0615f3" containerName="oc" Mar 20 13:48:07 crc kubenswrapper[4973]: E0320 13:48:07.740316 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerName="sg-core" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.740323 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerName="sg-core" Mar 20 13:48:07 crc kubenswrapper[4973]: E0320 13:48:07.740334 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerName="proxy-httpd" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.740404 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerName="proxy-httpd" Mar 20 13:48:07 crc kubenswrapper[4973]: E0320 13:48:07.740427 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerName="ceilometer-central-agent" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.740433 4973 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerName="ceilometer-central-agent" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.740631 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerName="ceilometer-central-agent" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.740645 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerName="sg-core" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.740653 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerName="ceilometer-notification-agent" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.740678 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="7ed9fca6-1a1b-4489-9ca6-dab93e0615f3" containerName="oc" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.740688 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="0239e23b-f2f5-438e-9bff-c80a19672a3c" containerName="proxy-httpd" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.742793 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.745634 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.746427 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.760536 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.800527 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.800603 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.843032 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-config-data\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.843109 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-scripts\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.843178 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95c062dc-c672-49fc-bc93-cd0942214304-run-httpd\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 
20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.843314 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.843730 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.843875 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6qvr\" (UniqueName: \"kubernetes.io/projected/95c062dc-c672-49fc-bc93-cd0942214304-kube-api-access-w6qvr\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.843971 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95c062dc-c672-49fc-bc93-cd0942214304-log-httpd\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.878892 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.886329 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.946019 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.946327 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6qvr\" (UniqueName: \"kubernetes.io/projected/95c062dc-c672-49fc-bc93-cd0942214304-kube-api-access-w6qvr\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.946490 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95c062dc-c672-49fc-bc93-cd0942214304-log-httpd\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.946638 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-config-data\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.946710 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-scripts\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.946782 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95c062dc-c672-49fc-bc93-cd0942214304-run-httpd\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " 
pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.946909 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.947766 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95c062dc-c672-49fc-bc93-cd0942214304-log-httpd\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.947902 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95c062dc-c672-49fc-bc93-cd0942214304-run-httpd\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.952106 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.952149 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-scripts\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.952544 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.955778 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-config-data\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.973226 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6qvr\" (UniqueName: \"kubernetes.io/projected/95c062dc-c672-49fc-bc93-cd0942214304-kube-api-access-w6qvr\") pod \"ceilometer-0\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " pod="openstack/ceilometer-0" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.984613 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0239e23b-f2f5-438e-9bff-c80a19672a3c" path="/var/lib/kubelet/pods/0239e23b-f2f5-438e-9bff-c80a19672a3c/volumes" Mar 20 13:48:07 crc kubenswrapper[4973]: I0320 13:48:07.986063 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d" path="/var/lib/kubelet/pods/4aecf729-16cf-45ff-80bb-6d4f4cfd5f7d/volumes" Mar 20 13:48:08 crc kubenswrapper[4973]: I0320 13:48:08.067970 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:48:08 crc kubenswrapper[4973]: I0320 13:48:08.645525 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 13:48:08 crc kubenswrapper[4973]: I0320 13:48:08.645851 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 20 13:48:08 crc kubenswrapper[4973]: I0320 13:48:08.674434 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:48:09 crc kubenswrapper[4973]: I0320 13:48:09.657444 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95c062dc-c672-49fc-bc93-cd0942214304","Type":"ContainerStarted","Data":"df92844731ba9f1112fa05922b9360877c18bb28c791b94cdb9a58cbaeb6bfab"} Mar 20 13:48:09 crc kubenswrapper[4973]: I0320 13:48:09.658105 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95c062dc-c672-49fc-bc93-cd0942214304","Type":"ContainerStarted","Data":"ef19dd7e8154e50cc122bfc91c43fec3ac40c55545ea6b5db67f0c4770c25f99"} Mar 20 13:48:09 crc kubenswrapper[4973]: I0320 13:48:09.657568 4973 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:48:09 crc kubenswrapper[4973]: I0320 13:48:09.658133 4973 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.369426 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.503400 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.542125 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-f7kbp"] Mar 20 13:48:10 crc kubenswrapper[4973]: 
I0320 13:48:10.543896 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-f7kbp" Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.555378 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-f7kbp"] Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.613444 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0c659f5-9f0d-4133-a245-7def431f0b1a-operator-scripts\") pod \"aodh-db-create-f7kbp\" (UID: \"e0c659f5-9f0d-4133-a245-7def431f0b1a\") " pod="openstack/aodh-db-create-f7kbp" Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.613545 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95xrq\" (UniqueName: \"kubernetes.io/projected/e0c659f5-9f0d-4133-a245-7def431f0b1a-kube-api-access-95xrq\") pod \"aodh-db-create-f7kbp\" (UID: \"e0c659f5-9f0d-4133-a245-7def431f0b1a\") " pod="openstack/aodh-db-create-f7kbp" Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.671761 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-eeca-account-create-update-q7rsh"] Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.675618 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-eeca-account-create-update-q7rsh" Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.683291 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.722522 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-eeca-account-create-update-q7rsh"] Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.733791 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0c659f5-9f0d-4133-a245-7def431f0b1a-operator-scripts\") pod \"aodh-db-create-f7kbp\" (UID: \"e0c659f5-9f0d-4133-a245-7def431f0b1a\") " pod="openstack/aodh-db-create-f7kbp" Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.733926 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cc72d03-7a34-42ab-ad05-54dea5adfa04-operator-scripts\") pod \"aodh-eeca-account-create-update-q7rsh\" (UID: \"4cc72d03-7a34-42ab-ad05-54dea5adfa04\") " pod="openstack/aodh-eeca-account-create-update-q7rsh" Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.733986 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95xrq\" (UniqueName: \"kubernetes.io/projected/e0c659f5-9f0d-4133-a245-7def431f0b1a-kube-api-access-95xrq\") pod \"aodh-db-create-f7kbp\" (UID: \"e0c659f5-9f0d-4133-a245-7def431f0b1a\") " pod="openstack/aodh-db-create-f7kbp" Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.734444 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btl8c\" (UniqueName: \"kubernetes.io/projected/4cc72d03-7a34-42ab-ad05-54dea5adfa04-kube-api-access-btl8c\") pod \"aodh-eeca-account-create-update-q7rsh\" (UID: \"4cc72d03-7a34-42ab-ad05-54dea5adfa04\") " 
pod="openstack/aodh-eeca-account-create-update-q7rsh" Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.735559 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0c659f5-9f0d-4133-a245-7def431f0b1a-operator-scripts\") pod \"aodh-db-create-f7kbp\" (UID: \"e0c659f5-9f0d-4133-a245-7def431f0b1a\") " pod="openstack/aodh-db-create-f7kbp" Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.748212 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95c062dc-c672-49fc-bc93-cd0942214304","Type":"ContainerStarted","Data":"d170fdd0314fb19f8fa94a9d14ea44796906a9d4d3c39d1d7c945d9edc4e9612"} Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.759742 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95xrq\" (UniqueName: \"kubernetes.io/projected/e0c659f5-9f0d-4133-a245-7def431f0b1a-kube-api-access-95xrq\") pod \"aodh-db-create-f7kbp\" (UID: \"e0c659f5-9f0d-4133-a245-7def431f0b1a\") " pod="openstack/aodh-db-create-f7kbp" Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.836869 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btl8c\" (UniqueName: \"kubernetes.io/projected/4cc72d03-7a34-42ab-ad05-54dea5adfa04-kube-api-access-btl8c\") pod \"aodh-eeca-account-create-update-q7rsh\" (UID: \"4cc72d03-7a34-42ab-ad05-54dea5adfa04\") " pod="openstack/aodh-eeca-account-create-update-q7rsh" Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.837046 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cc72d03-7a34-42ab-ad05-54dea5adfa04-operator-scripts\") pod \"aodh-eeca-account-create-update-q7rsh\" (UID: \"4cc72d03-7a34-42ab-ad05-54dea5adfa04\") " pod="openstack/aodh-eeca-account-create-update-q7rsh" Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.845950 
4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cc72d03-7a34-42ab-ad05-54dea5adfa04-operator-scripts\") pod \"aodh-eeca-account-create-update-q7rsh\" (UID: \"4cc72d03-7a34-42ab-ad05-54dea5adfa04\") " pod="openstack/aodh-eeca-account-create-update-q7rsh" Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.854830 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btl8c\" (UniqueName: \"kubernetes.io/projected/4cc72d03-7a34-42ab-ad05-54dea5adfa04-kube-api-access-btl8c\") pod \"aodh-eeca-account-create-update-q7rsh\" (UID: \"4cc72d03-7a34-42ab-ad05-54dea5adfa04\") " pod="openstack/aodh-eeca-account-create-update-q7rsh" Mar 20 13:48:10 crc kubenswrapper[4973]: I0320 13:48:10.929251 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-f7kbp" Mar 20 13:48:11 crc kubenswrapper[4973]: I0320 13:48:11.008908 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-eeca-account-create-update-q7rsh" Mar 20 13:48:11 crc kubenswrapper[4973]: I0320 13:48:11.663147 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:48:11 crc kubenswrapper[4973]: I0320 13:48:11.663629 4973 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 13:48:11 crc kubenswrapper[4973]: I0320 13:48:11.786689 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95c062dc-c672-49fc-bc93-cd0942214304","Type":"ContainerStarted","Data":"65c8adf96888515f2e042fb5de02ac85f4b0ae7fa1b2ad2318e1d1c906d64177"} Mar 20 13:48:11 crc kubenswrapper[4973]: I0320 13:48:11.790474 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-f7kbp"] Mar 20 13:48:11 crc kubenswrapper[4973]: I0320 13:48:11.907794 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-eeca-account-create-update-q7rsh"] Mar 20 13:48:11 crc kubenswrapper[4973]: I0320 13:48:11.982769 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 20 13:48:12 crc kubenswrapper[4973]: I0320 13:48:12.404478 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:48:12 crc kubenswrapper[4973]: E0320 13:48:12.693881 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:48:12 crc kubenswrapper[4973]: E0320 13:48:12.698633 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: 
, exit code -1" containerID="cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:48:12 crc kubenswrapper[4973]: E0320 13:48:12.700680 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:48:12 crc kubenswrapper[4973]: E0320 13:48:12.700727 4973 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="c31768ff-7740-4ba0-a355-02b5ae3b75f0" containerName="nova-cell0-conductor-conductor" Mar 20 13:48:12 crc kubenswrapper[4973]: I0320 13:48:12.802224 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-eeca-account-create-update-q7rsh" event={"ID":"4cc72d03-7a34-42ab-ad05-54dea5adfa04","Type":"ContainerStarted","Data":"4ac0de1008deec27af56feed7d6e25375cde44a11d80e888ce389f842527df4c"} Mar 20 13:48:12 crc kubenswrapper[4973]: I0320 13:48:12.802306 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-eeca-account-create-update-q7rsh" event={"ID":"4cc72d03-7a34-42ab-ad05-54dea5adfa04","Type":"ContainerStarted","Data":"0cb89a4d3cea06ceb903d689fefad4bddf7f3a8242ddaba5b381b6cf807d88e9"} Mar 20 13:48:12 crc kubenswrapper[4973]: I0320 13:48:12.804558 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-f7kbp" event={"ID":"e0c659f5-9f0d-4133-a245-7def431f0b1a","Type":"ContainerStarted","Data":"7280559e6e57805be6a95dcd864b8a1d929a3706a84959bbc75bf834c7986217"} Mar 20 13:48:12 crc kubenswrapper[4973]: I0320 13:48:12.804637 4973 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/aodh-db-create-f7kbp" event={"ID":"e0c659f5-9f0d-4133-a245-7def431f0b1a","Type":"ContainerStarted","Data":"40c8f41ac796ad779762b1488460c48a0f412e8a5a2e09d5f272da5f48d67802"} Mar 20 13:48:12 crc kubenswrapper[4973]: I0320 13:48:12.822297 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-eeca-account-create-update-q7rsh" podStartSLOduration=2.8222667059999997 podStartE2EDuration="2.822266706s" podCreationTimestamp="2026-03-20 13:48:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:12.820873698 +0000 UTC m=+1613.564543462" watchObservedRunningTime="2026-03-20 13:48:12.822266706 +0000 UTC m=+1613.565936450" Mar 20 13:48:12 crc kubenswrapper[4973]: I0320 13:48:12.846648 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-create-f7kbp" podStartSLOduration=2.84662045 podStartE2EDuration="2.84662045s" podCreationTimestamp="2026-03-20 13:48:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:12.839560358 +0000 UTC m=+1613.583230102" watchObservedRunningTime="2026-03-20 13:48:12.84662045 +0000 UTC m=+1613.590290184" Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.320590 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.320957 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.321009 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.321977 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.322094 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" containerID="cri-o://6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" gracePeriod=600 Mar 20 13:48:13 crc kubenswrapper[4973]: E0320 13:48:13.447785 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.818406 4973 generic.go:334] "Generic (PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" exitCode=0 Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.818568 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e"} Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.818946 4973 scope.go:117] "RemoveContainer" containerID="96fa5ee864c868aadb2da3b33886e9ad0c244086b57b687cf9a31b416fea8562" Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.820071 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:48:13 crc kubenswrapper[4973]: E0320 13:48:13.820513 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.822201 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95c062dc-c672-49fc-bc93-cd0942214304","Type":"ContainerStarted","Data":"3db21c4dbefb0fd8948dbae617f72729b1b10958db1d143e52b2a88c796e5e52"} Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.822363 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.822362 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95c062dc-c672-49fc-bc93-cd0942214304" containerName="ceilometer-central-agent" containerID="cri-o://df92844731ba9f1112fa05922b9360877c18bb28c791b94cdb9a58cbaeb6bfab" gracePeriod=30 Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.822463 4973 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="95c062dc-c672-49fc-bc93-cd0942214304" containerName="sg-core" containerID="cri-o://65c8adf96888515f2e042fb5de02ac85f4b0ae7fa1b2ad2318e1d1c906d64177" gracePeriod=30 Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.822515 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95c062dc-c672-49fc-bc93-cd0942214304" containerName="proxy-httpd" containerID="cri-o://3db21c4dbefb0fd8948dbae617f72729b1b10958db1d143e52b2a88c796e5e52" gracePeriod=30 Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.822562 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="95c062dc-c672-49fc-bc93-cd0942214304" containerName="ceilometer-notification-agent" containerID="cri-o://d170fdd0314fb19f8fa94a9d14ea44796906a9d4d3c39d1d7c945d9edc4e9612" gracePeriod=30 Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.829767 4973 generic.go:334] "Generic (PLEG): container finished" podID="e0c659f5-9f0d-4133-a245-7def431f0b1a" containerID="7280559e6e57805be6a95dcd864b8a1d929a3706a84959bbc75bf834c7986217" exitCode=0 Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.829841 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-f7kbp" event={"ID":"e0c659f5-9f0d-4133-a245-7def431f0b1a","Type":"ContainerDied","Data":"7280559e6e57805be6a95dcd864b8a1d929a3706a84959bbc75bf834c7986217"} Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.835091 4973 generic.go:334] "Generic (PLEG): container finished" podID="4cc72d03-7a34-42ab-ad05-54dea5adfa04" containerID="4ac0de1008deec27af56feed7d6e25375cde44a11d80e888ce389f842527df4c" exitCode=0 Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.835141 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-eeca-account-create-update-q7rsh" 
event={"ID":"4cc72d03-7a34-42ab-ad05-54dea5adfa04","Type":"ContainerDied","Data":"4ac0de1008deec27af56feed7d6e25375cde44a11d80e888ce389f842527df4c"} Mar 20 13:48:13 crc kubenswrapper[4973]: I0320 13:48:13.906542 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.235188488 podStartE2EDuration="6.90652404s" podCreationTimestamp="2026-03-20 13:48:07 +0000 UTC" firstStartedPulling="2026-03-20 13:48:08.711518234 +0000 UTC m=+1609.455187978" lastFinishedPulling="2026-03-20 13:48:13.382853796 +0000 UTC m=+1614.126523530" observedRunningTime="2026-03-20 13:48:13.901008939 +0000 UTC m=+1614.644678703" watchObservedRunningTime="2026-03-20 13:48:13.90652404 +0000 UTC m=+1614.650193784" Mar 20 13:48:14 crc kubenswrapper[4973]: I0320 13:48:14.850435 4973 generic.go:334] "Generic (PLEG): container finished" podID="95c062dc-c672-49fc-bc93-cd0942214304" containerID="3db21c4dbefb0fd8948dbae617f72729b1b10958db1d143e52b2a88c796e5e52" exitCode=0 Mar 20 13:48:14 crc kubenswrapper[4973]: I0320 13:48:14.850783 4973 generic.go:334] "Generic (PLEG): container finished" podID="95c062dc-c672-49fc-bc93-cd0942214304" containerID="65c8adf96888515f2e042fb5de02ac85f4b0ae7fa1b2ad2318e1d1c906d64177" exitCode=2 Mar 20 13:48:14 crc kubenswrapper[4973]: I0320 13:48:14.850795 4973 generic.go:334] "Generic (PLEG): container finished" podID="95c062dc-c672-49fc-bc93-cd0942214304" containerID="d170fdd0314fb19f8fa94a9d14ea44796906a9d4d3c39d1d7c945d9edc4e9612" exitCode=0 Mar 20 13:48:14 crc kubenswrapper[4973]: I0320 13:48:14.850498 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95c062dc-c672-49fc-bc93-cd0942214304","Type":"ContainerDied","Data":"3db21c4dbefb0fd8948dbae617f72729b1b10958db1d143e52b2a88c796e5e52"} Mar 20 13:48:14 crc kubenswrapper[4973]: I0320 13:48:14.850869 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"95c062dc-c672-49fc-bc93-cd0942214304","Type":"ContainerDied","Data":"65c8adf96888515f2e042fb5de02ac85f4b0ae7fa1b2ad2318e1d1c906d64177"} Mar 20 13:48:14 crc kubenswrapper[4973]: I0320 13:48:14.850884 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95c062dc-c672-49fc-bc93-cd0942214304","Type":"ContainerDied","Data":"d170fdd0314fb19f8fa94a9d14ea44796906a9d4d3c39d1d7c945d9edc4e9612"} Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.550584 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-eeca-account-create-update-q7rsh" Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.567638 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-f7kbp" Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.690283 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95xrq\" (UniqueName: \"kubernetes.io/projected/e0c659f5-9f0d-4133-a245-7def431f0b1a-kube-api-access-95xrq\") pod \"e0c659f5-9f0d-4133-a245-7def431f0b1a\" (UID: \"e0c659f5-9f0d-4133-a245-7def431f0b1a\") " Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.690712 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0c659f5-9f0d-4133-a245-7def431f0b1a-operator-scripts\") pod \"e0c659f5-9f0d-4133-a245-7def431f0b1a\" (UID: \"e0c659f5-9f0d-4133-a245-7def431f0b1a\") " Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.690858 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btl8c\" (UniqueName: \"kubernetes.io/projected/4cc72d03-7a34-42ab-ad05-54dea5adfa04-kube-api-access-btl8c\") pod \"4cc72d03-7a34-42ab-ad05-54dea5adfa04\" (UID: \"4cc72d03-7a34-42ab-ad05-54dea5adfa04\") " Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.691060 4973 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cc72d03-7a34-42ab-ad05-54dea5adfa04-operator-scripts\") pod \"4cc72d03-7a34-42ab-ad05-54dea5adfa04\" (UID: \"4cc72d03-7a34-42ab-ad05-54dea5adfa04\") " Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.691251 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0c659f5-9f0d-4133-a245-7def431f0b1a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e0c659f5-9f0d-4133-a245-7def431f0b1a" (UID: "e0c659f5-9f0d-4133-a245-7def431f0b1a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.691551 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cc72d03-7a34-42ab-ad05-54dea5adfa04-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4cc72d03-7a34-42ab-ad05-54dea5adfa04" (UID: "4cc72d03-7a34-42ab-ad05-54dea5adfa04"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.692208 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0c659f5-9f0d-4133-a245-7def431f0b1a-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.692331 4973 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4cc72d03-7a34-42ab-ad05-54dea5adfa04-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.696879 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0c659f5-9f0d-4133-a245-7def431f0b1a-kube-api-access-95xrq" (OuterVolumeSpecName: "kube-api-access-95xrq") pod "e0c659f5-9f0d-4133-a245-7def431f0b1a" (UID: "e0c659f5-9f0d-4133-a245-7def431f0b1a"). InnerVolumeSpecName "kube-api-access-95xrq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.697241 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cc72d03-7a34-42ab-ad05-54dea5adfa04-kube-api-access-btl8c" (OuterVolumeSpecName: "kube-api-access-btl8c") pod "4cc72d03-7a34-42ab-ad05-54dea5adfa04" (UID: "4cc72d03-7a34-42ab-ad05-54dea5adfa04"). InnerVolumeSpecName "kube-api-access-btl8c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.794631 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95xrq\" (UniqueName: \"kubernetes.io/projected/e0c659f5-9f0d-4133-a245-7def431f0b1a-kube-api-access-95xrq\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.794686 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btl8c\" (UniqueName: \"kubernetes.io/projected/4cc72d03-7a34-42ab-ad05-54dea5adfa04-kube-api-access-btl8c\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.870487 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-eeca-account-create-update-q7rsh" Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.870505 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-eeca-account-create-update-q7rsh" event={"ID":"4cc72d03-7a34-42ab-ad05-54dea5adfa04","Type":"ContainerDied","Data":"0cb89a4d3cea06ceb903d689fefad4bddf7f3a8242ddaba5b381b6cf807d88e9"} Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.870556 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cb89a4d3cea06ceb903d689fefad4bddf7f3a8242ddaba5b381b6cf807d88e9" Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.872256 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-f7kbp" event={"ID":"e0c659f5-9f0d-4133-a245-7def431f0b1a","Type":"ContainerDied","Data":"40c8f41ac796ad779762b1488460c48a0f412e8a5a2e09d5f272da5f48d67802"} Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.872590 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="40c8f41ac796ad779762b1488460c48a0f412e8a5a2e09d5f272da5f48d67802" Mar 20 13:48:15 crc kubenswrapper[4973]: I0320 13:48:15.872309 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-f7kbp" Mar 20 13:48:17 crc kubenswrapper[4973]: E0320 13:48:17.693929 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:48:17 crc kubenswrapper[4973]: E0320 13:48:17.695565 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:48:17 crc kubenswrapper[4973]: E0320 13:48:17.696713 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:48:17 crc kubenswrapper[4973]: E0320 13:48:17.696757 4973 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="c31768ff-7740-4ba0-a355-02b5ae3b75f0" containerName="nova-cell0-conductor-conductor" Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.624441 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.663314 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-sg-core-conf-yaml\") pod \"95c062dc-c672-49fc-bc93-cd0942214304\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.663499 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-combined-ca-bundle\") pod \"95c062dc-c672-49fc-bc93-cd0942214304\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.663652 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-config-data\") pod \"95c062dc-c672-49fc-bc93-cd0942214304\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.663703 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95c062dc-c672-49fc-bc93-cd0942214304-log-httpd\") pod \"95c062dc-c672-49fc-bc93-cd0942214304\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.663732 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-scripts\") pod \"95c062dc-c672-49fc-bc93-cd0942214304\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.663785 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/95c062dc-c672-49fc-bc93-cd0942214304-run-httpd\") pod \"95c062dc-c672-49fc-bc93-cd0942214304\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.663818 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6qvr\" (UniqueName: \"kubernetes.io/projected/95c062dc-c672-49fc-bc93-cd0942214304-kube-api-access-w6qvr\") pod \"95c062dc-c672-49fc-bc93-cd0942214304\" (UID: \"95c062dc-c672-49fc-bc93-cd0942214304\") " Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.668661 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95c062dc-c672-49fc-bc93-cd0942214304-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "95c062dc-c672-49fc-bc93-cd0942214304" (UID: "95c062dc-c672-49fc-bc93-cd0942214304"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.668980 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/95c062dc-c672-49fc-bc93-cd0942214304-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "95c062dc-c672-49fc-bc93-cd0942214304" (UID: "95c062dc-c672-49fc-bc93-cd0942214304"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.680579 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95c062dc-c672-49fc-bc93-cd0942214304-kube-api-access-w6qvr" (OuterVolumeSpecName: "kube-api-access-w6qvr") pod "95c062dc-c672-49fc-bc93-cd0942214304" (UID: "95c062dc-c672-49fc-bc93-cd0942214304"). InnerVolumeSpecName "kube-api-access-w6qvr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.680583 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-scripts" (OuterVolumeSpecName: "scripts") pod "95c062dc-c672-49fc-bc93-cd0942214304" (UID: "95c062dc-c672-49fc-bc93-cd0942214304"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.733604 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "95c062dc-c672-49fc-bc93-cd0942214304" (UID: "95c062dc-c672-49fc-bc93-cd0942214304"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.766165 4973 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95c062dc-c672-49fc-bc93-cd0942214304-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.766195 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.766203 4973 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/95c062dc-c672-49fc-bc93-cd0942214304-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.766212 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6qvr\" (UniqueName: \"kubernetes.io/projected/95c062dc-c672-49fc-bc93-cd0942214304-kube-api-access-w6qvr\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:18 crc 
kubenswrapper[4973]: I0320 13:48:18.766221 4973 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.787284 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "95c062dc-c672-49fc-bc93-cd0942214304" (UID: "95c062dc-c672-49fc-bc93-cd0942214304"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.812120 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-config-data" (OuterVolumeSpecName: "config-data") pod "95c062dc-c672-49fc-bc93-cd0942214304" (UID: "95c062dc-c672-49fc-bc93-cd0942214304"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.868107 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.868661 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95c062dc-c672-49fc-bc93-cd0942214304-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.906783 4973 generic.go:334] "Generic (PLEG): container finished" podID="95c062dc-c672-49fc-bc93-cd0942214304" containerID="df92844731ba9f1112fa05922b9360877c18bb28c791b94cdb9a58cbaeb6bfab" exitCode=0 Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.906837 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95c062dc-c672-49fc-bc93-cd0942214304","Type":"ContainerDied","Data":"df92844731ba9f1112fa05922b9360877c18bb28c791b94cdb9a58cbaeb6bfab"} Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.906873 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"95c062dc-c672-49fc-bc93-cd0942214304","Type":"ContainerDied","Data":"ef19dd7e8154e50cc122bfc91c43fec3ac40c55545ea6b5db67f0c4770c25f99"} Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.906893 4973 scope.go:117] "RemoveContainer" containerID="3db21c4dbefb0fd8948dbae617f72729b1b10958db1d143e52b2a88c796e5e52" Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.906836 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.949202 4973 scope.go:117] "RemoveContainer" containerID="65c8adf96888515f2e042fb5de02ac85f4b0ae7fa1b2ad2318e1d1c906d64177" Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.963418 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.985698 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:48:18 crc kubenswrapper[4973]: I0320 13:48:18.997879 4973 scope.go:117] "RemoveContainer" containerID="d170fdd0314fb19f8fa94a9d14ea44796906a9d4d3c39d1d7c945d9edc4e9612" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.000776 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:48:19 crc kubenswrapper[4973]: E0320 13:48:19.001817 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c062dc-c672-49fc-bc93-cd0942214304" containerName="proxy-httpd" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.001978 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c062dc-c672-49fc-bc93-cd0942214304" containerName="proxy-httpd" Mar 20 13:48:19 crc kubenswrapper[4973]: E0320 13:48:19.002082 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c062dc-c672-49fc-bc93-cd0942214304" containerName="sg-core" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.002152 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c062dc-c672-49fc-bc93-cd0942214304" containerName="sg-core" Mar 20 13:48:19 crc kubenswrapper[4973]: E0320 13:48:19.002282 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cc72d03-7a34-42ab-ad05-54dea5adfa04" containerName="mariadb-account-create-update" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.002368 4973 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4cc72d03-7a34-42ab-ad05-54dea5adfa04" containerName="mariadb-account-create-update" Mar 20 13:48:19 crc kubenswrapper[4973]: E0320 13:48:19.002449 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c062dc-c672-49fc-bc93-cd0942214304" containerName="ceilometer-notification-agent" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.002516 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c062dc-c672-49fc-bc93-cd0942214304" containerName="ceilometer-notification-agent" Mar 20 13:48:19 crc kubenswrapper[4973]: E0320 13:48:19.002687 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0c659f5-9f0d-4133-a245-7def431f0b1a" containerName="mariadb-database-create" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.002760 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0c659f5-9f0d-4133-a245-7def431f0b1a" containerName="mariadb-database-create" Mar 20 13:48:19 crc kubenswrapper[4973]: E0320 13:48:19.002847 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95c062dc-c672-49fc-bc93-cd0942214304" containerName="ceilometer-central-agent" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.002915 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="95c062dc-c672-49fc-bc93-cd0942214304" containerName="ceilometer-central-agent" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.015435 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c062dc-c672-49fc-bc93-cd0942214304" containerName="proxy-httpd" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.015729 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c062dc-c672-49fc-bc93-cd0942214304" containerName="ceilometer-notification-agent" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.015835 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="4cc72d03-7a34-42ab-ad05-54dea5adfa04" containerName="mariadb-account-create-update" Mar 20 13:48:19 crc 
kubenswrapper[4973]: I0320 13:48:19.015920 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c062dc-c672-49fc-bc93-cd0942214304" containerName="ceilometer-central-agent" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.016033 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="95c062dc-c672-49fc-bc93-cd0942214304" containerName="sg-core" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.016107 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0c659f5-9f0d-4133-a245-7def431f0b1a" containerName="mariadb-database-create" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.025354 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.025461 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.027881 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.035196 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.037188 4973 scope.go:117] "RemoveContainer" containerID="df92844731ba9f1112fa05922b9360877c18bb28c791b94cdb9a58cbaeb6bfab" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.071717 4973 scope.go:117] "RemoveContainer" containerID="3db21c4dbefb0fd8948dbae617f72729b1b10958db1d143e52b2a88c796e5e52" Mar 20 13:48:19 crc kubenswrapper[4973]: E0320 13:48:19.072116 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3db21c4dbefb0fd8948dbae617f72729b1b10958db1d143e52b2a88c796e5e52\": container with ID starting with 3db21c4dbefb0fd8948dbae617f72729b1b10958db1d143e52b2a88c796e5e52 not found: ID does not exist" 
containerID="3db21c4dbefb0fd8948dbae617f72729b1b10958db1d143e52b2a88c796e5e52" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.072160 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3db21c4dbefb0fd8948dbae617f72729b1b10958db1d143e52b2a88c796e5e52"} err="failed to get container status \"3db21c4dbefb0fd8948dbae617f72729b1b10958db1d143e52b2a88c796e5e52\": rpc error: code = NotFound desc = could not find container \"3db21c4dbefb0fd8948dbae617f72729b1b10958db1d143e52b2a88c796e5e52\": container with ID starting with 3db21c4dbefb0fd8948dbae617f72729b1b10958db1d143e52b2a88c796e5e52 not found: ID does not exist" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.072197 4973 scope.go:117] "RemoveContainer" containerID="65c8adf96888515f2e042fb5de02ac85f4b0ae7fa1b2ad2318e1d1c906d64177" Mar 20 13:48:19 crc kubenswrapper[4973]: E0320 13:48:19.072735 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65c8adf96888515f2e042fb5de02ac85f4b0ae7fa1b2ad2318e1d1c906d64177\": container with ID starting with 65c8adf96888515f2e042fb5de02ac85f4b0ae7fa1b2ad2318e1d1c906d64177 not found: ID does not exist" containerID="65c8adf96888515f2e042fb5de02ac85f4b0ae7fa1b2ad2318e1d1c906d64177" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.072758 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65c8adf96888515f2e042fb5de02ac85f4b0ae7fa1b2ad2318e1d1c906d64177"} err="failed to get container status \"65c8adf96888515f2e042fb5de02ac85f4b0ae7fa1b2ad2318e1d1c906d64177\": rpc error: code = NotFound desc = could not find container \"65c8adf96888515f2e042fb5de02ac85f4b0ae7fa1b2ad2318e1d1c906d64177\": container with ID starting with 65c8adf96888515f2e042fb5de02ac85f4b0ae7fa1b2ad2318e1d1c906d64177 not found: ID does not exist" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.072776 4973 scope.go:117] 
"RemoveContainer" containerID="d170fdd0314fb19f8fa94a9d14ea44796906a9d4d3c39d1d7c945d9edc4e9612" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.072921 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlqj7\" (UniqueName: \"kubernetes.io/projected/708f94e3-7737-454c-845c-02ed42251525-kube-api-access-qlqj7\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.073019 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/708f94e3-7737-454c-845c-02ed42251525-log-httpd\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: E0320 13:48:19.073025 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d170fdd0314fb19f8fa94a9d14ea44796906a9d4d3c39d1d7c945d9edc4e9612\": container with ID starting with d170fdd0314fb19f8fa94a9d14ea44796906a9d4d3c39d1d7c945d9edc4e9612 not found: ID does not exist" containerID="d170fdd0314fb19f8fa94a9d14ea44796906a9d4d3c39d1d7c945d9edc4e9612" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.073083 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d170fdd0314fb19f8fa94a9d14ea44796906a9d4d3c39d1d7c945d9edc4e9612"} err="failed to get container status \"d170fdd0314fb19f8fa94a9d14ea44796906a9d4d3c39d1d7c945d9edc4e9612\": rpc error: code = NotFound desc = could not find container \"d170fdd0314fb19f8fa94a9d14ea44796906a9d4d3c39d1d7c945d9edc4e9612\": container with ID starting with d170fdd0314fb19f8fa94a9d14ea44796906a9d4d3c39d1d7c945d9edc4e9612 not found: ID does not exist" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.073105 4973 scope.go:117] 
"RemoveContainer" containerID="df92844731ba9f1112fa05922b9360877c18bb28c791b94cdb9a58cbaeb6bfab" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.073227 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.073261 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-scripts\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.073316 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-config-data\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.073565 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/708f94e3-7737-454c-845c-02ed42251525-run-httpd\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.073692 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: E0320 
13:48:19.074134 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df92844731ba9f1112fa05922b9360877c18bb28c791b94cdb9a58cbaeb6bfab\": container with ID starting with df92844731ba9f1112fa05922b9360877c18bb28c791b94cdb9a58cbaeb6bfab not found: ID does not exist" containerID="df92844731ba9f1112fa05922b9360877c18bb28c791b94cdb9a58cbaeb6bfab" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.074161 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df92844731ba9f1112fa05922b9360877c18bb28c791b94cdb9a58cbaeb6bfab"} err="failed to get container status \"df92844731ba9f1112fa05922b9360877c18bb28c791b94cdb9a58cbaeb6bfab\": rpc error: code = NotFound desc = could not find container \"df92844731ba9f1112fa05922b9360877c18bb28c791b94cdb9a58cbaeb6bfab\": container with ID starting with df92844731ba9f1112fa05922b9360877c18bb28c791b94cdb9a58cbaeb6bfab not found: ID does not exist" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.176195 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/708f94e3-7737-454c-845c-02ed42251525-run-httpd\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.176371 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.176968 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/708f94e3-7737-454c-845c-02ed42251525-run-httpd\") pod \"ceilometer-0\" (UID: 
\"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.177237 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlqj7\" (UniqueName: \"kubernetes.io/projected/708f94e3-7737-454c-845c-02ed42251525-kube-api-access-qlqj7\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.177307 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/708f94e3-7737-454c-845c-02ed42251525-log-httpd\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.177502 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.177534 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-scripts\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.177592 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-config-data\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.178045 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/708f94e3-7737-454c-845c-02ed42251525-log-httpd\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.182370 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.184296 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-config-data\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.186714 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-scripts\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.195672 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.199121 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlqj7\" (UniqueName: \"kubernetes.io/projected/708f94e3-7737-454c-845c-02ed42251525-kube-api-access-qlqj7\") pod \"ceilometer-0\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") " pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.368637 4973 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:48:19 crc kubenswrapper[4973]: W0320 13:48:19.891448 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod708f94e3_7737_454c_845c_02ed42251525.slice/crio-e4d606fa01ede0292dea3e41878d9572cc45e141509a17000e3342283aef296b WatchSource:0}: Error finding container e4d606fa01ede0292dea3e41878d9572cc45e141509a17000e3342283aef296b: Status 404 returned error can't find the container with id e4d606fa01ede0292dea3e41878d9572cc45e141509a17000e3342283aef296b Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.892011 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.923576 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"708f94e3-7737-454c-845c-02ed42251525","Type":"ContainerStarted","Data":"e4d606fa01ede0292dea3e41878d9572cc45e141509a17000e3342283aef296b"} Mar 20 13:48:19 crc kubenswrapper[4973]: I0320 13:48:19.985841 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95c062dc-c672-49fc-bc93-cd0942214304" path="/var/lib/kubelet/pods/95c062dc-c672-49fc-bc93-cd0942214304/volumes" Mar 20 13:48:20 crc kubenswrapper[4973]: I0320 13:48:20.934058 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"708f94e3-7737-454c-845c-02ed42251525","Type":"ContainerStarted","Data":"4d5149ed2679f8e455a19d14a3198c2dac8f0a9b826e28719d86313f43dfb172"} Mar 20 13:48:20 crc kubenswrapper[4973]: I0320 13:48:20.971026 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-8sqvk"] Mar 20 13:48:20 crc kubenswrapper[4973]: I0320 13:48:20.973018 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-8sqvk" Mar 20 13:48:20 crc kubenswrapper[4973]: I0320 13:48:20.975307 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 20 13:48:20 crc kubenswrapper[4973]: I0320 13:48:20.975699 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 13:48:20 crc kubenswrapper[4973]: I0320 13:48:20.975861 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 20 13:48:20 crc kubenswrapper[4973]: I0320 13:48:20.979405 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-x7tj8" Mar 20 13:48:20 crc kubenswrapper[4973]: I0320 13:48:20.988470 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-8sqvk"] Mar 20 13:48:21 crc kubenswrapper[4973]: I0320 13:48:21.027966 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9pqn\" (UniqueName: \"kubernetes.io/projected/7607725a-3dff-46d1-ac38-f1a7a393ff80-kube-api-access-z9pqn\") pod \"aodh-db-sync-8sqvk\" (UID: \"7607725a-3dff-46d1-ac38-f1a7a393ff80\") " pod="openstack/aodh-db-sync-8sqvk" Mar 20 13:48:21 crc kubenswrapper[4973]: I0320 13:48:21.028050 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7607725a-3dff-46d1-ac38-f1a7a393ff80-config-data\") pod \"aodh-db-sync-8sqvk\" (UID: \"7607725a-3dff-46d1-ac38-f1a7a393ff80\") " pod="openstack/aodh-db-sync-8sqvk" Mar 20 13:48:21 crc kubenswrapper[4973]: I0320 13:48:21.028627 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7607725a-3dff-46d1-ac38-f1a7a393ff80-combined-ca-bundle\") pod \"aodh-db-sync-8sqvk\" (UID: 
\"7607725a-3dff-46d1-ac38-f1a7a393ff80\") " pod="openstack/aodh-db-sync-8sqvk" Mar 20 13:48:21 crc kubenswrapper[4973]: I0320 13:48:21.028936 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7607725a-3dff-46d1-ac38-f1a7a393ff80-scripts\") pod \"aodh-db-sync-8sqvk\" (UID: \"7607725a-3dff-46d1-ac38-f1a7a393ff80\") " pod="openstack/aodh-db-sync-8sqvk" Mar 20 13:48:21 crc kubenswrapper[4973]: I0320 13:48:21.130945 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9pqn\" (UniqueName: \"kubernetes.io/projected/7607725a-3dff-46d1-ac38-f1a7a393ff80-kube-api-access-z9pqn\") pod \"aodh-db-sync-8sqvk\" (UID: \"7607725a-3dff-46d1-ac38-f1a7a393ff80\") " pod="openstack/aodh-db-sync-8sqvk" Mar 20 13:48:21 crc kubenswrapper[4973]: I0320 13:48:21.131032 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7607725a-3dff-46d1-ac38-f1a7a393ff80-config-data\") pod \"aodh-db-sync-8sqvk\" (UID: \"7607725a-3dff-46d1-ac38-f1a7a393ff80\") " pod="openstack/aodh-db-sync-8sqvk" Mar 20 13:48:21 crc kubenswrapper[4973]: I0320 13:48:21.131122 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7607725a-3dff-46d1-ac38-f1a7a393ff80-combined-ca-bundle\") pod \"aodh-db-sync-8sqvk\" (UID: \"7607725a-3dff-46d1-ac38-f1a7a393ff80\") " pod="openstack/aodh-db-sync-8sqvk" Mar 20 13:48:21 crc kubenswrapper[4973]: I0320 13:48:21.131144 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7607725a-3dff-46d1-ac38-f1a7a393ff80-scripts\") pod \"aodh-db-sync-8sqvk\" (UID: \"7607725a-3dff-46d1-ac38-f1a7a393ff80\") " pod="openstack/aodh-db-sync-8sqvk" Mar 20 13:48:21 crc kubenswrapper[4973]: I0320 13:48:21.137741 4973 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7607725a-3dff-46d1-ac38-f1a7a393ff80-config-data\") pod \"aodh-db-sync-8sqvk\" (UID: \"7607725a-3dff-46d1-ac38-f1a7a393ff80\") " pod="openstack/aodh-db-sync-8sqvk" Mar 20 13:48:21 crc kubenswrapper[4973]: I0320 13:48:21.139481 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7607725a-3dff-46d1-ac38-f1a7a393ff80-combined-ca-bundle\") pod \"aodh-db-sync-8sqvk\" (UID: \"7607725a-3dff-46d1-ac38-f1a7a393ff80\") " pod="openstack/aodh-db-sync-8sqvk" Mar 20 13:48:21 crc kubenswrapper[4973]: I0320 13:48:21.140573 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7607725a-3dff-46d1-ac38-f1a7a393ff80-scripts\") pod \"aodh-db-sync-8sqvk\" (UID: \"7607725a-3dff-46d1-ac38-f1a7a393ff80\") " pod="openstack/aodh-db-sync-8sqvk" Mar 20 13:48:21 crc kubenswrapper[4973]: I0320 13:48:21.150384 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9pqn\" (UniqueName: \"kubernetes.io/projected/7607725a-3dff-46d1-ac38-f1a7a393ff80-kube-api-access-z9pqn\") pod \"aodh-db-sync-8sqvk\" (UID: \"7607725a-3dff-46d1-ac38-f1a7a393ff80\") " pod="openstack/aodh-db-sync-8sqvk" Mar 20 13:48:21 crc kubenswrapper[4973]: I0320 13:48:21.301868 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-8sqvk" Mar 20 13:48:21 crc kubenswrapper[4973]: I0320 13:48:21.871751 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-8sqvk"] Mar 20 13:48:21 crc kubenswrapper[4973]: I0320 13:48:21.973014 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8sqvk" event={"ID":"7607725a-3dff-46d1-ac38-f1a7a393ff80","Type":"ContainerStarted","Data":"0e5203f28f3f5a9e332389c4ec4ee704a1dfc3a2066e3fc0e72baf9649b1e8f9"} Mar 20 13:48:21 crc kubenswrapper[4973]: I0320 13:48:21.976106 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"708f94e3-7737-454c-845c-02ed42251525","Type":"ContainerStarted","Data":"ca1fb82cf3cd4409ba21b65b629fc3bd47925be0d5a2d279d022c6b8dd8b8421"} Mar 20 13:48:22 crc kubenswrapper[4973]: E0320 13:48:22.694931 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:48:22 crc kubenswrapper[4973]: E0320 13:48:22.696911 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 20 13:48:22 crc kubenswrapper[4973]: E0320 13:48:22.702067 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Mar 
20 13:48:22 crc kubenswrapper[4973]: E0320 13:48:22.702117 4973 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="c31768ff-7740-4ba0-a355-02b5ae3b75f0" containerName="nova-cell0-conductor-conductor" Mar 20 13:48:22 crc kubenswrapper[4973]: I0320 13:48:22.992743 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"708f94e3-7737-454c-845c-02ed42251525","Type":"ContainerStarted","Data":"efd32cc0610733774e53115cb30af510709979d519ff1929472aa92bbe0ffb3d"} Mar 20 13:48:25 crc kubenswrapper[4973]: I0320 13:48:25.017904 4973 generic.go:334] "Generic (PLEG): container finished" podID="c31768ff-7740-4ba0-a355-02b5ae3b75f0" containerID="cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9" exitCode=137 Mar 20 13:48:25 crc kubenswrapper[4973]: I0320 13:48:25.017984 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c31768ff-7740-4ba0-a355-02b5ae3b75f0","Type":"ContainerDied","Data":"cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9"} Mar 20 13:48:26 crc kubenswrapper[4973]: I0320 13:48:26.740444 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 13:48:26 crc kubenswrapper[4973]: I0320 13:48:26.780477 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31768ff-7740-4ba0-a355-02b5ae3b75f0-config-data\") pod \"c31768ff-7740-4ba0-a355-02b5ae3b75f0\" (UID: \"c31768ff-7740-4ba0-a355-02b5ae3b75f0\") " Mar 20 13:48:26 crc kubenswrapper[4973]: I0320 13:48:26.780694 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzmh5\" (UniqueName: \"kubernetes.io/projected/c31768ff-7740-4ba0-a355-02b5ae3b75f0-kube-api-access-lzmh5\") pod \"c31768ff-7740-4ba0-a355-02b5ae3b75f0\" (UID: \"c31768ff-7740-4ba0-a355-02b5ae3b75f0\") " Mar 20 13:48:26 crc kubenswrapper[4973]: I0320 13:48:26.780727 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31768ff-7740-4ba0-a355-02b5ae3b75f0-combined-ca-bundle\") pod \"c31768ff-7740-4ba0-a355-02b5ae3b75f0\" (UID: \"c31768ff-7740-4ba0-a355-02b5ae3b75f0\") " Mar 20 13:48:26 crc kubenswrapper[4973]: I0320 13:48:26.797724 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c31768ff-7740-4ba0-a355-02b5ae3b75f0-kube-api-access-lzmh5" (OuterVolumeSpecName: "kube-api-access-lzmh5") pod "c31768ff-7740-4ba0-a355-02b5ae3b75f0" (UID: "c31768ff-7740-4ba0-a355-02b5ae3b75f0"). InnerVolumeSpecName "kube-api-access-lzmh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:26 crc kubenswrapper[4973]: I0320 13:48:26.813001 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31768ff-7740-4ba0-a355-02b5ae3b75f0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c31768ff-7740-4ba0-a355-02b5ae3b75f0" (UID: "c31768ff-7740-4ba0-a355-02b5ae3b75f0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:26 crc kubenswrapper[4973]: I0320 13:48:26.821333 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c31768ff-7740-4ba0-a355-02b5ae3b75f0-config-data" (OuterVolumeSpecName: "config-data") pod "c31768ff-7740-4ba0-a355-02b5ae3b75f0" (UID: "c31768ff-7740-4ba0-a355-02b5ae3b75f0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:26 crc kubenswrapper[4973]: I0320 13:48:26.883821 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzmh5\" (UniqueName: \"kubernetes.io/projected/c31768ff-7740-4ba0-a355-02b5ae3b75f0-kube-api-access-lzmh5\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:26 crc kubenswrapper[4973]: I0320 13:48:26.883860 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c31768ff-7740-4ba0-a355-02b5ae3b75f0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:26 crc kubenswrapper[4973]: I0320 13:48:26.883870 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c31768ff-7740-4ba0-a355-02b5ae3b75f0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.047054 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"708f94e3-7737-454c-845c-02ed42251525","Type":"ContainerStarted","Data":"e271186527aecc254c0001e58f8960ac5a957176ab7d2af16516e252d235f756"} Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.047197 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.049158 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.049156 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"c31768ff-7740-4ba0-a355-02b5ae3b75f0","Type":"ContainerDied","Data":"8f42022327d63df885b4b9894518ca73fc71869eb86f996920617d1aa0290fa8"} Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.049304 4973 scope.go:117] "RemoveContainer" containerID="cebacebc3abeb44846df404a9888189719b306489361fc6b40f9d49d2ca02dd9" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.052542 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8sqvk" event={"ID":"7607725a-3dff-46d1-ac38-f1a7a393ff80","Type":"ContainerStarted","Data":"676970f69fb018b5b2472ce7bc06064b94c37b9a6784143cf9910a3bc985da11"} Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.074506 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.713291079 podStartE2EDuration="9.0744843s" podCreationTimestamp="2026-03-20 13:48:18 +0000 UTC" firstStartedPulling="2026-03-20 13:48:19.898523236 +0000 UTC m=+1620.642192970" lastFinishedPulling="2026-03-20 13:48:26.259716447 +0000 UTC m=+1627.003386191" observedRunningTime="2026-03-20 13:48:27.069618728 +0000 UTC m=+1627.813288482" watchObservedRunningTime="2026-03-20 13:48:27.0744843 +0000 UTC m=+1627.818154044" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.105478 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-8sqvk" podStartSLOduration=2.718411904 podStartE2EDuration="7.103903984s" podCreationTimestamp="2026-03-20 13:48:20 +0000 UTC" firstStartedPulling="2026-03-20 13:48:21.874368981 +0000 UTC m=+1622.618038725" lastFinishedPulling="2026-03-20 13:48:26.259861061 +0000 UTC m=+1627.003530805" observedRunningTime="2026-03-20 13:48:27.098866906 +0000 UTC m=+1627.842536660" 
watchObservedRunningTime="2026-03-20 13:48:27.103903984 +0000 UTC m=+1627.847573738" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.133314 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.150385 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.170581 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:48:27 crc kubenswrapper[4973]: E0320 13:48:27.171227 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c31768ff-7740-4ba0-a355-02b5ae3b75f0" containerName="nova-cell0-conductor-conductor" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.171248 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="c31768ff-7740-4ba0-a355-02b5ae3b75f0" containerName="nova-cell0-conductor-conductor" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.171486 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="c31768ff-7740-4ba0-a355-02b5ae3b75f0" containerName="nova-cell0-conductor-conductor" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.172315 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.175453 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-qgmk5" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.175704 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.185820 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.295939 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a938dfd5-e277-4303-adfc-1d4ad07f2240-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a938dfd5-e277-4303-adfc-1d4ad07f2240\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.296132 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a938dfd5-e277-4303-adfc-1d4ad07f2240-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a938dfd5-e277-4303-adfc-1d4ad07f2240\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.296288 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wz8z\" (UniqueName: \"kubernetes.io/projected/a938dfd5-e277-4303-adfc-1d4ad07f2240-kube-api-access-4wz8z\") pod \"nova-cell0-conductor-0\" (UID: \"a938dfd5-e277-4303-adfc-1d4ad07f2240\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.399583 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a938dfd5-e277-4303-adfc-1d4ad07f2240-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a938dfd5-e277-4303-adfc-1d4ad07f2240\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.399725 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wz8z\" (UniqueName: \"kubernetes.io/projected/a938dfd5-e277-4303-adfc-1d4ad07f2240-kube-api-access-4wz8z\") pod \"nova-cell0-conductor-0\" (UID: \"a938dfd5-e277-4303-adfc-1d4ad07f2240\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.399880 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a938dfd5-e277-4303-adfc-1d4ad07f2240-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a938dfd5-e277-4303-adfc-1d4ad07f2240\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.404473 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a938dfd5-e277-4303-adfc-1d4ad07f2240-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a938dfd5-e277-4303-adfc-1d4ad07f2240\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.405539 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a938dfd5-e277-4303-adfc-1d4ad07f2240-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a938dfd5-e277-4303-adfc-1d4ad07f2240\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.417926 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wz8z\" (UniqueName: \"kubernetes.io/projected/a938dfd5-e277-4303-adfc-1d4ad07f2240-kube-api-access-4wz8z\") pod \"nova-cell0-conductor-0\" 
(UID: \"a938dfd5-e277-4303-adfc-1d4ad07f2240\") " pod="openstack/nova-cell0-conductor-0" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.562925 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 20 13:48:27 crc kubenswrapper[4973]: I0320 13:48:27.966060 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c31768ff-7740-4ba0-a355-02b5ae3b75f0" path="/var/lib/kubelet/pods/c31768ff-7740-4ba0-a355-02b5ae3b75f0/volumes" Mar 20 13:48:28 crc kubenswrapper[4973]: W0320 13:48:28.058654 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda938dfd5_e277_4303_adfc_1d4ad07f2240.slice/crio-a231ddef737149ad693ec43d0fd553ea6fcb74cfe97ddd70333d74a6901a0a72 WatchSource:0}: Error finding container a231ddef737149ad693ec43d0fd553ea6fcb74cfe97ddd70333d74a6901a0a72: Status 404 returned error can't find the container with id a231ddef737149ad693ec43d0fd553ea6fcb74cfe97ddd70333d74a6901a0a72 Mar 20 13:48:28 crc kubenswrapper[4973]: I0320 13:48:28.059121 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 20 13:48:28 crc kubenswrapper[4973]: I0320 13:48:28.950854 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:48:28 crc kubenswrapper[4973]: E0320 13:48:28.951656 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 13:48:29 crc kubenswrapper[4973]: I0320 13:48:29.101618 4973 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a938dfd5-e277-4303-adfc-1d4ad07f2240","Type":"ContainerStarted","Data":"46705e502931c3a929f7455ef142c41dcfcd6534093b1b40830c52252ba37f7e"} Mar 20 13:48:29 crc kubenswrapper[4973]: I0320 13:48:29.101726 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a938dfd5-e277-4303-adfc-1d4ad07f2240","Type":"ContainerStarted","Data":"a231ddef737149ad693ec43d0fd553ea6fcb74cfe97ddd70333d74a6901a0a72"} Mar 20 13:48:29 crc kubenswrapper[4973]: I0320 13:48:29.101845 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 20 13:48:29 crc kubenswrapper[4973]: I0320 13:48:29.122727 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.122708173 podStartE2EDuration="2.122708173s" podCreationTimestamp="2026-03-20 13:48:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:29.118680313 +0000 UTC m=+1629.862350077" watchObservedRunningTime="2026-03-20 13:48:29.122708173 +0000 UTC m=+1629.866377917" Mar 20 13:48:29 crc kubenswrapper[4973]: I0320 13:48:29.639709 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vrbph"] Mar 20 13:48:29 crc kubenswrapper[4973]: I0320 13:48:29.641929 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrbph" Mar 20 13:48:29 crc kubenswrapper[4973]: I0320 13:48:29.656729 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrbph"] Mar 20 13:48:29 crc kubenswrapper[4973]: I0320 13:48:29.761107 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40aa92f8-240c-4de8-a782-1943d2dbae21-utilities\") pod \"redhat-marketplace-vrbph\" (UID: \"40aa92f8-240c-4de8-a782-1943d2dbae21\") " pod="openshift-marketplace/redhat-marketplace-vrbph" Mar 20 13:48:29 crc kubenswrapper[4973]: I0320 13:48:29.761183 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40aa92f8-240c-4de8-a782-1943d2dbae21-catalog-content\") pod \"redhat-marketplace-vrbph\" (UID: \"40aa92f8-240c-4de8-a782-1943d2dbae21\") " pod="openshift-marketplace/redhat-marketplace-vrbph" Mar 20 13:48:29 crc kubenswrapper[4973]: I0320 13:48:29.761617 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5tcm\" (UniqueName: \"kubernetes.io/projected/40aa92f8-240c-4de8-a782-1943d2dbae21-kube-api-access-n5tcm\") pod \"redhat-marketplace-vrbph\" (UID: \"40aa92f8-240c-4de8-a782-1943d2dbae21\") " pod="openshift-marketplace/redhat-marketplace-vrbph" Mar 20 13:48:29 crc kubenswrapper[4973]: I0320 13:48:29.864046 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40aa92f8-240c-4de8-a782-1943d2dbae21-catalog-content\") pod \"redhat-marketplace-vrbph\" (UID: \"40aa92f8-240c-4de8-a782-1943d2dbae21\") " pod="openshift-marketplace/redhat-marketplace-vrbph" Mar 20 13:48:29 crc kubenswrapper[4973]: I0320 13:48:29.864598 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-n5tcm\" (UniqueName: \"kubernetes.io/projected/40aa92f8-240c-4de8-a782-1943d2dbae21-kube-api-access-n5tcm\") pod \"redhat-marketplace-vrbph\" (UID: \"40aa92f8-240c-4de8-a782-1943d2dbae21\") " pod="openshift-marketplace/redhat-marketplace-vrbph" Mar 20 13:48:29 crc kubenswrapper[4973]: I0320 13:48:29.864614 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40aa92f8-240c-4de8-a782-1943d2dbae21-catalog-content\") pod \"redhat-marketplace-vrbph\" (UID: \"40aa92f8-240c-4de8-a782-1943d2dbae21\") " pod="openshift-marketplace/redhat-marketplace-vrbph" Mar 20 13:48:29 crc kubenswrapper[4973]: I0320 13:48:29.864888 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40aa92f8-240c-4de8-a782-1943d2dbae21-utilities\") pod \"redhat-marketplace-vrbph\" (UID: \"40aa92f8-240c-4de8-a782-1943d2dbae21\") " pod="openshift-marketplace/redhat-marketplace-vrbph" Mar 20 13:48:29 crc kubenswrapper[4973]: I0320 13:48:29.865173 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40aa92f8-240c-4de8-a782-1943d2dbae21-utilities\") pod \"redhat-marketplace-vrbph\" (UID: \"40aa92f8-240c-4de8-a782-1943d2dbae21\") " pod="openshift-marketplace/redhat-marketplace-vrbph" Mar 20 13:48:29 crc kubenswrapper[4973]: I0320 13:48:29.894484 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5tcm\" (UniqueName: \"kubernetes.io/projected/40aa92f8-240c-4de8-a782-1943d2dbae21-kube-api-access-n5tcm\") pod \"redhat-marketplace-vrbph\" (UID: \"40aa92f8-240c-4de8-a782-1943d2dbae21\") " pod="openshift-marketplace/redhat-marketplace-vrbph" Mar 20 13:48:29 crc kubenswrapper[4973]: I0320 13:48:29.959566 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrbph" Mar 20 13:48:30 crc kubenswrapper[4973]: I0320 13:48:30.134936 4973 generic.go:334] "Generic (PLEG): container finished" podID="7607725a-3dff-46d1-ac38-f1a7a393ff80" containerID="676970f69fb018b5b2472ce7bc06064b94c37b9a6784143cf9910a3bc985da11" exitCode=0 Mar 20 13:48:30 crc kubenswrapper[4973]: I0320 13:48:30.135012 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8sqvk" event={"ID":"7607725a-3dff-46d1-ac38-f1a7a393ff80","Type":"ContainerDied","Data":"676970f69fb018b5b2472ce7bc06064b94c37b9a6784143cf9910a3bc985da11"} Mar 20 13:48:30 crc kubenswrapper[4973]: I0320 13:48:30.526023 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrbph"] Mar 20 13:48:31 crc kubenswrapper[4973]: I0320 13:48:31.148128 4973 generic.go:334] "Generic (PLEG): container finished" podID="40aa92f8-240c-4de8-a782-1943d2dbae21" containerID="2fb15389346a9b5b39b40354d75461baa244d7e557de6b447dada4df9686ded5" exitCode=0 Mar 20 13:48:31 crc kubenswrapper[4973]: I0320 13:48:31.150744 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrbph" event={"ID":"40aa92f8-240c-4de8-a782-1943d2dbae21","Type":"ContainerDied","Data":"2fb15389346a9b5b39b40354d75461baa244d7e557de6b447dada4df9686ded5"} Mar 20 13:48:31 crc kubenswrapper[4973]: I0320 13:48:31.151020 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrbph" event={"ID":"40aa92f8-240c-4de8-a782-1943d2dbae21","Type":"ContainerStarted","Data":"9502c6a35d24fdc877f13f633c3dc64e58eda3000894c616fd85683d67e1a973"} Mar 20 13:48:31 crc kubenswrapper[4973]: I0320 13:48:31.632139 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-8sqvk" Mar 20 13:48:31 crc kubenswrapper[4973]: I0320 13:48:31.707893 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7607725a-3dff-46d1-ac38-f1a7a393ff80-scripts\") pod \"7607725a-3dff-46d1-ac38-f1a7a393ff80\" (UID: \"7607725a-3dff-46d1-ac38-f1a7a393ff80\") " Mar 20 13:48:31 crc kubenswrapper[4973]: I0320 13:48:31.707994 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9pqn\" (UniqueName: \"kubernetes.io/projected/7607725a-3dff-46d1-ac38-f1a7a393ff80-kube-api-access-z9pqn\") pod \"7607725a-3dff-46d1-ac38-f1a7a393ff80\" (UID: \"7607725a-3dff-46d1-ac38-f1a7a393ff80\") " Mar 20 13:48:31 crc kubenswrapper[4973]: I0320 13:48:31.708082 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7607725a-3dff-46d1-ac38-f1a7a393ff80-config-data\") pod \"7607725a-3dff-46d1-ac38-f1a7a393ff80\" (UID: \"7607725a-3dff-46d1-ac38-f1a7a393ff80\") " Mar 20 13:48:31 crc kubenswrapper[4973]: I0320 13:48:31.708288 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7607725a-3dff-46d1-ac38-f1a7a393ff80-combined-ca-bundle\") pod \"7607725a-3dff-46d1-ac38-f1a7a393ff80\" (UID: \"7607725a-3dff-46d1-ac38-f1a7a393ff80\") " Mar 20 13:48:31 crc kubenswrapper[4973]: I0320 13:48:31.714215 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7607725a-3dff-46d1-ac38-f1a7a393ff80-scripts" (OuterVolumeSpecName: "scripts") pod "7607725a-3dff-46d1-ac38-f1a7a393ff80" (UID: "7607725a-3dff-46d1-ac38-f1a7a393ff80"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:31 crc kubenswrapper[4973]: I0320 13:48:31.720762 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7607725a-3dff-46d1-ac38-f1a7a393ff80-kube-api-access-z9pqn" (OuterVolumeSpecName: "kube-api-access-z9pqn") pod "7607725a-3dff-46d1-ac38-f1a7a393ff80" (UID: "7607725a-3dff-46d1-ac38-f1a7a393ff80"). InnerVolumeSpecName "kube-api-access-z9pqn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:31 crc kubenswrapper[4973]: I0320 13:48:31.741129 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7607725a-3dff-46d1-ac38-f1a7a393ff80-config-data" (OuterVolumeSpecName: "config-data") pod "7607725a-3dff-46d1-ac38-f1a7a393ff80" (UID: "7607725a-3dff-46d1-ac38-f1a7a393ff80"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:31 crc kubenswrapper[4973]: I0320 13:48:31.749498 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7607725a-3dff-46d1-ac38-f1a7a393ff80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7607725a-3dff-46d1-ac38-f1a7a393ff80" (UID: "7607725a-3dff-46d1-ac38-f1a7a393ff80"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:31 crc kubenswrapper[4973]: I0320 13:48:31.811312 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7607725a-3dff-46d1-ac38-f1a7a393ff80-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:31 crc kubenswrapper[4973]: I0320 13:48:31.811365 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9pqn\" (UniqueName: \"kubernetes.io/projected/7607725a-3dff-46d1-ac38-f1a7a393ff80-kube-api-access-z9pqn\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:31 crc kubenswrapper[4973]: I0320 13:48:31.811378 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7607725a-3dff-46d1-ac38-f1a7a393ff80-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:31 crc kubenswrapper[4973]: I0320 13:48:31.811390 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7607725a-3dff-46d1-ac38-f1a7a393ff80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:32 crc kubenswrapper[4973]: I0320 13:48:32.184503 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-8sqvk" event={"ID":"7607725a-3dff-46d1-ac38-f1a7a393ff80","Type":"ContainerDied","Data":"0e5203f28f3f5a9e332389c4ec4ee704a1dfc3a2066e3fc0e72baf9649b1e8f9"} Mar 20 13:48:32 crc kubenswrapper[4973]: I0320 13:48:32.185593 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0e5203f28f3f5a9e332389c4ec4ee704a1dfc3a2066e3fc0e72baf9649b1e8f9" Mar 20 13:48:32 crc kubenswrapper[4973]: I0320 13:48:32.184710 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-8sqvk" Mar 20 13:48:33 crc kubenswrapper[4973]: I0320 13:48:33.197528 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrbph" event={"ID":"40aa92f8-240c-4de8-a782-1943d2dbae21","Type":"ContainerStarted","Data":"3d215c24cde528c98861101f51a10d99eb7015ff1b68ea459cacc4df4af87d99"} Mar 20 13:48:34 crc kubenswrapper[4973]: I0320 13:48:34.213754 4973 generic.go:334] "Generic (PLEG): container finished" podID="40aa92f8-240c-4de8-a782-1943d2dbae21" containerID="3d215c24cde528c98861101f51a10d99eb7015ff1b68ea459cacc4df4af87d99" exitCode=0 Mar 20 13:48:34 crc kubenswrapper[4973]: I0320 13:48:34.213980 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrbph" event={"ID":"40aa92f8-240c-4de8-a782-1943d2dbae21","Type":"ContainerDied","Data":"3d215c24cde528c98861101f51a10d99eb7015ff1b68ea459cacc4df4af87d99"} Mar 20 13:48:35 crc kubenswrapper[4973]: I0320 13:48:35.260769 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrbph" event={"ID":"40aa92f8-240c-4de8-a782-1943d2dbae21","Type":"ContainerStarted","Data":"bc8c270323f137631d8c40ae910a63bdd79716db75f2ef02f758a7b7ebeb1ae5"} Mar 20 13:48:35 crc kubenswrapper[4973]: I0320 13:48:35.286627 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vrbph" podStartSLOduration=2.61559467 podStartE2EDuration="6.286607255s" podCreationTimestamp="2026-03-20 13:48:29 +0000 UTC" firstStartedPulling="2026-03-20 13:48:31.151614558 +0000 UTC m=+1631.895284302" lastFinishedPulling="2026-03-20 13:48:34.822627143 +0000 UTC m=+1635.566296887" observedRunningTime="2026-03-20 13:48:35.278586226 +0000 UTC m=+1636.022255980" watchObservedRunningTime="2026-03-20 13:48:35.286607255 +0000 UTC m=+1636.030276999" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.153461 4973 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 20 13:48:36 crc kubenswrapper[4973]: E0320 13:48:36.154285 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7607725a-3dff-46d1-ac38-f1a7a393ff80" containerName="aodh-db-sync" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.154302 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="7607725a-3dff-46d1-ac38-f1a7a393ff80" containerName="aodh-db-sync" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.154534 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="7607725a-3dff-46d1-ac38-f1a7a393ff80" containerName="aodh-db-sync" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.158962 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.162882 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.163738 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.164919 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-x7tj8" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.179705 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.211111 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\") " pod="openstack/aodh-0" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.211366 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-m9wlg\" (UniqueName: \"kubernetes.io/projected/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-kube-api-access-m9wlg\") pod \"aodh-0\" (UID: \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\") " pod="openstack/aodh-0" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.211426 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-config-data\") pod \"aodh-0\" (UID: \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\") " pod="openstack/aodh-0" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.211509 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-scripts\") pod \"aodh-0\" (UID: \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\") " pod="openstack/aodh-0" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.313875 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9wlg\" (UniqueName: \"kubernetes.io/projected/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-kube-api-access-m9wlg\") pod \"aodh-0\" (UID: \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\") " pod="openstack/aodh-0" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.314800 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-config-data\") pod \"aodh-0\" (UID: \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\") " pod="openstack/aodh-0" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.315019 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-scripts\") pod \"aodh-0\" (UID: \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\") " pod="openstack/aodh-0" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 
13:48:36.315429 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\") " pod="openstack/aodh-0" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.332072 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-scripts\") pod \"aodh-0\" (UID: \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\") " pod="openstack/aodh-0" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.337310 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-config-data\") pod \"aodh-0\" (UID: \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\") " pod="openstack/aodh-0" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.338637 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-combined-ca-bundle\") pod \"aodh-0\" (UID: \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\") " pod="openstack/aodh-0" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.339162 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9wlg\" (UniqueName: \"kubernetes.io/projected/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-kube-api-access-m9wlg\") pod \"aodh-0\" (UID: \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\") " pod="openstack/aodh-0" Mar 20 13:48:36 crc kubenswrapper[4973]: I0320 13:48:36.491961 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 20 13:48:37 crc kubenswrapper[4973]: I0320 13:48:37.133074 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 20 13:48:37 crc kubenswrapper[4973]: W0320 13:48:37.135964 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cf232ba_9dbc_4c57_bbb0_5e3a5d8a0d46.slice/crio-d8c51111f45110bd9bb6e173933059ee874bebf25ee9434448e432cf552f91a9 WatchSource:0}: Error finding container d8c51111f45110bd9bb6e173933059ee874bebf25ee9434448e432cf552f91a9: Status 404 returned error can't find the container with id d8c51111f45110bd9bb6e173933059ee874bebf25ee9434448e432cf552f91a9 Mar 20 13:48:37 crc kubenswrapper[4973]: I0320 13:48:37.287465 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46","Type":"ContainerStarted","Data":"d8c51111f45110bd9bb6e173933059ee874bebf25ee9434448e432cf552f91a9"} Mar 20 13:48:37 crc kubenswrapper[4973]: I0320 13:48:37.625947 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 20 13:48:38 crc kubenswrapper[4973]: I0320 13:48:38.311657 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46","Type":"ContainerStarted","Data":"b98f68d8ba06a959ce32737a7dec8b2c4bf1c9941812fc35ceae360bae8902e7"} Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.136264 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-xm9xj"] Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.138313 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xm9xj" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.141656 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.141713 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.149436 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xm9xj"] Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.288000 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-scripts\") pod \"nova-cell0-cell-mapping-xm9xj\" (UID: \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\") " pod="openstack/nova-cell0-cell-mapping-xm9xj" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.288380 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-config-data\") pod \"nova-cell0-cell-mapping-xm9xj\" (UID: \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\") " pod="openstack/nova-cell0-cell-mapping-xm9xj" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.288435 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drb4t\" (UniqueName: \"kubernetes.io/projected/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-kube-api-access-drb4t\") pod \"nova-cell0-cell-mapping-xm9xj\" (UID: \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\") " pod="openstack/nova-cell0-cell-mapping-xm9xj" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.288461 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xm9xj\" (UID: \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\") " pod="openstack/nova-cell0-cell-mapping-xm9xj" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.346227 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.348213 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.353740 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.375838 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.400976 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-scripts\") pod \"nova-cell0-cell-mapping-xm9xj\" (UID: \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\") " pod="openstack/nova-cell0-cell-mapping-xm9xj" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.401053 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-config-data\") pod \"nova-cell0-cell-mapping-xm9xj\" (UID: \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\") " pod="openstack/nova-cell0-cell-mapping-xm9xj" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.401112 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drb4t\" (UniqueName: \"kubernetes.io/projected/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-kube-api-access-drb4t\") pod \"nova-cell0-cell-mapping-xm9xj\" (UID: \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\") " 
pod="openstack/nova-cell0-cell-mapping-xm9xj" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.401140 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xm9xj\" (UID: \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\") " pod="openstack/nova-cell0-cell-mapping-xm9xj" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.421294 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-config-data\") pod \"nova-cell0-cell-mapping-xm9xj\" (UID: \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\") " pod="openstack/nova-cell0-cell-mapping-xm9xj" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.434636 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-scripts\") pod \"nova-cell0-cell-mapping-xm9xj\" (UID: \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\") " pod="openstack/nova-cell0-cell-mapping-xm9xj" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.435317 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-xm9xj\" (UID: \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\") " pod="openstack/nova-cell0-cell-mapping-xm9xj" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.468581 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.470463 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.480636 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.503985 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ed744e-9125-4240-aff5-2e2c0dd1769f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\") " pod="openstack/nova-metadata-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.504270 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68gb7\" (UniqueName: \"kubernetes.io/projected/a6ed744e-9125-4240-aff5-2e2c0dd1769f-kube-api-access-68gb7\") pod \"nova-metadata-0\" (UID: \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\") " pod="openstack/nova-metadata-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.504560 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ed744e-9125-4240-aff5-2e2c0dd1769f-config-data\") pod \"nova-metadata-0\" (UID: \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\") " pod="openstack/nova-metadata-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.504643 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6ed744e-9125-4240-aff5-2e2c0dd1769f-logs\") pod \"nova-metadata-0\" (UID: \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\") " pod="openstack/nova-metadata-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.510013 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drb4t\" (UniqueName: \"kubernetes.io/projected/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-kube-api-access-drb4t\") 
pod \"nova-cell0-cell-mapping-xm9xj\" (UID: \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\") " pod="openstack/nova-cell0-cell-mapping-xm9xj" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.526453 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.557415 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.559300 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.562177 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.589408 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.610349 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67aae42c-b00a-4295-873e-f9b56094726a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"67aae42c-b00a-4295-873e-f9b56094726a\") " pod="openstack/nova-api-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.610468 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67aae42c-b00a-4295-873e-f9b56094726a-logs\") pod \"nova-api-0\" (UID: \"67aae42c-b00a-4295-873e-f9b56094726a\") " pod="openstack/nova-api-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.610494 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67aae42c-b00a-4295-873e-f9b56094726a-config-data\") pod \"nova-api-0\" (UID: \"67aae42c-b00a-4295-873e-f9b56094726a\") " 
pod="openstack/nova-api-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.610580 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ed744e-9125-4240-aff5-2e2c0dd1769f-config-data\") pod \"nova-metadata-0\" (UID: \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\") " pod="openstack/nova-metadata-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.610608 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6ed744e-9125-4240-aff5-2e2c0dd1769f-logs\") pod \"nova-metadata-0\" (UID: \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\") " pod="openstack/nova-metadata-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.610709 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wpgz\" (UniqueName: \"kubernetes.io/projected/67aae42c-b00a-4295-873e-f9b56094726a-kube-api-access-8wpgz\") pod \"nova-api-0\" (UID: \"67aae42c-b00a-4295-873e-f9b56094726a\") " pod="openstack/nova-api-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.610732 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ed744e-9125-4240-aff5-2e2c0dd1769f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\") " pod="openstack/nova-metadata-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.610755 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68gb7\" (UniqueName: \"kubernetes.io/projected/a6ed744e-9125-4240-aff5-2e2c0dd1769f-kube-api-access-68gb7\") pod \"nova-metadata-0\" (UID: \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\") " pod="openstack/nova-metadata-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.611391 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/a6ed744e-9125-4240-aff5-2e2c0dd1769f-logs\") pod \"nova-metadata-0\" (UID: \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\") " pod="openstack/nova-metadata-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.647859 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ed744e-9125-4240-aff5-2e2c0dd1769f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\") " pod="openstack/nova-metadata-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.651297 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ed744e-9125-4240-aff5-2e2c0dd1769f-config-data\") pod \"nova-metadata-0\" (UID: \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\") " pod="openstack/nova-metadata-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.660490 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68gb7\" (UniqueName: \"kubernetes.io/projected/a6ed744e-9125-4240-aff5-2e2c0dd1769f-kube-api-access-68gb7\") pod \"nova-metadata-0\" (UID: \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\") " pod="openstack/nova-metadata-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.710959 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.712470 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx6lf\" (UniqueName: \"kubernetes.io/projected/69df30f8-12b3-40e8-b880-8344d2c737b3-kube-api-access-dx6lf\") pod \"nova-scheduler-0\" (UID: \"69df30f8-12b3-40e8-b880-8344d2c737b3\") " pod="openstack/nova-scheduler-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.712525 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69df30f8-12b3-40e8-b880-8344d2c737b3-config-data\") pod \"nova-scheduler-0\" (UID: \"69df30f8-12b3-40e8-b880-8344d2c737b3\") " pod="openstack/nova-scheduler-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.712597 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wpgz\" (UniqueName: \"kubernetes.io/projected/67aae42c-b00a-4295-873e-f9b56094726a-kube-api-access-8wpgz\") pod \"nova-api-0\" (UID: \"67aae42c-b00a-4295-873e-f9b56094726a\") " pod="openstack/nova-api-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.712624 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67aae42c-b00a-4295-873e-f9b56094726a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"67aae42c-b00a-4295-873e-f9b56094726a\") " pod="openstack/nova-api-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.712682 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69df30f8-12b3-40e8-b880-8344d2c737b3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"69df30f8-12b3-40e8-b880-8344d2c737b3\") " pod="openstack/nova-scheduler-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 
13:48:39.712757 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67aae42c-b00a-4295-873e-f9b56094726a-logs\") pod \"nova-api-0\" (UID: \"67aae42c-b00a-4295-873e-f9b56094726a\") " pod="openstack/nova-api-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.712785 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67aae42c-b00a-4295-873e-f9b56094726a-config-data\") pod \"nova-api-0\" (UID: \"67aae42c-b00a-4295-873e-f9b56094726a\") " pod="openstack/nova-api-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.716789 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67aae42c-b00a-4295-873e-f9b56094726a-logs\") pod \"nova-api-0\" (UID: \"67aae42c-b00a-4295-873e-f9b56094726a\") " pod="openstack/nova-api-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.734060 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67aae42c-b00a-4295-873e-f9b56094726a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"67aae42c-b00a-4295-873e-f9b56094726a\") " pod="openstack/nova-api-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.734131 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.736928 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67aae42c-b00a-4295-873e-f9b56094726a-config-data\") pod \"nova-api-0\" (UID: \"67aae42c-b00a-4295-873e-f9b56094726a\") " pod="openstack/nova-api-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.737849 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.742597 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.753700 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wpgz\" (UniqueName: \"kubernetes.io/projected/67aae42c-b00a-4295-873e-f9b56094726a-kube-api-access-8wpgz\") pod \"nova-api-0\" (UID: \"67aae42c-b00a-4295-873e-f9b56094726a\") " pod="openstack/nova-api-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.768109 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.770705 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xm9xj" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.806020 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.820251 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx6lf\" (UniqueName: \"kubernetes.io/projected/69df30f8-12b3-40e8-b880-8344d2c737b3-kube-api-access-dx6lf\") pod \"nova-scheduler-0\" (UID: \"69df30f8-12b3-40e8-b880-8344d2c737b3\") " pod="openstack/nova-scheduler-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.820515 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69df30f8-12b3-40e8-b880-8344d2c737b3-config-data\") pod \"nova-scheduler-0\" (UID: \"69df30f8-12b3-40e8-b880-8344d2c737b3\") " pod="openstack/nova-scheduler-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.820622 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126a68e5-2c8b-4341-bf31-7d760b77cf8b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"126a68e5-2c8b-4341-bf31-7d760b77cf8b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.820768 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h57jm\" (UniqueName: \"kubernetes.io/projected/126a68e5-2c8b-4341-bf31-7d760b77cf8b-kube-api-access-h57jm\") pod \"nova-cell1-novncproxy-0\" (UID: \"126a68e5-2c8b-4341-bf31-7d760b77cf8b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.821097 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126a68e5-2c8b-4341-bf31-7d760b77cf8b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"126a68e5-2c8b-4341-bf31-7d760b77cf8b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.821167 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69df30f8-12b3-40e8-b880-8344d2c737b3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"69df30f8-12b3-40e8-b880-8344d2c737b3\") " pod="openstack/nova-scheduler-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.835490 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69df30f8-12b3-40e8-b880-8344d2c737b3-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"69df30f8-12b3-40e8-b880-8344d2c737b3\") " pod="openstack/nova-scheduler-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.839255 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/69df30f8-12b3-40e8-b880-8344d2c737b3-config-data\") pod \"nova-scheduler-0\" (UID: \"69df30f8-12b3-40e8-b880-8344d2c737b3\") " pod="openstack/nova-scheduler-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.860441 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx6lf\" (UniqueName: \"kubernetes.io/projected/69df30f8-12b3-40e8-b880-8344d2c737b3-kube-api-access-dx6lf\") pod \"nova-scheduler-0\" (UID: \"69df30f8-12b3-40e8-b880-8344d2c737b3\") " pod="openstack/nova-scheduler-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.869415 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-jh6ck"] Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.871949 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.911621 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-jh6ck"] Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.924810 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126a68e5-2c8b-4341-bf31-7d760b77cf8b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"126a68e5-2c8b-4341-bf31-7d760b77cf8b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.925054 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126a68e5-2c8b-4341-bf31-7d760b77cf8b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"126a68e5-2c8b-4341-bf31-7d760b77cf8b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.925126 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h57jm\" (UniqueName: 
\"kubernetes.io/projected/126a68e5-2c8b-4341-bf31-7d760b77cf8b-kube-api-access-h57jm\") pod \"nova-cell1-novncproxy-0\" (UID: \"126a68e5-2c8b-4341-bf31-7d760b77cf8b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.944257 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126a68e5-2c8b-4341-bf31-7d760b77cf8b-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"126a68e5-2c8b-4341-bf31-7d760b77cf8b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.949967 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126a68e5-2c8b-4341-bf31-7d760b77cf8b-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"126a68e5-2c8b-4341-bf31-7d760b77cf8b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:48:39 crc kubenswrapper[4973]: I0320 13:48:39.960294 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h57jm\" (UniqueName: \"kubernetes.io/projected/126a68e5-2c8b-4341-bf31-7d760b77cf8b-kube-api-access-h57jm\") pod \"nova-cell1-novncproxy-0\" (UID: \"126a68e5-2c8b-4341-bf31-7d760b77cf8b\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.011301 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vrbph" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.012186 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vrbph" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.029336 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-ovsdbserver-nb\") pod 
\"dnsmasq-dns-5fbc4d444f-jh6ck\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.029466 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wl4bz\" (UniqueName: \"kubernetes.io/projected/7369f281-2d8f-4609-b027-d5efa15e5567-kube-api-access-wl4bz\") pod \"dnsmasq-dns-5fbc4d444f-jh6ck\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.029581 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-dns-swift-storage-0\") pod \"dnsmasq-dns-5fbc4d444f-jh6ck\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.029629 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbc4d444f-jh6ck\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.029649 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-dns-svc\") pod \"dnsmasq-dns-5fbc4d444f-jh6ck\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.029729 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-config\") pod \"dnsmasq-dns-5fbc4d444f-jh6ck\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.112393 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.131447 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-dns-swift-storage-0\") pod \"dnsmasq-dns-5fbc4d444f-jh6ck\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.131521 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbc4d444f-jh6ck\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.131548 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-dns-svc\") pod \"dnsmasq-dns-5fbc4d444f-jh6ck\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.131624 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-config\") pod \"dnsmasq-dns-5fbc4d444f-jh6ck\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.131700 4973 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbc4d444f-jh6ck\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.131774 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wl4bz\" (UniqueName: \"kubernetes.io/projected/7369f281-2d8f-4609-b027-d5efa15e5567-kube-api-access-wl4bz\") pod \"dnsmasq-dns-5fbc4d444f-jh6ck\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.133023 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-dns-swift-storage-0\") pod \"dnsmasq-dns-5fbc4d444f-jh6ck\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.133363 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-dns-svc\") pod \"dnsmasq-dns-5fbc4d444f-jh6ck\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.133560 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-config\") pod \"dnsmasq-dns-5fbc4d444f-jh6ck\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.133944 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-ovsdbserver-sb\") pod \"dnsmasq-dns-5fbc4d444f-jh6ck\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.134069 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-ovsdbserver-nb\") pod \"dnsmasq-dns-5fbc4d444f-jh6ck\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.155680 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wl4bz\" (UniqueName: \"kubernetes.io/projected/7369f281-2d8f-4609-b027-d5efa15e5567-kube-api-access-wl4bz\") pod \"dnsmasq-dns-5fbc4d444f-jh6ck\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.166703 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.226433 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.239979 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vrbph" Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.520945 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:48:40 crc kubenswrapper[4973]: I0320 13:48:40.721442 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vrbph" Mar 20 13:48:41 crc kubenswrapper[4973]: I0320 13:48:41.093721 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrbph"] Mar 20 13:48:41 crc kubenswrapper[4973]: I0320 13:48:41.278562 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-xm9xj"] Mar 20 13:48:41 crc kubenswrapper[4973]: I0320 13:48:41.363918 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:48:41 crc kubenswrapper[4973]: I0320 13:48:41.452973 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a6ed744e-9125-4240-aff5-2e2c0dd1769f","Type":"ContainerStarted","Data":"2994ce24d9f73110933fb68540a2be8c45b0e31c5df6489adaff88239551b539"} Mar 20 13:48:41 crc kubenswrapper[4973]: I0320 13:48:41.662874 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:48:41 crc kubenswrapper[4973]: I0320 13:48:41.706505 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-jh6ck"] Mar 20 13:48:41 crc kubenswrapper[4973]: I0320 13:48:41.802653 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.004164 4973 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell1-conductor-db-sync-wwf69"] Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.012306 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wwf69" Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.019281 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wwf69"] Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.019799 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.020557 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.123637 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wwf69\" (UID: \"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\") " pod="openstack/nova-cell1-conductor-db-sync-wwf69" Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.123847 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-config-data\") pod \"nova-cell1-conductor-db-sync-wwf69\" (UID: \"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\") " pod="openstack/nova-cell1-conductor-db-sync-wwf69" Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.123962 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-scripts\") pod \"nova-cell1-conductor-db-sync-wwf69\" (UID: \"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\") " 
pod="openstack/nova-cell1-conductor-db-sync-wwf69" Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.124070 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmcxj\" (UniqueName: \"kubernetes.io/projected/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-kube-api-access-lmcxj\") pod \"nova-cell1-conductor-db-sync-wwf69\" (UID: \"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\") " pod="openstack/nova-cell1-conductor-db-sync-wwf69" Mar 20 13:48:42 crc kubenswrapper[4973]: W0320 13:48:42.172913 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67aae42c_b00a_4295_873e_f9b56094726a.slice/crio-87a9145af23f9562f852ebd49145e65346ec658d5f711eaf1511aadd26da7e9a WatchSource:0}: Error finding container 87a9145af23f9562f852ebd49145e65346ec658d5f711eaf1511aadd26da7e9a: Status 404 returned error can't find the container with id 87a9145af23f9562f852ebd49145e65346ec658d5f711eaf1511aadd26da7e9a Mar 20 13:48:42 crc kubenswrapper[4973]: W0320 13:48:42.173482 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod126a68e5_2c8b_4341_bf31_7d760b77cf8b.slice/crio-bd3102ee204df8c1c0ccab746ba3f987ea262bbd2e3bebdeb233062593886112 WatchSource:0}: Error finding container bd3102ee204df8c1c0ccab746ba3f987ea262bbd2e3bebdeb233062593886112: Status 404 returned error can't find the container with id bd3102ee204df8c1c0ccab746ba3f987ea262bbd2e3bebdeb233062593886112 Mar 20 13:48:42 crc kubenswrapper[4973]: W0320 13:48:42.181712 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69df30f8_12b3_40e8_b880_8344d2c737b3.slice/crio-67351248a00400395ffebb790a41e0f74fe515d810d876a131658b9ab7de374a WatchSource:0}: Error finding container 67351248a00400395ffebb790a41e0f74fe515d810d876a131658b9ab7de374a: Status 404 
returned error can't find the container with id 67351248a00400395ffebb790a41e0f74fe515d810d876a131658b9ab7de374a Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.245270 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmcxj\" (UniqueName: \"kubernetes.io/projected/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-kube-api-access-lmcxj\") pod \"nova-cell1-conductor-db-sync-wwf69\" (UID: \"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\") " pod="openstack/nova-cell1-conductor-db-sync-wwf69" Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.245373 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wwf69\" (UID: \"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\") " pod="openstack/nova-cell1-conductor-db-sync-wwf69" Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.245592 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-config-data\") pod \"nova-cell1-conductor-db-sync-wwf69\" (UID: \"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\") " pod="openstack/nova-cell1-conductor-db-sync-wwf69" Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.245778 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-scripts\") pod \"nova-cell1-conductor-db-sync-wwf69\" (UID: \"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\") " pod="openstack/nova-cell1-conductor-db-sync-wwf69" Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.256633 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-scripts\") pod \"nova-cell1-conductor-db-sync-wwf69\" (UID: 
\"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\") " pod="openstack/nova-cell1-conductor-db-sync-wwf69" Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.256738 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-config-data\") pod \"nova-cell1-conductor-db-sync-wwf69\" (UID: \"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\") " pod="openstack/nova-cell1-conductor-db-sync-wwf69" Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.259694 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-wwf69\" (UID: \"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\") " pod="openstack/nova-cell1-conductor-db-sync-wwf69" Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.265413 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmcxj\" (UniqueName: \"kubernetes.io/projected/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-kube-api-access-lmcxj\") pod \"nova-cell1-conductor-db-sync-wwf69\" (UID: \"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\") " pod="openstack/nova-cell1-conductor-db-sync-wwf69" Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.359488 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wwf69" Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.468383 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.468752 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="708f94e3-7737-454c-845c-02ed42251525" containerName="ceilometer-central-agent" containerID="cri-o://4d5149ed2679f8e455a19d14a3198c2dac8f0a9b826e28719d86313f43dfb172" gracePeriod=30 Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.469198 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="708f94e3-7737-454c-845c-02ed42251525" containerName="proxy-httpd" containerID="cri-o://e271186527aecc254c0001e58f8960ac5a957176ab7d2af16516e252d235f756" gracePeriod=30 Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.469263 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="708f94e3-7737-454c-845c-02ed42251525" containerName="sg-core" containerID="cri-o://efd32cc0610733774e53115cb30af510709979d519ff1929472aa92bbe0ffb3d" gracePeriod=30 Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.469297 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="708f94e3-7737-454c-845c-02ed42251525" containerName="ceilometer-notification-agent" containerID="cri-o://ca1fb82cf3cd4409ba21b65b629fc3bd47925be0d5a2d279d022c6b8dd8b8421" gracePeriod=30 Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.478263 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.482708 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"69df30f8-12b3-40e8-b880-8344d2c737b3","Type":"ContainerStarted","Data":"67351248a00400395ffebb790a41e0f74fe515d810d876a131658b9ab7de374a"} Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.492821 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xm9xj" event={"ID":"6b9f729e-dda0-4ad0-a8fc-3f0365b27947","Type":"ContainerStarted","Data":"ed424703e334f40597296f94c6e68f1ed675a6909d4a88c40b91ae268ede90ec"} Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.512460 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" event={"ID":"7369f281-2d8f-4609-b027-d5efa15e5567","Type":"ContainerStarted","Data":"e8cf3b3679f01de833657f63a0d5f04df9c77bf17775cc353454d8d4b18f3cfa"} Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.517775 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"126a68e5-2c8b-4341-bf31-7d760b77cf8b","Type":"ContainerStarted","Data":"bd3102ee204df8c1c0ccab746ba3f987ea262bbd2e3bebdeb233062593886112"} Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.531262 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67aae42c-b00a-4295-873e-f9b56094726a","Type":"ContainerStarted","Data":"87a9145af23f9562f852ebd49145e65346ec658d5f711eaf1511aadd26da7e9a"} Mar 20 13:48:42 crc kubenswrapper[4973]: I0320 13:48:42.531399 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vrbph" podUID="40aa92f8-240c-4de8-a782-1943d2dbae21" containerName="registry-server" containerID="cri-o://bc8c270323f137631d8c40ae910a63bdd79716db75f2ef02f758a7b7ebeb1ae5" gracePeriod=2 Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.077759 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wwf69"] Mar 20 13:48:43 crc kubenswrapper[4973]: W0320 13:48:43.104610 4973 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod90d5b5b7_ec1d_42bd_bd12_3cbf50d09e65.slice/crio-c6f7605e5910a27b303b78d055f9093e76e43dc6fcdcd388dbc163ec83c8ae3d WatchSource:0}: Error finding container c6f7605e5910a27b303b78d055f9093e76e43dc6fcdcd388dbc163ec83c8ae3d: Status 404 returned error can't find the container with id c6f7605e5910a27b303b78d055f9093e76e43dc6fcdcd388dbc163ec83c8ae3d Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.552521 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wwf69" event={"ID":"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65","Type":"ContainerStarted","Data":"c6f7605e5910a27b303b78d055f9093e76e43dc6fcdcd388dbc163ec83c8ae3d"} Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.605596 4973 generic.go:334] "Generic (PLEG): container finished" podID="708f94e3-7737-454c-845c-02ed42251525" containerID="e271186527aecc254c0001e58f8960ac5a957176ab7d2af16516e252d235f756" exitCode=0 Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.605908 4973 generic.go:334] "Generic (PLEG): container finished" podID="708f94e3-7737-454c-845c-02ed42251525" containerID="efd32cc0610733774e53115cb30af510709979d519ff1929472aa92bbe0ffb3d" exitCode=2 Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.605918 4973 generic.go:334] "Generic (PLEG): container finished" podID="708f94e3-7737-454c-845c-02ed42251525" containerID="4d5149ed2679f8e455a19d14a3198c2dac8f0a9b826e28719d86313f43dfb172" exitCode=0 Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.605981 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"708f94e3-7737-454c-845c-02ed42251525","Type":"ContainerDied","Data":"e271186527aecc254c0001e58f8960ac5a957176ab7d2af16516e252d235f756"} Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.606009 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"708f94e3-7737-454c-845c-02ed42251525","Type":"ContainerDied","Data":"efd32cc0610733774e53115cb30af510709979d519ff1929472aa92bbe0ffb3d"} Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.606021 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"708f94e3-7737-454c-845c-02ed42251525","Type":"ContainerDied","Data":"4d5149ed2679f8e455a19d14a3198c2dac8f0a9b826e28719d86313f43dfb172"} Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.621739 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrbph" Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.662042 4973 generic.go:334] "Generic (PLEG): container finished" podID="7369f281-2d8f-4609-b027-d5efa15e5567" containerID="493fb2840749d3c1b41b27aeb579cd78973c04df87c08d5d149a143a7b082093" exitCode=0 Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.662199 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" event={"ID":"7369f281-2d8f-4609-b027-d5efa15e5567","Type":"ContainerDied","Data":"493fb2840749d3c1b41b27aeb579cd78973c04df87c08d5d149a143a7b082093"} Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.711538 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5tcm\" (UniqueName: \"kubernetes.io/projected/40aa92f8-240c-4de8-a782-1943d2dbae21-kube-api-access-n5tcm\") pod \"40aa92f8-240c-4de8-a782-1943d2dbae21\" (UID: \"40aa92f8-240c-4de8-a782-1943d2dbae21\") " Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.711726 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40aa92f8-240c-4de8-a782-1943d2dbae21-catalog-content\") pod \"40aa92f8-240c-4de8-a782-1943d2dbae21\" (UID: \"40aa92f8-240c-4de8-a782-1943d2dbae21\") " Mar 20 13:48:43 crc 
kubenswrapper[4973]: I0320 13:48:43.711882 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40aa92f8-240c-4de8-a782-1943d2dbae21-utilities\") pod \"40aa92f8-240c-4de8-a782-1943d2dbae21\" (UID: \"40aa92f8-240c-4de8-a782-1943d2dbae21\") " Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.717465 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40aa92f8-240c-4de8-a782-1943d2dbae21-utilities" (OuterVolumeSpecName: "utilities") pod "40aa92f8-240c-4de8-a782-1943d2dbae21" (UID: "40aa92f8-240c-4de8-a782-1943d2dbae21"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.746539 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/40aa92f8-240c-4de8-a782-1943d2dbae21-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.756322 4973 generic.go:334] "Generic (PLEG): container finished" podID="40aa92f8-240c-4de8-a782-1943d2dbae21" containerID="bc8c270323f137631d8c40ae910a63bdd79716db75f2ef02f758a7b7ebeb1ae5" exitCode=0 Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.756451 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrbph" event={"ID":"40aa92f8-240c-4de8-a782-1943d2dbae21","Type":"ContainerDied","Data":"bc8c270323f137631d8c40ae910a63bdd79716db75f2ef02f758a7b7ebeb1ae5"} Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.756482 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vrbph" event={"ID":"40aa92f8-240c-4de8-a782-1943d2dbae21","Type":"ContainerDied","Data":"9502c6a35d24fdc877f13f633c3dc64e58eda3000894c616fd85683d67e1a973"} Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.756499 4973 scope.go:117] 
"RemoveContainer" containerID="bc8c270323f137631d8c40ae910a63bdd79716db75f2ef02f758a7b7ebeb1ae5" Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.756711 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vrbph" Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.765799 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40aa92f8-240c-4de8-a782-1943d2dbae21-kube-api-access-n5tcm" (OuterVolumeSpecName: "kube-api-access-n5tcm") pod "40aa92f8-240c-4de8-a782-1943d2dbae21" (UID: "40aa92f8-240c-4de8-a782-1943d2dbae21"). InnerVolumeSpecName "kube-api-access-n5tcm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.818547 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40aa92f8-240c-4de8-a782-1943d2dbae21-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "40aa92f8-240c-4de8-a782-1943d2dbae21" (UID: "40aa92f8-240c-4de8-a782-1943d2dbae21"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.819139 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xm9xj" event={"ID":"6b9f729e-dda0-4ad0-a8fc-3f0365b27947","Type":"ContainerStarted","Data":"8d7d87ff2805313825bce4cad26271bdd381c5ca0531450eebf153c6a88d38d9"} Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.850562 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5tcm\" (UniqueName: \"kubernetes.io/projected/40aa92f8-240c-4de8-a782-1943d2dbae21-kube-api-access-n5tcm\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.850594 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/40aa92f8-240c-4de8-a782-1943d2dbae21-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.900879 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-xm9xj" podStartSLOduration=4.900851665 podStartE2EDuration="4.900851665s" podCreationTimestamp="2026-03-20 13:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:43.863144975 +0000 UTC m=+1644.606814729" watchObservedRunningTime="2026-03-20 13:48:43.900851665 +0000 UTC m=+1644.644521409" Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.920066 4973 scope.go:117] "RemoveContainer" containerID="3d215c24cde528c98861101f51a10d99eb7015ff1b68ea459cacc4df4af87d99" Mar 20 13:48:43 crc kubenswrapper[4973]: I0320 13:48:43.955146 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:48:43 crc kubenswrapper[4973]: E0320 13:48:43.956915 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.044362 4973 scope.go:117] "RemoveContainer" containerID="2fb15389346a9b5b39b40354d75461baa244d7e557de6b447dada4df9686ded5"
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.098821 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrbph"]
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.099037 4973 scope.go:117] "RemoveContainer" containerID="bc8c270323f137631d8c40ae910a63bdd79716db75f2ef02f758a7b7ebeb1ae5"
Mar 20 13:48:44 crc kubenswrapper[4973]: E0320 13:48:44.099894 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc8c270323f137631d8c40ae910a63bdd79716db75f2ef02f758a7b7ebeb1ae5\": container with ID starting with bc8c270323f137631d8c40ae910a63bdd79716db75f2ef02f758a7b7ebeb1ae5 not found: ID does not exist" containerID="bc8c270323f137631d8c40ae910a63bdd79716db75f2ef02f758a7b7ebeb1ae5"
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.099931 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc8c270323f137631d8c40ae910a63bdd79716db75f2ef02f758a7b7ebeb1ae5"} err="failed to get container status \"bc8c270323f137631d8c40ae910a63bdd79716db75f2ef02f758a7b7ebeb1ae5\": rpc error: code = NotFound desc = could not find container \"bc8c270323f137631d8c40ae910a63bdd79716db75f2ef02f758a7b7ebeb1ae5\": container with ID starting with bc8c270323f137631d8c40ae910a63bdd79716db75f2ef02f758a7b7ebeb1ae5 not found: ID does not exist"
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.099959 4973 scope.go:117] "RemoveContainer" containerID="3d215c24cde528c98861101f51a10d99eb7015ff1b68ea459cacc4df4af87d99"
Mar 20 13:48:44 crc kubenswrapper[4973]: E0320 13:48:44.100546 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d215c24cde528c98861101f51a10d99eb7015ff1b68ea459cacc4df4af87d99\": container with ID starting with 3d215c24cde528c98861101f51a10d99eb7015ff1b68ea459cacc4df4af87d99 not found: ID does not exist" containerID="3d215c24cde528c98861101f51a10d99eb7015ff1b68ea459cacc4df4af87d99"
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.100567 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d215c24cde528c98861101f51a10d99eb7015ff1b68ea459cacc4df4af87d99"} err="failed to get container status \"3d215c24cde528c98861101f51a10d99eb7015ff1b68ea459cacc4df4af87d99\": rpc error: code = NotFound desc = could not find container \"3d215c24cde528c98861101f51a10d99eb7015ff1b68ea459cacc4df4af87d99\": container with ID starting with 3d215c24cde528c98861101f51a10d99eb7015ff1b68ea459cacc4df4af87d99 not found: ID does not exist"
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.100590 4973 scope.go:117] "RemoveContainer" containerID="2fb15389346a9b5b39b40354d75461baa244d7e557de6b447dada4df9686ded5"
Mar 20 13:48:44 crc kubenswrapper[4973]: E0320 13:48:44.101211 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fb15389346a9b5b39b40354d75461baa244d7e557de6b447dada4df9686ded5\": container with ID starting with 2fb15389346a9b5b39b40354d75461baa244d7e557de6b447dada4df9686ded5 not found: ID does not exist" containerID="2fb15389346a9b5b39b40354d75461baa244d7e557de6b447dada4df9686ded5"
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.101249 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fb15389346a9b5b39b40354d75461baa244d7e557de6b447dada4df9686ded5"} err="failed to get container status \"2fb15389346a9b5b39b40354d75461baa244d7e557de6b447dada4df9686ded5\": rpc error: code = NotFound desc = could not find container \"2fb15389346a9b5b39b40354d75461baa244d7e557de6b447dada4df9686ded5\": container with ID starting with 2fb15389346a9b5b39b40354d75461baa244d7e557de6b447dada4df9686ded5 not found: ID does not exist"
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.122571 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vrbph"]
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.563176 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.605336 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.853515 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" event={"ID":"7369f281-2d8f-4609-b027-d5efa15e5567","Type":"ContainerStarted","Data":"4c68a8e2b0d801372785724d8bbcb0d7b739bfe828ffdce539d96659261e782b"}
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.854078 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck"
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.866701 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46","Type":"ContainerStarted","Data":"db5af94fafe96b7e1e64b783aa04c6c31da69cb7d2ba7f14e326d73f7848dbf0"}
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.875806 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wwf69" event={"ID":"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65","Type":"ContainerStarted","Data":"9421d4a8f747a4ff43be42f3ae0f3390f9170da9651133851f6efb1a308c24e5"}
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.886448 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" podStartSLOduration=5.886428093 podStartE2EDuration="5.886428093s" podCreationTimestamp="2026-03-20 13:48:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:44.885965321 +0000 UTC m=+1645.629635085" watchObservedRunningTime="2026-03-20 13:48:44.886428093 +0000 UTC m=+1645.630097847"
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.903192 4973 generic.go:334] "Generic (PLEG): container finished" podID="708f94e3-7737-454c-845c-02ed42251525" containerID="ca1fb82cf3cd4409ba21b65b629fc3bd47925be0d5a2d279d022c6b8dd8b8421" exitCode=0
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.904455 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"708f94e3-7737-454c-845c-02ed42251525","Type":"ContainerDied","Data":"ca1fb82cf3cd4409ba21b65b629fc3bd47925be0d5a2d279d022c6b8dd8b8421"}
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.904499 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"708f94e3-7737-454c-845c-02ed42251525","Type":"ContainerDied","Data":"e4d606fa01ede0292dea3e41878d9572cc45e141509a17000e3342283aef296b"}
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.904514 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4d606fa01ede0292dea3e41878d9572cc45e141509a17000e3342283aef296b"
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.914835 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.919004 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-wwf69" podStartSLOduration=3.918982903 podStartE2EDuration="3.918982903s" podCreationTimestamp="2026-03-20 13:48:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:44.9079051 +0000 UTC m=+1645.651574864" watchObservedRunningTime="2026-03-20 13:48:44.918982903 +0000 UTC m=+1645.662652657"
Mar 20 13:48:44 crc kubenswrapper[4973]: I0320 13:48:44.989317 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"]
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.105365 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-config-data\") pod \"708f94e3-7737-454c-845c-02ed42251525\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") "
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.105724 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/708f94e3-7737-454c-845c-02ed42251525-log-httpd\") pod \"708f94e3-7737-454c-845c-02ed42251525\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") "
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.105786 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlqj7\" (UniqueName: \"kubernetes.io/projected/708f94e3-7737-454c-845c-02ed42251525-kube-api-access-qlqj7\") pod \"708f94e3-7737-454c-845c-02ed42251525\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") "
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.105803 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-sg-core-conf-yaml\") pod \"708f94e3-7737-454c-845c-02ed42251525\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") "
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.105830 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-combined-ca-bundle\") pod \"708f94e3-7737-454c-845c-02ed42251525\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") "
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.105901 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-scripts\") pod \"708f94e3-7737-454c-845c-02ed42251525\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") "
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.105972 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/708f94e3-7737-454c-845c-02ed42251525-run-httpd\") pod \"708f94e3-7737-454c-845c-02ed42251525\" (UID: \"708f94e3-7737-454c-845c-02ed42251525\") "
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.107041 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/708f94e3-7737-454c-845c-02ed42251525-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "708f94e3-7737-454c-845c-02ed42251525" (UID: "708f94e3-7737-454c-845c-02ed42251525"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.113795 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/708f94e3-7737-454c-845c-02ed42251525-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "708f94e3-7737-454c-845c-02ed42251525" (UID: "708f94e3-7737-454c-845c-02ed42251525"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.157530 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/708f94e3-7737-454c-845c-02ed42251525-kube-api-access-qlqj7" (OuterVolumeSpecName: "kube-api-access-qlqj7") pod "708f94e3-7737-454c-845c-02ed42251525" (UID: "708f94e3-7737-454c-845c-02ed42251525"). InnerVolumeSpecName "kube-api-access-qlqj7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.158104 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-scripts" (OuterVolumeSpecName: "scripts") pod "708f94e3-7737-454c-845c-02ed42251525" (UID: "708f94e3-7737-454c-845c-02ed42251525"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.163481 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "708f94e3-7737-454c-845c-02ed42251525" (UID: "708f94e3-7737-454c-845c-02ed42251525"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.209278 4973 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/708f94e3-7737-454c-845c-02ed42251525-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.209326 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlqj7\" (UniqueName: \"kubernetes.io/projected/708f94e3-7737-454c-845c-02ed42251525-kube-api-access-qlqj7\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.209341 4973 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.209364 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.209375 4973 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/708f94e3-7737-454c-845c-02ed42251525-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.263492 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "708f94e3-7737-454c-845c-02ed42251525" (UID: "708f94e3-7737-454c-845c-02ed42251525"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.311932 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.352531 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-config-data" (OuterVolumeSpecName: "config-data") pod "708f94e3-7737-454c-845c-02ed42251525" (UID: "708f94e3-7737-454c-845c-02ed42251525"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.413870 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/708f94e3-7737-454c-845c-02ed42251525-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.917790 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.966729 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40aa92f8-240c-4de8-a782-1943d2dbae21" path="/var/lib/kubelet/pods/40aa92f8-240c-4de8-a782-1943d2dbae21/volumes"
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.979053 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:48:45 crc kubenswrapper[4973]: I0320 13:48:45.990753 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.027801 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:48:46 crc kubenswrapper[4973]: E0320 13:48:46.029252 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="708f94e3-7737-454c-845c-02ed42251525" containerName="sg-core"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.029283 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="708f94e3-7737-454c-845c-02ed42251525" containerName="sg-core"
Mar 20 13:48:46 crc kubenswrapper[4973]: E0320 13:48:46.029322 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40aa92f8-240c-4de8-a782-1943d2dbae21" containerName="extract-content"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.029331 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="40aa92f8-240c-4de8-a782-1943d2dbae21" containerName="extract-content"
Mar 20 13:48:46 crc kubenswrapper[4973]: E0320 13:48:46.029380 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40aa92f8-240c-4de8-a782-1943d2dbae21" containerName="registry-server"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.029391 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="40aa92f8-240c-4de8-a782-1943d2dbae21" containerName="registry-server"
Mar 20 13:48:46 crc kubenswrapper[4973]: E0320 13:48:46.029416 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="708f94e3-7737-454c-845c-02ed42251525" containerName="proxy-httpd"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.029424 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="708f94e3-7737-454c-845c-02ed42251525" containerName="proxy-httpd"
Mar 20 13:48:46 crc kubenswrapper[4973]: E0320 13:48:46.029452 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="708f94e3-7737-454c-845c-02ed42251525" containerName="ceilometer-notification-agent"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.029460 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="708f94e3-7737-454c-845c-02ed42251525" containerName="ceilometer-notification-agent"
Mar 20 13:48:46 crc kubenswrapper[4973]: E0320 13:48:46.029545 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="708f94e3-7737-454c-845c-02ed42251525" containerName="ceilometer-central-agent"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.029556 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="708f94e3-7737-454c-845c-02ed42251525" containerName="ceilometer-central-agent"
Mar 20 13:48:46 crc kubenswrapper[4973]: E0320 13:48:46.029591 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="40aa92f8-240c-4de8-a782-1943d2dbae21" containerName="extract-utilities"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.029604 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="40aa92f8-240c-4de8-a782-1943d2dbae21" containerName="extract-utilities"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.030835 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="40aa92f8-240c-4de8-a782-1943d2dbae21" containerName="registry-server"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.030873 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="708f94e3-7737-454c-845c-02ed42251525" containerName="sg-core"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.030894 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="708f94e3-7737-454c-845c-02ed42251525" containerName="ceilometer-central-agent"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.030923 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="708f94e3-7737-454c-845c-02ed42251525" containerName="ceilometer-notification-agent"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.030973 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="708f94e3-7737-454c-845c-02ed42251525" containerName="proxy-httpd"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.037195 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.041903 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.045839 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.090058 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.150964 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ef21c1e-3352-4660-9cf4-3023ee75c18b-log-httpd\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.152101 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-scripts\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.152299 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-config-data\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.152329 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.152473 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.152536 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsms7\" (UniqueName: \"kubernetes.io/projected/4ef21c1e-3352-4660-9cf4-3023ee75c18b-kube-api-access-fsms7\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.152575 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ef21c1e-3352-4660-9cf4-3023ee75c18b-run-httpd\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.257830 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-config-data\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.257872 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.257908 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.257945 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsms7\" (UniqueName: \"kubernetes.io/projected/4ef21c1e-3352-4660-9cf4-3023ee75c18b-kube-api-access-fsms7\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.257981 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ef21c1e-3352-4660-9cf4-3023ee75c18b-run-httpd\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.258064 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ef21c1e-3352-4660-9cf4-3023ee75c18b-log-httpd\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.258094 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-scripts\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.259548 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ef21c1e-3352-4660-9cf4-3023ee75c18b-run-httpd\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.260663 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ef21c1e-3352-4660-9cf4-3023ee75c18b-log-httpd\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.264177 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.269266 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.270396 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-config-data\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.278040 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-scripts\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.309259 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsms7\" (UniqueName: \"kubernetes.io/projected/4ef21c1e-3352-4660-9cf4-3023ee75c18b-kube-api-access-fsms7\") pod \"ceilometer-0\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " pod="openstack/ceilometer-0"
Mar 20 13:48:46 crc kubenswrapper[4973]: I0320 13:48:46.397135 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:48:47 crc kubenswrapper[4973]: I0320 13:48:47.901508 4973 scope.go:117] "RemoveContainer" containerID="413934feb0e3a7b1e8afa46eb5e13abbcfd0f3044499ba98c2be17557158c20d"
Mar 20 13:48:47 crc kubenswrapper[4973]: I0320 13:48:47.968036 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="708f94e3-7737-454c-845c-02ed42251525" path="/var/lib/kubelet/pods/708f94e3-7737-454c-845c-02ed42251525/volumes"
Mar 20 13:48:49 crc kubenswrapper[4973]: I0320 13:48:49.074658 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:48:49 crc kubenswrapper[4973]: W0320 13:48:49.213055 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ef21c1e_3352_4660_9cf4_3023ee75c18b.slice/crio-b90f7a65fbdfcb70d8ff5e3492cee1858a0b94fbb6ed2e8b3bfa6eb958d58a00 WatchSource:0}: Error finding container b90f7a65fbdfcb70d8ff5e3492cee1858a0b94fbb6ed2e8b3bfa6eb958d58a00: Status 404 returned error can't find the container with id b90f7a65fbdfcb70d8ff5e3492cee1858a0b94fbb6ed2e8b3bfa6eb958d58a00
Mar 20 13:48:49 crc kubenswrapper[4973]: I0320 13:48:49.213628 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:48:49 crc kubenswrapper[4973]: I0320 13:48:49.998764 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46","Type":"ContainerStarted","Data":"e18d7336844c3f073b75c20896cdc0d5a8453949f430ae1546a7eb68910ebe8f"}
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.014693 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67aae42c-b00a-4295-873e-f9b56094726a","Type":"ContainerStarted","Data":"50969a1aef77ee9053f5b3e4319958c446e8d86d04165333aab817d4a0260dc5"}
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.014831 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67aae42c-b00a-4295-873e-f9b56094726a","Type":"ContainerStarted","Data":"512fe11d9d468098f06d950031447e5ea75540f2b060e7fc602868ffcfa2f046"}
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.037510 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ef21c1e-3352-4660-9cf4-3023ee75c18b","Type":"ContainerStarted","Data":"b90f7a65fbdfcb70d8ff5e3492cee1858a0b94fbb6ed2e8b3bfa6eb958d58a00"}
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.086884 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"69df30f8-12b3-40e8-b880-8344d2c737b3","Type":"ContainerStarted","Data":"9c298c8308f05f7ac0b5adb31e4a36d978d1166a503a0d6c82e10688439fac93"}
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.095097 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a6ed744e-9125-4240-aff5-2e2c0dd1769f","Type":"ContainerStarted","Data":"6b796a20f6472dd350a1a16a6738986b104a8dea26b45f8e0d36ab8cb5de9bc6"}
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.095522 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a6ed744e-9125-4240-aff5-2e2c0dd1769f","Type":"ContainerStarted","Data":"adff2c9954126a5c98bfd2932478e0e16d5e53ed2e51141abc24a153e09c771b"}
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.095703 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a6ed744e-9125-4240-aff5-2e2c0dd1769f" containerName="nova-metadata-log" containerID="cri-o://adff2c9954126a5c98bfd2932478e0e16d5e53ed2e51141abc24a153e09c771b" gracePeriod=30
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.095961 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a6ed744e-9125-4240-aff5-2e2c0dd1769f" containerName="nova-metadata-metadata" containerID="cri-o://6b796a20f6472dd350a1a16a6738986b104a8dea26b45f8e0d36ab8cb5de9bc6" gracePeriod=30
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.111960 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"126a68e5-2c8b-4341-bf31-7d760b77cf8b","Type":"ContainerStarted","Data":"5e3ba363e3fc4d5912da83cd5aaf3c80923804667f2ddc04a7ff054d4972c2bb"}
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.112120 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="126a68e5-2c8b-4341-bf31-7d760b77cf8b" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://5e3ba363e3fc4d5912da83cd5aaf3c80923804667f2ddc04a7ff054d4972c2bb" gracePeriod=30
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.113843 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.113882 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.160695 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.623525369 podStartE2EDuration="11.160674587s" podCreationTimestamp="2026-03-20 13:48:39 +0000 UTC" firstStartedPulling="2026-03-20 13:48:42.177889395 +0000 UTC m=+1642.921559139" lastFinishedPulling="2026-03-20 13:48:48.715038623 +0000 UTC m=+1649.458708357" observedRunningTime="2026-03-20 13:48:50.151660221 +0000 UTC m=+1650.895329975" watchObservedRunningTime="2026-03-20 13:48:50.160674587 +0000 UTC m=+1650.904344331"
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.168920 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.182536 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=4.641996013 podStartE2EDuration="11.182514814s" podCreationTimestamp="2026-03-20 13:48:39 +0000 UTC" firstStartedPulling="2026-03-20 13:48:42.178261445 +0000 UTC m=+1642.921931189" lastFinishedPulling="2026-03-20 13:48:48.718780246 +0000 UTC m=+1649.462449990" observedRunningTime="2026-03-20 13:48:50.168733778 +0000 UTC m=+1650.912403522" watchObservedRunningTime="2026-03-20 13:48:50.182514814 +0000 UTC m=+1650.926184558"
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.195083 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.205768 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=4.6722515 podStartE2EDuration="11.205747939s" podCreationTimestamp="2026-03-20 13:48:39 +0000 UTC" firstStartedPulling="2026-03-20 13:48:42.18541691 +0000 UTC m=+1642.929086654" lastFinishedPulling="2026-03-20 13:48:48.718913349 +0000 UTC m=+1649.462583093" observedRunningTime="2026-03-20 13:48:50.188274761 +0000 UTC m=+1650.931944505" watchObservedRunningTime="2026-03-20 13:48:50.205747939 +0000 UTC m=+1650.949417683"
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.222137 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.226222995 podStartE2EDuration="11.222120826s" podCreationTimestamp="2026-03-20 13:48:39 +0000 UTC" firstStartedPulling="2026-03-20 13:48:40.721955979 +0000 UTC m=+1641.465625723" lastFinishedPulling="2026-03-20 13:48:48.71785381 +0000 UTC m=+1649.461523554" observedRunningTime="2026-03-20 13:48:50.211391133 +0000 UTC m=+1650.955060887" watchObservedRunningTime="2026-03-20 13:48:50.222120826 +0000 UTC m=+1650.965790570"
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.229475 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck"
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.296483 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-9v29d"]
Mar 20 13:48:50 crc kubenswrapper[4973]: I0320 13:48:50.296793 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" podUID="9d40d175-2e68-4d03-acd3-0a8ba7943b57" containerName="dnsmasq-dns" containerID="cri-o://4e2db8a75bf7a19158aaf6bf57274ad674bd6eff80cb774819b13847e9c9a1d7" gracePeriod=10
Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.152554 4973 generic.go:334] "Generic (PLEG): container finished" podID="9d40d175-2e68-4d03-acd3-0a8ba7943b57" containerID="4e2db8a75bf7a19158aaf6bf57274ad674bd6eff80cb774819b13847e9c9a1d7" exitCode=0
Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.152717 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" event={"ID":"9d40d175-2e68-4d03-acd3-0a8ba7943b57","Type":"ContainerDied","Data":"4e2db8a75bf7a19158aaf6bf57274ad674bd6eff80cb774819b13847e9c9a1d7"}
Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.167715 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ef21c1e-3352-4660-9cf4-3023ee75c18b","Type":"ContainerStarted","Data":"0d8d75a0e885f56bc57989fd308cedbc022268441dffe4fca78698790241b081"}
Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.175062 4973 generic.go:334] "Generic (PLEG): container finished" podID="a6ed744e-9125-4240-aff5-2e2c0dd1769f" containerID="6b796a20f6472dd350a1a16a6738986b104a8dea26b45f8e0d36ab8cb5de9bc6" exitCode=0
Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.175090 4973 generic.go:334] "Generic (PLEG): container finished" podID="a6ed744e-9125-4240-aff5-2e2c0dd1769f" containerID="adff2c9954126a5c98bfd2932478e0e16d5e53ed2e51141abc24a153e09c771b" exitCode=143
Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.176123 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a6ed744e-9125-4240-aff5-2e2c0dd1769f","Type":"ContainerDied","Data":"6b796a20f6472dd350a1a16a6738986b104a8dea26b45f8e0d36ab8cb5de9bc6"}
Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.176153 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a6ed744e-9125-4240-aff5-2e2c0dd1769f","Type":"ContainerDied","Data":"adff2c9954126a5c98bfd2932478e0e16d5e53ed2e51141abc24a153e09c771b"}
Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.230626 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d"
Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.246613 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.313589 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.344607 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-ovsdbserver-nb\") pod \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") "
Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.344863 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbzwh\" (UniqueName: \"kubernetes.io/projected/9d40d175-2e68-4d03-acd3-0a8ba7943b57-kube-api-access-tbzwh\") pod \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") "
Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.344899 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ed744e-9125-4240-aff5-2e2c0dd1769f-combined-ca-bundle\") pod \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\" (UID: \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\") "
Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.344996 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-config\") pod \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") "
Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.345024 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\"
(UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-ovsdbserver-sb\") pod \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.345078 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6ed744e-9125-4240-aff5-2e2c0dd1769f-logs\") pod \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\" (UID: \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\") " Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.345118 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-dns-swift-storage-0\") pod \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.345202 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68gb7\" (UniqueName: \"kubernetes.io/projected/a6ed744e-9125-4240-aff5-2e2c0dd1769f-kube-api-access-68gb7\") pod \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\" (UID: \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\") " Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.345267 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ed744e-9125-4240-aff5-2e2c0dd1769f-config-data\") pod \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\" (UID: \"a6ed744e-9125-4240-aff5-2e2c0dd1769f\") " Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.345854 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6ed744e-9125-4240-aff5-2e2c0dd1769f-logs" (OuterVolumeSpecName: "logs") pod "a6ed744e-9125-4240-aff5-2e2c0dd1769f" (UID: "a6ed744e-9125-4240-aff5-2e2c0dd1769f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.351964 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-dns-svc\") pod \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\" (UID: \"9d40d175-2e68-4d03-acd3-0a8ba7943b57\") " Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.356010 4973 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a6ed744e-9125-4240-aff5-2e2c0dd1769f-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.373068 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6ed744e-9125-4240-aff5-2e2c0dd1769f-kube-api-access-68gb7" (OuterVolumeSpecName: "kube-api-access-68gb7") pod "a6ed744e-9125-4240-aff5-2e2c0dd1769f" (UID: "a6ed744e-9125-4240-aff5-2e2c0dd1769f"). InnerVolumeSpecName "kube-api-access-68gb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.384034 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d40d175-2e68-4d03-acd3-0a8ba7943b57-kube-api-access-tbzwh" (OuterVolumeSpecName: "kube-api-access-tbzwh") pod "9d40d175-2e68-4d03-acd3-0a8ba7943b57" (UID: "9d40d175-2e68-4d03-acd3-0a8ba7943b57"). InnerVolumeSpecName "kube-api-access-tbzwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.450434 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ed744e-9125-4240-aff5-2e2c0dd1769f-config-data" (OuterVolumeSpecName: "config-data") pod "a6ed744e-9125-4240-aff5-2e2c0dd1769f" (UID: "a6ed744e-9125-4240-aff5-2e2c0dd1769f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.460014 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6ed744e-9125-4240-aff5-2e2c0dd1769f-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.460058 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbzwh\" (UniqueName: \"kubernetes.io/projected/9d40d175-2e68-4d03-acd3-0a8ba7943b57-kube-api-access-tbzwh\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.460074 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68gb7\" (UniqueName: \"kubernetes.io/projected/a6ed744e-9125-4240-aff5-2e2c0dd1769f-kube-api-access-68gb7\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.471145 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6ed744e-9125-4240-aff5-2e2c0dd1769f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6ed744e-9125-4240-aff5-2e2c0dd1769f" (UID: "a6ed744e-9125-4240-aff5-2e2c0dd1769f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.510437 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9d40d175-2e68-4d03-acd3-0a8ba7943b57" (UID: "9d40d175-2e68-4d03-acd3-0a8ba7943b57"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.516881 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9d40d175-2e68-4d03-acd3-0a8ba7943b57" (UID: "9d40d175-2e68-4d03-acd3-0a8ba7943b57"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.531878 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9d40d175-2e68-4d03-acd3-0a8ba7943b57" (UID: "9d40d175-2e68-4d03-acd3-0a8ba7943b57"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.545814 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-config" (OuterVolumeSpecName: "config") pod "9d40d175-2e68-4d03-acd3-0a8ba7943b57" (UID: "9d40d175-2e68-4d03-acd3-0a8ba7943b57"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.547917 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9d40d175-2e68-4d03-acd3-0a8ba7943b57" (UID: "9d40d175-2e68-4d03-acd3-0a8ba7943b57"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.562887 4973 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.562921 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.562931 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6ed744e-9125-4240-aff5-2e2c0dd1769f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.562941 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.562951 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:51 crc kubenswrapper[4973]: I0320 13:48:51.562963 4973 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9d40d175-2e68-4d03-acd3-0a8ba7943b57-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.188523 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ef21c1e-3352-4660-9cf4-3023ee75c18b","Type":"ContainerStarted","Data":"ea868f986eb64cd4bc4d61b0c3ca34f86972021dcaac45f414bfd372de5c1c01"} Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 
13:48:52.190644 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.190634 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a6ed744e-9125-4240-aff5-2e2c0dd1769f","Type":"ContainerDied","Data":"2994ce24d9f73110933fb68540a2be8c45b0e31c5df6489adaff88239551b539"} Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.190767 4973 scope.go:117] "RemoveContainer" containerID="6b796a20f6472dd350a1a16a6738986b104a8dea26b45f8e0d36ab8cb5de9bc6" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.193661 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.194099 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-f6bc4c6c9-9v29d" event={"ID":"9d40d175-2e68-4d03-acd3-0a8ba7943b57","Type":"ContainerDied","Data":"1cd528f0ac2bfa2a82407e94d05d4f5961566ee9862931104daf4b2be20831b8"} Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.219766 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.236142 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.252150 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:48:52 crc kubenswrapper[4973]: E0320 13:48:52.252667 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d40d175-2e68-4d03-acd3-0a8ba7943b57" containerName="dnsmasq-dns" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.252685 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d40d175-2e68-4d03-acd3-0a8ba7943b57" containerName="dnsmasq-dns" Mar 20 13:48:52 crc kubenswrapper[4973]: E0320 
13:48:52.252700 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ed744e-9125-4240-aff5-2e2c0dd1769f" containerName="nova-metadata-metadata" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.252707 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ed744e-9125-4240-aff5-2e2c0dd1769f" containerName="nova-metadata-metadata" Mar 20 13:48:52 crc kubenswrapper[4973]: E0320 13:48:52.252718 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d40d175-2e68-4d03-acd3-0a8ba7943b57" containerName="init" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.252724 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d40d175-2e68-4d03-acd3-0a8ba7943b57" containerName="init" Mar 20 13:48:52 crc kubenswrapper[4973]: E0320 13:48:52.252757 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6ed744e-9125-4240-aff5-2e2c0dd1769f" containerName="nova-metadata-log" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.252763 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6ed744e-9125-4240-aff5-2e2c0dd1769f" containerName="nova-metadata-log" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.252973 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ed744e-9125-4240-aff5-2e2c0dd1769f" containerName="nova-metadata-metadata" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.252998 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d40d175-2e68-4d03-acd3-0a8ba7943b57" containerName="dnsmasq-dns" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.253005 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6ed744e-9125-4240-aff5-2e2c0dd1769f" containerName="nova-metadata-log" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.254764 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.259704 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.259950 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.263976 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-9v29d"] Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.278582 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-f6bc4c6c9-9v29d"] Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.295397 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.384841 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc274a4-3873-4ddb-93d6-ba446c832ac7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") " pod="openstack/nova-metadata-0" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.384974 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqlk9\" (UniqueName: \"kubernetes.io/projected/bcc274a4-3873-4ddb-93d6-ba446c832ac7-kube-api-access-wqlk9\") pod \"nova-metadata-0\" (UID: \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") " pod="openstack/nova-metadata-0" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.385019 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc274a4-3873-4ddb-93d6-ba446c832ac7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") " pod="openstack/nova-metadata-0" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.385047 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcc274a4-3873-4ddb-93d6-ba446c832ac7-logs\") pod \"nova-metadata-0\" (UID: \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") " pod="openstack/nova-metadata-0" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.385194 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc274a4-3873-4ddb-93d6-ba446c832ac7-config-data\") pod \"nova-metadata-0\" (UID: \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") " pod="openstack/nova-metadata-0" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.488082 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc274a4-3873-4ddb-93d6-ba446c832ac7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") " pod="openstack/nova-metadata-0" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.488246 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqlk9\" (UniqueName: \"kubernetes.io/projected/bcc274a4-3873-4ddb-93d6-ba446c832ac7-kube-api-access-wqlk9\") pod \"nova-metadata-0\" (UID: \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") " pod="openstack/nova-metadata-0" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.488293 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc274a4-3873-4ddb-93d6-ba446c832ac7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") " pod="openstack/nova-metadata-0" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.488330 
4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcc274a4-3873-4ddb-93d6-ba446c832ac7-logs\") pod \"nova-metadata-0\" (UID: \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") " pod="openstack/nova-metadata-0" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.488494 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc274a4-3873-4ddb-93d6-ba446c832ac7-config-data\") pod \"nova-metadata-0\" (UID: \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") " pod="openstack/nova-metadata-0" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.488892 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcc274a4-3873-4ddb-93d6-ba446c832ac7-logs\") pod \"nova-metadata-0\" (UID: \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") " pod="openstack/nova-metadata-0" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.493501 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc274a4-3873-4ddb-93d6-ba446c832ac7-config-data\") pod \"nova-metadata-0\" (UID: \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") " pod="openstack/nova-metadata-0" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.496927 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc274a4-3873-4ddb-93d6-ba446c832ac7-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") " pod="openstack/nova-metadata-0" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.497979 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc274a4-3873-4ddb-93d6-ba446c832ac7-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: 
\"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") " pod="openstack/nova-metadata-0" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.545537 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqlk9\" (UniqueName: \"kubernetes.io/projected/bcc274a4-3873-4ddb-93d6-ba446c832ac7-kube-api-access-wqlk9\") pod \"nova-metadata-0\" (UID: \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") " pod="openstack/nova-metadata-0" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.578059 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:48:52 crc kubenswrapper[4973]: I0320 13:48:52.893481 4973 scope.go:117] "RemoveContainer" containerID="adff2c9954126a5c98bfd2932478e0e16d5e53ed2e51141abc24a153e09c771b" Mar 20 13:48:53 crc kubenswrapper[4973]: I0320 13:48:53.108988 4973 scope.go:117] "RemoveContainer" containerID="4e2db8a75bf7a19158aaf6bf57274ad674bd6eff80cb774819b13847e9c9a1d7" Mar 20 13:48:53 crc kubenswrapper[4973]: I0320 13:48:53.265678 4973 scope.go:117] "RemoveContainer" containerID="328372d663aefc57d3d8b09ece7693e92b2f5dee324ad8ca62bb8f4b6fb4bb45" Mar 20 13:48:53 crc kubenswrapper[4973]: I0320 13:48:53.815724 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:48:53 crc kubenswrapper[4973]: I0320 13:48:53.966826 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d40d175-2e68-4d03-acd3-0a8ba7943b57" path="/var/lib/kubelet/pods/9d40d175-2e68-4d03-acd3-0a8ba7943b57/volumes" Mar 20 13:48:53 crc kubenswrapper[4973]: I0320 13:48:53.967954 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6ed744e-9125-4240-aff5-2e2c0dd1769f" path="/var/lib/kubelet/pods/a6ed744e-9125-4240-aff5-2e2c0dd1769f/volumes" Mar 20 13:48:54 crc kubenswrapper[4973]: I0320 13:48:54.220215 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4ef21c1e-3352-4660-9cf4-3023ee75c18b","Type":"ContainerStarted","Data":"bd2139f149fc4d242626a4c708a17e3fbaf6978f8d7015f419012e3037743f2e"} Mar 20 13:48:54 crc kubenswrapper[4973]: I0320 13:48:54.223637 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bcc274a4-3873-4ddb-93d6-ba446c832ac7","Type":"ContainerStarted","Data":"bea2c65ddd9ba880384f5afc98ef0a308cae57b4aa117955d1f6460b09f1d09c"} Mar 20 13:48:54 crc kubenswrapper[4973]: I0320 13:48:54.223664 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bcc274a4-3873-4ddb-93d6-ba446c832ac7","Type":"ContainerStarted","Data":"0942c8520f775c78177573f3e9e4ca13f1d5fa989d780ab62385dbf4089e4beb"} Mar 20 13:48:54 crc kubenswrapper[4973]: I0320 13:48:54.225994 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46","Type":"ContainerStarted","Data":"db5a6de6757dc77775f70481978bf1c57bd29b38c754670dbd9fdd66b2dff901"} Mar 20 13:48:54 crc kubenswrapper[4973]: I0320 13:48:54.226181 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerName="aodh-api" containerID="cri-o://b98f68d8ba06a959ce32737a7dec8b2c4bf1c9941812fc35ceae360bae8902e7" gracePeriod=30 Mar 20 13:48:54 crc kubenswrapper[4973]: I0320 13:48:54.226910 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerName="aodh-listener" containerID="cri-o://db5a6de6757dc77775f70481978bf1c57bd29b38c754670dbd9fdd66b2dff901" gracePeriod=30 Mar 20 13:48:54 crc kubenswrapper[4973]: I0320 13:48:54.226979 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerName="aodh-notifier" 
containerID="cri-o://e18d7336844c3f073b75c20896cdc0d5a8453949f430ae1546a7eb68910ebe8f" gracePeriod=30 Mar 20 13:48:54 crc kubenswrapper[4973]: I0320 13:48:54.227027 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerName="aodh-evaluator" containerID="cri-o://db5af94fafe96b7e1e64b783aa04c6c31da69cb7d2ba7f14e326d73f7848dbf0" gracePeriod=30 Mar 20 13:48:54 crc kubenswrapper[4973]: I0320 13:48:54.300919 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.179726934 podStartE2EDuration="18.300890768s" podCreationTimestamp="2026-03-20 13:48:36 +0000 UTC" firstStartedPulling="2026-03-20 13:48:37.144605172 +0000 UTC m=+1637.888274916" lastFinishedPulling="2026-03-20 13:48:53.265769006 +0000 UTC m=+1654.009438750" observedRunningTime="2026-03-20 13:48:54.244137508 +0000 UTC m=+1654.987807262" watchObservedRunningTime="2026-03-20 13:48:54.300890768 +0000 UTC m=+1655.044560512" Mar 20 13:48:55 crc kubenswrapper[4973]: I0320 13:48:55.248377 4973 generic.go:334] "Generic (PLEG): container finished" podID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerID="db5af94fafe96b7e1e64b783aa04c6c31da69cb7d2ba7f14e326d73f7848dbf0" exitCode=0 Mar 20 13:48:55 crc kubenswrapper[4973]: I0320 13:48:55.248831 4973 generic.go:334] "Generic (PLEG): container finished" podID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerID="b98f68d8ba06a959ce32737a7dec8b2c4bf1c9941812fc35ceae360bae8902e7" exitCode=0 Mar 20 13:48:55 crc kubenswrapper[4973]: I0320 13:48:55.248893 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46","Type":"ContainerDied","Data":"db5af94fafe96b7e1e64b783aa04c6c31da69cb7d2ba7f14e326d73f7848dbf0"} Mar 20 13:48:55 crc kubenswrapper[4973]: I0320 13:48:55.248926 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46","Type":"ContainerDied","Data":"b98f68d8ba06a959ce32737a7dec8b2c4bf1c9941812fc35ceae360bae8902e7"}
Mar 20 13:48:55 crc kubenswrapper[4973]: I0320 13:48:55.251216 4973 generic.go:334] "Generic (PLEG): container finished" podID="6b9f729e-dda0-4ad0-a8fc-3f0365b27947" containerID="8d7d87ff2805313825bce4cad26271bdd381c5ca0531450eebf153c6a88d38d9" exitCode=0
Mar 20 13:48:55 crc kubenswrapper[4973]: I0320 13:48:55.251278 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xm9xj" event={"ID":"6b9f729e-dda0-4ad0-a8fc-3f0365b27947","Type":"ContainerDied","Data":"8d7d87ff2805313825bce4cad26271bdd381c5ca0531450eebf153c6a88d38d9"}
Mar 20 13:48:55 crc kubenswrapper[4973]: I0320 13:48:55.254044 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bcc274a4-3873-4ddb-93d6-ba446c832ac7","Type":"ContainerStarted","Data":"7fcd47a9027e86edd8bd467497a27a381d84184ff126cee648929e13d0fd3779"}
Mar 20 13:48:55 crc kubenswrapper[4973]: I0320 13:48:55.300301 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.300281624 podStartE2EDuration="3.300281624s" podCreationTimestamp="2026-03-20 13:48:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:48:55.293718064 +0000 UTC m=+1656.037387818" watchObservedRunningTime="2026-03-20 13:48:55.300281624 +0000 UTC m=+1656.043951368"
Mar 20 13:48:55 crc kubenswrapper[4973]: E0320 13:48:55.454001 4973 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6cf232ba_9dbc_4c57_bbb0_5e3a5d8a0d46.slice/crio-e18d7336844c3f073b75c20896cdc0d5a8453949f430ae1546a7eb68910ebe8f.scope\": RecentStats: unable to find data in memory cache]"
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.266728 4973 generic.go:334] "Generic (PLEG): container finished" podID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerID="e18d7336844c3f073b75c20896cdc0d5a8453949f430ae1546a7eb68910ebe8f" exitCode=0
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.266820 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46","Type":"ContainerDied","Data":"e18d7336844c3f073b75c20896cdc0d5a8453949f430ae1546a7eb68910ebe8f"}
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.270191 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ef21c1e-3352-4660-9cf4-3023ee75c18b","Type":"ContainerStarted","Data":"79076d797fbb559fdbe88cf879a6e3eddb7f5447e7ca3906227127a8a18c5195"}
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.270324 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerName="ceilometer-central-agent" containerID="cri-o://0d8d75a0e885f56bc57989fd308cedbc022268441dffe4fca78698790241b081" gracePeriod=30
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.270363 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.270409 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerName="proxy-httpd" containerID="cri-o://79076d797fbb559fdbe88cf879a6e3eddb7f5447e7ca3906227127a8a18c5195" gracePeriod=30
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.270424 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerName="sg-core" containerID="cri-o://bd2139f149fc4d242626a4c708a17e3fbaf6978f8d7015f419012e3037743f2e" gracePeriod=30
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.270460 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerName="ceilometer-notification-agent" containerID="cri-o://ea868f986eb64cd4bc4d61b0c3ca34f86972021dcaac45f414bfd372de5c1c01" gracePeriod=30
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.272620 4973 generic.go:334] "Generic (PLEG): container finished" podID="90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65" containerID="9421d4a8f747a4ff43be42f3ae0f3390f9170da9651133851f6efb1a308c24e5" exitCode=0
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.272781 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wwf69" event={"ID":"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65","Type":"ContainerDied","Data":"9421d4a8f747a4ff43be42f3ae0f3390f9170da9651133851f6efb1a308c24e5"}
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.306159 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=4.973391862 podStartE2EDuration="11.306137227s" podCreationTimestamp="2026-03-20 13:48:45 +0000 UTC" firstStartedPulling="2026-03-20 13:48:49.21543078 +0000 UTC m=+1649.959100524" lastFinishedPulling="2026-03-20 13:48:55.548176145 +0000 UTC m=+1656.291845889" observedRunningTime="2026-03-20 13:48:56.291516327 +0000 UTC m=+1657.035186081" watchObservedRunningTime="2026-03-20 13:48:56.306137227 +0000 UTC m=+1657.049806971"
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.873017 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xm9xj"
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.935345 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-config-data\") pod \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\" (UID: \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\") "
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.935511 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drb4t\" (UniqueName: \"kubernetes.io/projected/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-kube-api-access-drb4t\") pod \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\" (UID: \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\") "
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.935563 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-combined-ca-bundle\") pod \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\" (UID: \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\") "
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.935600 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-scripts\") pod \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\" (UID: \"6b9f729e-dda0-4ad0-a8fc-3f0365b27947\") "
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.964646 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-kube-api-access-drb4t" (OuterVolumeSpecName: "kube-api-access-drb4t") pod "6b9f729e-dda0-4ad0-a8fc-3f0365b27947" (UID: "6b9f729e-dda0-4ad0-a8fc-3f0365b27947"). InnerVolumeSpecName "kube-api-access-drb4t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.967198 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e"
Mar 20 13:48:56 crc kubenswrapper[4973]: E0320 13:48:56.968246 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.982620 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-scripts" (OuterVolumeSpecName: "scripts") pod "6b9f729e-dda0-4ad0-a8fc-3f0365b27947" (UID: "6b9f729e-dda0-4ad0-a8fc-3f0365b27947"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:56 crc kubenswrapper[4973]: I0320 13:48:56.992659 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-config-data" (OuterVolumeSpecName: "config-data") pod "6b9f729e-dda0-4ad0-a8fc-3f0365b27947" (UID: "6b9f729e-dda0-4ad0-a8fc-3f0365b27947"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.020714 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b9f729e-dda0-4ad0-a8fc-3f0365b27947" (UID: "6b9f729e-dda0-4ad0-a8fc-3f0365b27947"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.039472 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.039663 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drb4t\" (UniqueName: \"kubernetes.io/projected/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-kube-api-access-drb4t\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.039746 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.039820 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b9f729e-dda0-4ad0-a8fc-3f0365b27947-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.289306 4973 generic.go:334] "Generic (PLEG): container finished" podID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerID="79076d797fbb559fdbe88cf879a6e3eddb7f5447e7ca3906227127a8a18c5195" exitCode=0
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.289362 4973 generic.go:334] "Generic (PLEG): container finished" podID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerID="bd2139f149fc4d242626a4c708a17e3fbaf6978f8d7015f419012e3037743f2e" exitCode=2
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.289372 4973 generic.go:334] "Generic (PLEG): container finished" podID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerID="ea868f986eb64cd4bc4d61b0c3ca34f86972021dcaac45f414bfd372de5c1c01" exitCode=0
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.289476 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ef21c1e-3352-4660-9cf4-3023ee75c18b","Type":"ContainerDied","Data":"79076d797fbb559fdbe88cf879a6e3eddb7f5447e7ca3906227127a8a18c5195"}
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.289510 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ef21c1e-3352-4660-9cf4-3023ee75c18b","Type":"ContainerDied","Data":"bd2139f149fc4d242626a4c708a17e3fbaf6978f8d7015f419012e3037743f2e"}
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.289523 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ef21c1e-3352-4660-9cf4-3023ee75c18b","Type":"ContainerDied","Data":"ea868f986eb64cd4bc4d61b0c3ca34f86972021dcaac45f414bfd372de5c1c01"}
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.292901 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-xm9xj" event={"ID":"6b9f729e-dda0-4ad0-a8fc-3f0365b27947","Type":"ContainerDied","Data":"ed424703e334f40597296f94c6e68f1ed675a6909d4a88c40b91ae268ede90ec"}
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.293083 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed424703e334f40597296f94c6e68f1ed675a6909d4a88c40b91ae268ede90ec"
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.292948 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-xm9xj"
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.460453 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.462232 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="67aae42c-b00a-4295-873e-f9b56094726a" containerName="nova-api-log" containerID="cri-o://512fe11d9d468098f06d950031447e5ea75540f2b060e7fc602868ffcfa2f046" gracePeriod=30
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.462290 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="67aae42c-b00a-4295-873e-f9b56094726a" containerName="nova-api-api" containerID="cri-o://50969a1aef77ee9053f5b3e4319958c446e8d86d04165333aab817d4a0260dc5" gracePeriod=30
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.486677 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.486893 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="69df30f8-12b3-40e8-b880-8344d2c737b3" containerName="nova-scheduler-scheduler" containerID="cri-o://9c298c8308f05f7ac0b5adb31e4a36d978d1166a503a0d6c82e10688439fac93" gracePeriod=30
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.558066 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.558462 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bcc274a4-3873-4ddb-93d6-ba446c832ac7" containerName="nova-metadata-log" containerID="cri-o://bea2c65ddd9ba880384f5afc98ef0a308cae57b4aa117955d1f6460b09f1d09c" gracePeriod=30
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.558940 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="bcc274a4-3873-4ddb-93d6-ba446c832ac7" containerName="nova-metadata-metadata" containerID="cri-o://7fcd47a9027e86edd8bd467497a27a381d84184ff126cee648929e13d0fd3779" gracePeriod=30
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.768838 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.769175 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 13:48:57 crc kubenswrapper[4973]: I0320 13:48:57.956876 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wwf69"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.065852 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-config-data\") pod \"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\" (UID: \"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\") "
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.066023 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmcxj\" (UniqueName: \"kubernetes.io/projected/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-kube-api-access-lmcxj\") pod \"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\" (UID: \"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\") "
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.066127 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-scripts\") pod \"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\" (UID: \"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\") "
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.066198 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-combined-ca-bundle\") pod \"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\" (UID: \"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65\") "
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.088672 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-scripts" (OuterVolumeSpecName: "scripts") pod "90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65" (UID: "90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.089083 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-kube-api-access-lmcxj" (OuterVolumeSpecName: "kube-api-access-lmcxj") pod "90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65" (UID: "90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65"). InnerVolumeSpecName "kube-api-access-lmcxj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.113590 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-config-data" (OuterVolumeSpecName: "config-data") pod "90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65" (UID: "90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.122588 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65" (UID: "90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.169176 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmcxj\" (UniqueName: \"kubernetes.io/projected/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-kube-api-access-lmcxj\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.169215 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.169228 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.169239 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.306670 4973 generic.go:334] "Generic (PLEG): container finished" podID="67aae42c-b00a-4295-873e-f9b56094726a" containerID="50969a1aef77ee9053f5b3e4319958c446e8d86d04165333aab817d4a0260dc5" exitCode=0
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.306702 4973 generic.go:334] "Generic (PLEG): container finished" podID="67aae42c-b00a-4295-873e-f9b56094726a" containerID="512fe11d9d468098f06d950031447e5ea75540f2b060e7fc602868ffcfa2f046" exitCode=143
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.306754 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67aae42c-b00a-4295-873e-f9b56094726a","Type":"ContainerDied","Data":"50969a1aef77ee9053f5b3e4319958c446e8d86d04165333aab817d4a0260dc5"}
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.306781 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67aae42c-b00a-4295-873e-f9b56094726a","Type":"ContainerDied","Data":"512fe11d9d468098f06d950031447e5ea75540f2b060e7fc602868ffcfa2f046"}
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.308361 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-wwf69" event={"ID":"90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65","Type":"ContainerDied","Data":"c6f7605e5910a27b303b78d055f9093e76e43dc6fcdcd388dbc163ec83c8ae3d"}
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.308397 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6f7605e5910a27b303b78d055f9093e76e43dc6fcdcd388dbc163ec83c8ae3d"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.308978 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-wwf69"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.315530 4973 generic.go:334] "Generic (PLEG): container finished" podID="bcc274a4-3873-4ddb-93d6-ba446c832ac7" containerID="7fcd47a9027e86edd8bd467497a27a381d84184ff126cee648929e13d0fd3779" exitCode=0
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.315571 4973 generic.go:334] "Generic (PLEG): container finished" podID="bcc274a4-3873-4ddb-93d6-ba446c832ac7" containerID="bea2c65ddd9ba880384f5afc98ef0a308cae57b4aa117955d1f6460b09f1d09c" exitCode=143
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.315599 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bcc274a4-3873-4ddb-93d6-ba446c832ac7","Type":"ContainerDied","Data":"7fcd47a9027e86edd8bd467497a27a381d84184ff126cee648929e13d0fd3779"}
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.315632 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bcc274a4-3873-4ddb-93d6-ba446c832ac7","Type":"ContainerDied","Data":"bea2c65ddd9ba880384f5afc98ef0a308cae57b4aa117955d1f6460b09f1d09c"}
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.410203 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.412100 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 20 13:48:58 crc kubenswrapper[4973]: E0320 13:48:58.412702 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65" containerName="nova-cell1-conductor-db-sync"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.412722 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65" containerName="nova-cell1-conductor-db-sync"
Mar 20 13:48:58 crc kubenswrapper[4973]: E0320 13:48:58.412760 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc274a4-3873-4ddb-93d6-ba446c832ac7" containerName="nova-metadata-metadata"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.412769 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc274a4-3873-4ddb-93d6-ba446c832ac7" containerName="nova-metadata-metadata"
Mar 20 13:48:58 crc kubenswrapper[4973]: E0320 13:48:58.412787 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc274a4-3873-4ddb-93d6-ba446c832ac7" containerName="nova-metadata-log"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.412792 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc274a4-3873-4ddb-93d6-ba446c832ac7" containerName="nova-metadata-log"
Mar 20 13:48:58 crc kubenswrapper[4973]: E0320 13:48:58.412817 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9f729e-dda0-4ad0-a8fc-3f0365b27947" containerName="nova-manage"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.412824 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9f729e-dda0-4ad0-a8fc-3f0365b27947" containerName="nova-manage"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.414912 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b9f729e-dda0-4ad0-a8fc-3f0365b27947" containerName="nova-manage"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.414952 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65" containerName="nova-cell1-conductor-db-sync"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.414969 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc274a4-3873-4ddb-93d6-ba446c832ac7" containerName="nova-metadata-metadata"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.414981 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc274a4-3873-4ddb-93d6-ba446c832ac7" containerName="nova-metadata-log"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.416432 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.432740 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.432845 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.438008 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.477468 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67aae42c-b00a-4295-873e-f9b56094726a-logs\") pod \"67aae42c-b00a-4295-873e-f9b56094726a\" (UID: \"67aae42c-b00a-4295-873e-f9b56094726a\") "
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.477519 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcc274a4-3873-4ddb-93d6-ba446c832ac7-logs\") pod \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\" (UID: \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") "
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.477602 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc274a4-3873-4ddb-93d6-ba446c832ac7-nova-metadata-tls-certs\") pod \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\" (UID: \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") "
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.477685 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wpgz\" (UniqueName: \"kubernetes.io/projected/67aae42c-b00a-4295-873e-f9b56094726a-kube-api-access-8wpgz\") pod \"67aae42c-b00a-4295-873e-f9b56094726a\" (UID: \"67aae42c-b00a-4295-873e-f9b56094726a\") "
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.477716 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc274a4-3873-4ddb-93d6-ba446c832ac7-config-data\") pod \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\" (UID: \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") "
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.477761 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67aae42c-b00a-4295-873e-f9b56094726a-config-data\") pod \"67aae42c-b00a-4295-873e-f9b56094726a\" (UID: \"67aae42c-b00a-4295-873e-f9b56094726a\") "
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.477810 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67aae42c-b00a-4295-873e-f9b56094726a-combined-ca-bundle\") pod \"67aae42c-b00a-4295-873e-f9b56094726a\" (UID: \"67aae42c-b00a-4295-873e-f9b56094726a\") "
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.477864 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/67aae42c-b00a-4295-873e-f9b56094726a-logs" (OuterVolumeSpecName: "logs") pod "67aae42c-b00a-4295-873e-f9b56094726a" (UID: "67aae42c-b00a-4295-873e-f9b56094726a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.477925 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc274a4-3873-4ddb-93d6-ba446c832ac7-combined-ca-bundle\") pod \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\" (UID: \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") "
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.477957 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqlk9\" (UniqueName: \"kubernetes.io/projected/bcc274a4-3873-4ddb-93d6-ba446c832ac7-kube-api-access-wqlk9\") pod \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\" (UID: \"bcc274a4-3873-4ddb-93d6-ba446c832ac7\") "
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.478393 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcc274a4-3873-4ddb-93d6-ba446c832ac7-logs" (OuterVolumeSpecName: "logs") pod "bcc274a4-3873-4ddb-93d6-ba446c832ac7" (UID: "bcc274a4-3873-4ddb-93d6-ba446c832ac7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.478964 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a61963-2c9b-403b-8fed-f4072b979eb8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b0a61963-2c9b-403b-8fed-f4072b979eb8\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.479121 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a61963-2c9b-403b-8fed-f4072b979eb8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b0a61963-2c9b-403b-8fed-f4072b979eb8\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.479441 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm87q\" (UniqueName: \"kubernetes.io/projected/b0a61963-2c9b-403b-8fed-f4072b979eb8-kube-api-access-zm87q\") pod \"nova-cell1-conductor-0\" (UID: \"b0a61963-2c9b-403b-8fed-f4072b979eb8\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.479565 4973 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/67aae42c-b00a-4295-873e-f9b56094726a-logs\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.479576 4973 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcc274a4-3873-4ddb-93d6-ba446c832ac7-logs\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.483550 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67aae42c-b00a-4295-873e-f9b56094726a-kube-api-access-8wpgz" (OuterVolumeSpecName: "kube-api-access-8wpgz") pod "67aae42c-b00a-4295-873e-f9b56094726a" (UID: "67aae42c-b00a-4295-873e-f9b56094726a"). InnerVolumeSpecName "kube-api-access-8wpgz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.490695 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc274a4-3873-4ddb-93d6-ba446c832ac7-kube-api-access-wqlk9" (OuterVolumeSpecName: "kube-api-access-wqlk9") pod "bcc274a4-3873-4ddb-93d6-ba446c832ac7" (UID: "bcc274a4-3873-4ddb-93d6-ba446c832ac7"). InnerVolumeSpecName "kube-api-access-wqlk9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.518812 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67aae42c-b00a-4295-873e-f9b56094726a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "67aae42c-b00a-4295-873e-f9b56094726a" (UID: "67aae42c-b00a-4295-873e-f9b56094726a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.520510 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc274a4-3873-4ddb-93d6-ba446c832ac7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcc274a4-3873-4ddb-93d6-ba446c832ac7" (UID: "bcc274a4-3873-4ddb-93d6-ba446c832ac7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.538785 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/67aae42c-b00a-4295-873e-f9b56094726a-config-data" (OuterVolumeSpecName: "config-data") pod "67aae42c-b00a-4295-873e-f9b56094726a" (UID: "67aae42c-b00a-4295-873e-f9b56094726a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.548491 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc274a4-3873-4ddb-93d6-ba446c832ac7-config-data" (OuterVolumeSpecName: "config-data") pod "bcc274a4-3873-4ddb-93d6-ba446c832ac7" (UID: "bcc274a4-3873-4ddb-93d6-ba446c832ac7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.567047 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc274a4-3873-4ddb-93d6-ba446c832ac7-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "bcc274a4-3873-4ddb-93d6-ba446c832ac7" (UID: "bcc274a4-3873-4ddb-93d6-ba446c832ac7"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.581917 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm87q\" (UniqueName: \"kubernetes.io/projected/b0a61963-2c9b-403b-8fed-f4072b979eb8-kube-api-access-zm87q\") pod \"nova-cell1-conductor-0\" (UID: \"b0a61963-2c9b-403b-8fed-f4072b979eb8\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.582122 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a61963-2c9b-403b-8fed-f4072b979eb8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b0a61963-2c9b-403b-8fed-f4072b979eb8\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.582284 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a61963-2c9b-403b-8fed-f4072b979eb8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b0a61963-2c9b-403b-8fed-f4072b979eb8\") " pod="openstack/nova-cell1-conductor-0"
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.582465 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/67aae42c-b00a-4295-873e-f9b56094726a-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.582483 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67aae42c-b00a-4295-873e-f9b56094726a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.582516 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcc274a4-3873-4ddb-93d6-ba446c832ac7-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.582528 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqlk9\" (UniqueName: \"kubernetes.io/projected/bcc274a4-3873-4ddb-93d6-ba446c832ac7-kube-api-access-wqlk9\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.582536 4973 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/bcc274a4-3873-4ddb-93d6-ba446c832ac7-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.582544 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wpgz\" (UniqueName: \"kubernetes.io/projected/67aae42c-b00a-4295-873e-f9b56094726a-kube-api-access-8wpgz\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.582555 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcc274a4-3873-4ddb-93d6-ba446c832ac7-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.588500 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0a61963-2c9b-403b-8fed-f4072b979eb8-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"b0a61963-2c9b-403b-8fed-f4072b979eb8\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.588537 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0a61963-2c9b-403b-8fed-f4072b979eb8-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"b0a61963-2c9b-403b-8fed-f4072b979eb8\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.606825 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm87q\" (UniqueName: \"kubernetes.io/projected/b0a61963-2c9b-403b-8fed-f4072b979eb8-kube-api-access-zm87q\") pod \"nova-cell1-conductor-0\" (UID: \"b0a61963-2c9b-403b-8fed-f4072b979eb8\") " pod="openstack/nova-cell1-conductor-0" Mar 20 13:48:58 crc kubenswrapper[4973]: I0320 13:48:58.757217 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.337292 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.386890 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"bcc274a4-3873-4ddb-93d6-ba446c832ac7","Type":"ContainerDied","Data":"0942c8520f775c78177573f3e9e4ca13f1d5fa989d780ab62385dbf4089e4beb"} Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.386948 4973 scope.go:117] "RemoveContainer" containerID="7fcd47a9027e86edd8bd467497a27a381d84184ff126cee648929e13d0fd3779" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.387108 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.406785 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"67aae42c-b00a-4295-873e-f9b56094726a","Type":"ContainerDied","Data":"87a9145af23f9562f852ebd49145e65346ec658d5f711eaf1511aadd26da7e9a"} Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.406923 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.409803 4973 generic.go:334] "Generic (PLEG): container finished" podID="69df30f8-12b3-40e8-b880-8344d2c737b3" containerID="9c298c8308f05f7ac0b5adb31e4a36d978d1166a503a0d6c82e10688439fac93" exitCode=0 Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.409878 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"69df30f8-12b3-40e8-b880-8344d2c737b3","Type":"ContainerDied","Data":"9c298c8308f05f7ac0b5adb31e4a36d978d1166a503a0d6c82e10688439fac93"} Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.415415 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b0a61963-2c9b-403b-8fed-f4072b979eb8","Type":"ContainerStarted","Data":"1496bbf25586df4f5e537498a5b2992b27aba42c95447d1dff10947ab450d6ab"} Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.462182 4973 scope.go:117] "RemoveContainer" containerID="bea2c65ddd9ba880384f5afc98ef0a308cae57b4aa117955d1f6460b09f1d09c" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.494050 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.507619 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.523329 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.553948 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.571672 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.596314 4973 scope.go:117] "RemoveContainer" containerID="50969a1aef77ee9053f5b3e4319958c446e8d86d04165333aab817d4a0260dc5" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.619369 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:48:59 crc kubenswrapper[4973]: E0320 13:48:59.619877 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67aae42c-b00a-4295-873e-f9b56094726a" containerName="nova-api-api" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.619897 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="67aae42c-b00a-4295-873e-f9b56094726a" containerName="nova-api-api" Mar 20 13:48:59 crc kubenswrapper[4973]: E0320 13:48:59.619908 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67aae42c-b00a-4295-873e-f9b56094726a" containerName="nova-api-log" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.619915 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="67aae42c-b00a-4295-873e-f9b56094726a" containerName="nova-api-log" Mar 20 13:48:59 crc kubenswrapper[4973]: E0320 13:48:59.619953 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69df30f8-12b3-40e8-b880-8344d2c737b3" containerName="nova-scheduler-scheduler" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.619961 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="69df30f8-12b3-40e8-b880-8344d2c737b3" containerName="nova-scheduler-scheduler" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.620178 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="69df30f8-12b3-40e8-b880-8344d2c737b3" containerName="nova-scheduler-scheduler" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 
13:48:59.620214 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="67aae42c-b00a-4295-873e-f9b56094726a" containerName="nova-api-api" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.620225 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="67aae42c-b00a-4295-873e-f9b56094726a" containerName="nova-api-log" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.621866 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.625204 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.625266 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.635832 4973 scope.go:117] "RemoveContainer" containerID="512fe11d9d468098f06d950031447e5ea75540f2b060e7fc602868ffcfa2f046" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.658494 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.666568 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.669147 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.686401 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.696658 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.718649 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx6lf\" (UniqueName: \"kubernetes.io/projected/69df30f8-12b3-40e8-b880-8344d2c737b3-kube-api-access-dx6lf\") pod \"69df30f8-12b3-40e8-b880-8344d2c737b3\" (UID: \"69df30f8-12b3-40e8-b880-8344d2c737b3\") " Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.718819 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69df30f8-12b3-40e8-b880-8344d2c737b3-combined-ca-bundle\") pod \"69df30f8-12b3-40e8-b880-8344d2c737b3\" (UID: \"69df30f8-12b3-40e8-b880-8344d2c737b3\") " Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.719043 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69df30f8-12b3-40e8-b880-8344d2c737b3-config-data\") pod \"69df30f8-12b3-40e8-b880-8344d2c737b3\" (UID: \"69df30f8-12b3-40e8-b880-8344d2c737b3\") " Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.719407 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-config-data\") pod \"nova-metadata-0\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " pod="openstack/nova-metadata-0" Mar 20 13:48:59 crc kubenswrapper[4973]: 
I0320 13:48:59.719437 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " pod="openstack/nova-metadata-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.719500 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-logs\") pod \"nova-metadata-0\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " pod="openstack/nova-metadata-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.719650 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj7hc\" (UniqueName: \"kubernetes.io/projected/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-kube-api-access-rj7hc\") pod \"nova-metadata-0\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " pod="openstack/nova-metadata-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.719848 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " pod="openstack/nova-metadata-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.722860 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69df30f8-12b3-40e8-b880-8344d2c737b3-kube-api-access-dx6lf" (OuterVolumeSpecName: "kube-api-access-dx6lf") pod "69df30f8-12b3-40e8-b880-8344d2c737b3" (UID: "69df30f8-12b3-40e8-b880-8344d2c737b3"). InnerVolumeSpecName "kube-api-access-dx6lf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.761416 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69df30f8-12b3-40e8-b880-8344d2c737b3-config-data" (OuterVolumeSpecName: "config-data") pod "69df30f8-12b3-40e8-b880-8344d2c737b3" (UID: "69df30f8-12b3-40e8-b880-8344d2c737b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.765265 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69df30f8-12b3-40e8-b880-8344d2c737b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "69df30f8-12b3-40e8-b880-8344d2c737b3" (UID: "69df30f8-12b3-40e8-b880-8344d2c737b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.822140 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-logs\") pod \"nova-api-0\" (UID: \"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\") " pod="openstack/nova-api-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.822259 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj7hc\" (UniqueName: \"kubernetes.io/projected/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-kube-api-access-rj7hc\") pod \"nova-metadata-0\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " pod="openstack/nova-metadata-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.822326 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\") " pod="openstack/nova-api-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.822442 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " pod="openstack/nova-metadata-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.822520 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-config-data\") pod \"nova-api-0\" (UID: \"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\") " pod="openstack/nova-api-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.822621 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-config-data\") pod \"nova-metadata-0\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " pod="openstack/nova-metadata-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.822646 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " pod="openstack/nova-metadata-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.822710 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-logs\") pod \"nova-metadata-0\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " pod="openstack/nova-metadata-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.822806 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9nbl\" (UniqueName: \"kubernetes.io/projected/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-kube-api-access-d9nbl\") pod \"nova-api-0\" (UID: \"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\") " pod="openstack/nova-api-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.823413 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-logs\") pod \"nova-metadata-0\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " pod="openstack/nova-metadata-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.823607 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/69df30f8-12b3-40e8-b880-8344d2c737b3-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.823701 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx6lf\" (UniqueName: \"kubernetes.io/projected/69df30f8-12b3-40e8-b880-8344d2c737b3-kube-api-access-dx6lf\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.823789 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/69df30f8-12b3-40e8-b880-8344d2c737b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.826363 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " pod="openstack/nova-metadata-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.826511 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-config-data\") pod \"nova-metadata-0\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " pod="openstack/nova-metadata-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.827187 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " pod="openstack/nova-metadata-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.839295 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj7hc\" (UniqueName: \"kubernetes.io/projected/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-kube-api-access-rj7hc\") pod \"nova-metadata-0\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " pod="openstack/nova-metadata-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.925855 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-config-data\") pod \"nova-api-0\" (UID: \"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\") " pod="openstack/nova-api-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.925982 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9nbl\" (UniqueName: \"kubernetes.io/projected/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-kube-api-access-d9nbl\") pod \"nova-api-0\" (UID: \"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\") " pod="openstack/nova-api-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.926021 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-logs\") pod \"nova-api-0\" (UID: \"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\") " pod="openstack/nova-api-0" Mar 20 13:48:59 crc kubenswrapper[4973]: 
I0320 13:48:59.926087 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\") " pod="openstack/nova-api-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.926932 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-logs\") pod \"nova-api-0\" (UID: \"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\") " pod="openstack/nova-api-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.930218 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\") " pod="openstack/nova-api-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.930896 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-config-data\") pod \"nova-api-0\" (UID: \"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\") " pod="openstack/nova-api-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.942767 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.943472 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9nbl\" (UniqueName: \"kubernetes.io/projected/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-kube-api-access-d9nbl\") pod \"nova-api-0\" (UID: \"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\") " pod="openstack/nova-api-0" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.971616 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="67aae42c-b00a-4295-873e-f9b56094726a" path="/var/lib/kubelet/pods/67aae42c-b00a-4295-873e-f9b56094726a/volumes" Mar 20 13:48:59 crc kubenswrapper[4973]: I0320 13:48:59.972466 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcc274a4-3873-4ddb-93d6-ba446c832ac7" path="/var/lib/kubelet/pods/bcc274a4-3873-4ddb-93d6-ba446c832ac7/volumes" Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.000975 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.428426 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"69df30f8-12b3-40e8-b880-8344d2c737b3","Type":"ContainerDied","Data":"67351248a00400395ffebb790a41e0f74fe515d810d876a131658b9ab7de374a"} Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.429158 4973 scope.go:117] "RemoveContainer" containerID="9c298c8308f05f7ac0b5adb31e4a36d978d1166a503a0d6c82e10688439fac93" Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.428727 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.440752 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"b0a61963-2c9b-403b-8fed-f4072b979eb8","Type":"ContainerStarted","Data":"eafa1502ecb6fd9964d5bce2515dc64b30cf6ff7cd07df68cef7fe6f3c913e03"} Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.444373 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.464606 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.464585466 podStartE2EDuration="2.464585466s" podCreationTimestamp="2026-03-20 13:48:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:00.458436367 +0000 UTC m=+1661.202106131" watchObservedRunningTime="2026-03-20 13:49:00.464585466 +0000 UTC m=+1661.208255220" Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.541840 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.563683 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.579596 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.582873 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.585693 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.618437 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.635075 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.651030 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed74ec77-093e-49ba-97d4-4a84588a85d7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ed74ec77-093e-49ba-97d4-4a84588a85d7\") " pod="openstack/nova-scheduler-0" Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.651300 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed74ec77-093e-49ba-97d4-4a84588a85d7-config-data\") pod \"nova-scheduler-0\" (UID: \"ed74ec77-093e-49ba-97d4-4a84588a85d7\") " pod="openstack/nova-scheduler-0" Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.651447 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2pbw\" (UniqueName: \"kubernetes.io/projected/ed74ec77-093e-49ba-97d4-4a84588a85d7-kube-api-access-d2pbw\") pod \"nova-scheduler-0\" (UID: \"ed74ec77-093e-49ba-97d4-4a84588a85d7\") " pod="openstack/nova-scheduler-0" Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.704217 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.754023 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed74ec77-093e-49ba-97d4-4a84588a85d7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ed74ec77-093e-49ba-97d4-4a84588a85d7\") " pod="openstack/nova-scheduler-0" Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.754098 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed74ec77-093e-49ba-97d4-4a84588a85d7-config-data\") pod \"nova-scheduler-0\" (UID: \"ed74ec77-093e-49ba-97d4-4a84588a85d7\") " pod="openstack/nova-scheduler-0" Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.754124 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2pbw\" (UniqueName: \"kubernetes.io/projected/ed74ec77-093e-49ba-97d4-4a84588a85d7-kube-api-access-d2pbw\") pod \"nova-scheduler-0\" (UID: \"ed74ec77-093e-49ba-97d4-4a84588a85d7\") " pod="openstack/nova-scheduler-0" Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.762142 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed74ec77-093e-49ba-97d4-4a84588a85d7-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ed74ec77-093e-49ba-97d4-4a84588a85d7\") " pod="openstack/nova-scheduler-0" Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.765304 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed74ec77-093e-49ba-97d4-4a84588a85d7-config-data\") pod \"nova-scheduler-0\" (UID: \"ed74ec77-093e-49ba-97d4-4a84588a85d7\") " pod="openstack/nova-scheduler-0" Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.770196 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2pbw\" (UniqueName: \"kubernetes.io/projected/ed74ec77-093e-49ba-97d4-4a84588a85d7-kube-api-access-d2pbw\") pod \"nova-scheduler-0\" (UID: 
\"ed74ec77-093e-49ba-97d4-4a84588a85d7\") " pod="openstack/nova-scheduler-0" Mar 20 13:49:00 crc kubenswrapper[4973]: I0320 13:49:00.910503 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.386307 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.469492 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63","Type":"ContainerStarted","Data":"28407e65fe1ec51655ffbd273f2bc22852ce76b206055770b09aaa9005715f0c"} Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.469554 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63","Type":"ContainerStarted","Data":"e4c641f2198e8c988cc0ae5db424af35dc12916631b4160f4d62820ecbf30bbb"} Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.469570 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63","Type":"ContainerStarted","Data":"c168e121ad495c90a32d5a31986a32aa1fdc13730ab6cc9c456a51d0da88fc66"} Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.476185 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-sg-core-conf-yaml\") pod \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.476247 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-combined-ca-bundle\") pod \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\" 
(UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.476308 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-config-data\") pod \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.476413 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-scripts\") pod \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.476463 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ef21c1e-3352-4660-9cf4-3023ee75c18b-log-httpd\") pod \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.476558 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsms7\" (UniqueName: \"kubernetes.io/projected/4ef21c1e-3352-4660-9cf4-3023ee75c18b-kube-api-access-fsms7\") pod \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\" (UID: \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.476628 4973 generic.go:334] "Generic (PLEG): container finished" podID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerID="0d8d75a0e885f56bc57989fd308cedbc022268441dffe4fca78698790241b081" exitCode=0 Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.476707 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ef21c1e-3352-4660-9cf4-3023ee75c18b-run-httpd\") pod \"4ef21c1e-3352-4660-9cf4-3023ee75c18b\" (UID: 
\"4ef21c1e-3352-4660-9cf4-3023ee75c18b\") " Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.476738 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ef21c1e-3352-4660-9cf4-3023ee75c18b","Type":"ContainerDied","Data":"0d8d75a0e885f56bc57989fd308cedbc022268441dffe4fca78698790241b081"} Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.476769 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4ef21c1e-3352-4660-9cf4-3023ee75c18b","Type":"ContainerDied","Data":"b90f7a65fbdfcb70d8ff5e3492cee1858a0b94fbb6ed2e8b3bfa6eb958d58a00"} Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.476785 4973 scope.go:117] "RemoveContainer" containerID="79076d797fbb559fdbe88cf879a6e3eddb7f5447e7ca3906227127a8a18c5195" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.476930 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.477746 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ef21c1e-3352-4660-9cf4-3023ee75c18b-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4ef21c1e-3352-4660-9cf4-3023ee75c18b" (UID: "4ef21c1e-3352-4660-9cf4-3023ee75c18b"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.478537 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ef21c1e-3352-4660-9cf4-3023ee75c18b-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4ef21c1e-3352-4660-9cf4-3023ee75c18b" (UID: "4ef21c1e-3352-4660-9cf4-3023ee75c18b"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.483158 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ef21c1e-3352-4660-9cf4-3023ee75c18b-kube-api-access-fsms7" (OuterVolumeSpecName: "kube-api-access-fsms7") pod "4ef21c1e-3352-4660-9cf4-3023ee75c18b" (UID: "4ef21c1e-3352-4660-9cf4-3023ee75c18b"). InnerVolumeSpecName "kube-api-access-fsms7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.484443 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-scripts" (OuterVolumeSpecName: "scripts") pod "4ef21c1e-3352-4660-9cf4-3023ee75c18b" (UID: "4ef21c1e-3352-4660-9cf4-3023ee75c18b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.498392 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05","Type":"ContainerStarted","Data":"5ad9bbce7c8f1f0ef5b02aaf29c6093ae73f35ddbeccacd26894369c485098ec"} Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.498433 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05","Type":"ContainerStarted","Data":"32f5be99d9e354f6657f58f16da93c14cf4af07b269b3b6cf720ab9a4ceb4343"} Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.521643 4973 scope.go:117] "RemoveContainer" containerID="bd2139f149fc4d242626a4c708a17e3fbaf6978f8d7015f419012e3037743f2e" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.550153 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.550130224 podStartE2EDuration="2.550130224s" podCreationTimestamp="2026-03-20 13:48:59 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:01.499163272 +0000 UTC m=+1662.242833016" watchObservedRunningTime="2026-03-20 13:49:01.550130224 +0000 UTC m=+1662.293799968" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.569101 4973 scope.go:117] "RemoveContainer" containerID="ea868f986eb64cd4bc4d61b0c3ca34f86972021dcaac45f414bfd372de5c1c01" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.577867 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4ef21c1e-3352-4660-9cf4-3023ee75c18b" (UID: "4ef21c1e-3352-4660-9cf4-3023ee75c18b"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.582208 4973 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ef21c1e-3352-4660-9cf4-3023ee75c18b-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.582306 4973 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.582325 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.582351 4973 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4ef21c1e-3352-4660-9cf4-3023ee75c18b-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.582366 4973 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsms7\" (UniqueName: \"kubernetes.io/projected/4ef21c1e-3352-4660-9cf4-3023ee75c18b-kube-api-access-fsms7\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.587182 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.607669 4973 scope.go:117] "RemoveContainer" containerID="0d8d75a0e885f56bc57989fd308cedbc022268441dffe4fca78698790241b081" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.653406 4973 scope.go:117] "RemoveContainer" containerID="79076d797fbb559fdbe88cf879a6e3eddb7f5447e7ca3906227127a8a18c5195" Mar 20 13:49:01 crc kubenswrapper[4973]: E0320 13:49:01.654196 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79076d797fbb559fdbe88cf879a6e3eddb7f5447e7ca3906227127a8a18c5195\": container with ID starting with 79076d797fbb559fdbe88cf879a6e3eddb7f5447e7ca3906227127a8a18c5195 not found: ID does not exist" containerID="79076d797fbb559fdbe88cf879a6e3eddb7f5447e7ca3906227127a8a18c5195" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.654229 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79076d797fbb559fdbe88cf879a6e3eddb7f5447e7ca3906227127a8a18c5195"} err="failed to get container status \"79076d797fbb559fdbe88cf879a6e3eddb7f5447e7ca3906227127a8a18c5195\": rpc error: code = NotFound desc = could not find container \"79076d797fbb559fdbe88cf879a6e3eddb7f5447e7ca3906227127a8a18c5195\": container with ID starting with 79076d797fbb559fdbe88cf879a6e3eddb7f5447e7ca3906227127a8a18c5195 not found: ID does not exist" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.654250 4973 scope.go:117] "RemoveContainer" containerID="bd2139f149fc4d242626a4c708a17e3fbaf6978f8d7015f419012e3037743f2e" Mar 20 13:49:01 crc 
kubenswrapper[4973]: E0320 13:49:01.654917 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd2139f149fc4d242626a4c708a17e3fbaf6978f8d7015f419012e3037743f2e\": container with ID starting with bd2139f149fc4d242626a4c708a17e3fbaf6978f8d7015f419012e3037743f2e not found: ID does not exist" containerID="bd2139f149fc4d242626a4c708a17e3fbaf6978f8d7015f419012e3037743f2e" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.654943 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd2139f149fc4d242626a4c708a17e3fbaf6978f8d7015f419012e3037743f2e"} err="failed to get container status \"bd2139f149fc4d242626a4c708a17e3fbaf6978f8d7015f419012e3037743f2e\": rpc error: code = NotFound desc = could not find container \"bd2139f149fc4d242626a4c708a17e3fbaf6978f8d7015f419012e3037743f2e\": container with ID starting with bd2139f149fc4d242626a4c708a17e3fbaf6978f8d7015f419012e3037743f2e not found: ID does not exist" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.654965 4973 scope.go:117] "RemoveContainer" containerID="ea868f986eb64cd4bc4d61b0c3ca34f86972021dcaac45f414bfd372de5c1c01" Mar 20 13:49:01 crc kubenswrapper[4973]: E0320 13:49:01.655302 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ea868f986eb64cd4bc4d61b0c3ca34f86972021dcaac45f414bfd372de5c1c01\": container with ID starting with ea868f986eb64cd4bc4d61b0c3ca34f86972021dcaac45f414bfd372de5c1c01 not found: ID does not exist" containerID="ea868f986eb64cd4bc4d61b0c3ca34f86972021dcaac45f414bfd372de5c1c01" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.655488 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ea868f986eb64cd4bc4d61b0c3ca34f86972021dcaac45f414bfd372de5c1c01"} err="failed to get container status 
\"ea868f986eb64cd4bc4d61b0c3ca34f86972021dcaac45f414bfd372de5c1c01\": rpc error: code = NotFound desc = could not find container \"ea868f986eb64cd4bc4d61b0c3ca34f86972021dcaac45f414bfd372de5c1c01\": container with ID starting with ea868f986eb64cd4bc4d61b0c3ca34f86972021dcaac45f414bfd372de5c1c01 not found: ID does not exist" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.655526 4973 scope.go:117] "RemoveContainer" containerID="0d8d75a0e885f56bc57989fd308cedbc022268441dffe4fca78698790241b081" Mar 20 13:49:01 crc kubenswrapper[4973]: E0320 13:49:01.655846 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d8d75a0e885f56bc57989fd308cedbc022268441dffe4fca78698790241b081\": container with ID starting with 0d8d75a0e885f56bc57989fd308cedbc022268441dffe4fca78698790241b081 not found: ID does not exist" containerID="0d8d75a0e885f56bc57989fd308cedbc022268441dffe4fca78698790241b081" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.655882 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d8d75a0e885f56bc57989fd308cedbc022268441dffe4fca78698790241b081"} err="failed to get container status \"0d8d75a0e885f56bc57989fd308cedbc022268441dffe4fca78698790241b081\": rpc error: code = NotFound desc = could not find container \"0d8d75a0e885f56bc57989fd308cedbc022268441dffe4fca78698790241b081\": container with ID starting with 0d8d75a0e885f56bc57989fd308cedbc022268441dffe4fca78698790241b081 not found: ID does not exist" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.658611 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4ef21c1e-3352-4660-9cf4-3023ee75c18b" (UID: "4ef21c1e-3352-4660-9cf4-3023ee75c18b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.685054 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.685418 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-config-data" (OuterVolumeSpecName: "config-data") pod "4ef21c1e-3352-4660-9cf4-3023ee75c18b" (UID: "4ef21c1e-3352-4660-9cf4-3023ee75c18b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.787228 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4ef21c1e-3352-4660-9cf4-3023ee75c18b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.902006 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.912794 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.933341 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:49:01 crc kubenswrapper[4973]: E0320 13:49:01.933898 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerName="ceilometer-central-agent" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.933915 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerName="ceilometer-central-agent" Mar 20 13:49:01 crc kubenswrapper[4973]: E0320 13:49:01.933942 4973 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerName="sg-core" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.933952 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerName="sg-core" Mar 20 13:49:01 crc kubenswrapper[4973]: E0320 13:49:01.933970 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerName="proxy-httpd" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.933977 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerName="proxy-httpd" Mar 20 13:49:01 crc kubenswrapper[4973]: E0320 13:49:01.933990 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerName="ceilometer-notification-agent" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.933997 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerName="ceilometer-notification-agent" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.934235 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerName="proxy-httpd" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.934265 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerName="ceilometer-notification-agent" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.934275 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerName="ceilometer-central-agent" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.934295 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" containerName="sg-core" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.936799 4973 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.944917 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.945162 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.975285 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ef21c1e-3352-4660-9cf4-3023ee75c18b" path="/var/lib/kubelet/pods/4ef21c1e-3352-4660-9cf4-3023ee75c18b/volumes" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.976192 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69df30f8-12b3-40e8-b880-8344d2c737b3" path="/var/lib/kubelet/pods/69df30f8-12b3-40e8-b880-8344d2c737b3/volumes" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.977217 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.992318 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-log-httpd\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.992390 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-scripts\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.992414 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-config-data\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.992471 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgbr2\" (UniqueName: \"kubernetes.io/projected/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-kube-api-access-bgbr2\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.992526 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.992575 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-run-httpd\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:01 crc kubenswrapper[4973]: I0320 13:49:01.992612 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.095091 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.095257 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-log-httpd\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.095289 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-scripts\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.095307 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-config-data\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.095365 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgbr2\" (UniqueName: \"kubernetes.io/projected/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-kube-api-access-bgbr2\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.095438 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.095493 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-run-httpd\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.095953 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-run-httpd\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.095956 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-log-httpd\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.101046 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.127173 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-scripts\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.128882 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.132276 4973 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-config-data\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.135582 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgbr2\" (UniqueName: \"kubernetes.io/projected/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-kube-api-access-bgbr2\") pod \"ceilometer-0\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") " pod="openstack/ceilometer-0" Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.261027 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.528922 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ed74ec77-093e-49ba-97d4-4a84588a85d7","Type":"ContainerStarted","Data":"2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466"} Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.528969 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ed74ec77-093e-49ba-97d4-4a84588a85d7","Type":"ContainerStarted","Data":"6ba6a9d8bf722dc219fba47a80601d10f716954b98c0d712e678b0f5fa46c343"} Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.536263 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05","Type":"ContainerStarted","Data":"6c8812e52715412db13ba8f6a0ba63eb6bfe9c306b51a4cbedb775c86856bff0"} Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.553835 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.5538151080000002 podStartE2EDuration="2.553815108s" podCreationTimestamp="2026-03-20 13:49:00 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:02.547639679 +0000 UTC m=+1663.291309413" watchObservedRunningTime="2026-03-20 13:49:02.553815108 +0000 UTC m=+1663.297484852" Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.576573 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.576550199 podStartE2EDuration="3.576550199s" podCreationTimestamp="2026-03-20 13:48:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:02.569584758 +0000 UTC m=+1663.313254512" watchObservedRunningTime="2026-03-20 13:49:02.576550199 +0000 UTC m=+1663.320219933" Mar 20 13:49:02 crc kubenswrapper[4973]: W0320 13:49:02.779996 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod31eee5f2_b5c0_4ba8_8668_4f7f89b7e1b3.slice/crio-f980e8bb814b7a0931b65a8cb165aa6ddfb6a2ff4c836df5f9223609025aa90f WatchSource:0}: Error finding container f980e8bb814b7a0931b65a8cb165aa6ddfb6a2ff4c836df5f9223609025aa90f: Status 404 returned error can't find the container with id f980e8bb814b7a0931b65a8cb165aa6ddfb6a2ff4c836df5f9223609025aa90f Mar 20 13:49:02 crc kubenswrapper[4973]: I0320 13:49:02.807558 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:49:03 crc kubenswrapper[4973]: I0320 13:49:03.551855 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3","Type":"ContainerStarted","Data":"b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707"} Mar 20 13:49:03 crc kubenswrapper[4973]: I0320 13:49:03.552239 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3","Type":"ContainerStarted","Data":"f980e8bb814b7a0931b65a8cb165aa6ddfb6a2ff4c836df5f9223609025aa90f"} Mar 20 13:49:04 crc kubenswrapper[4973]: I0320 13:49:04.565215 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3","Type":"ContainerStarted","Data":"e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32"} Mar 20 13:49:05 crc kubenswrapper[4973]: I0320 13:49:05.595015 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3","Type":"ContainerStarted","Data":"595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47"} Mar 20 13:49:05 crc kubenswrapper[4973]: I0320 13:49:05.910802 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 13:49:07 crc kubenswrapper[4973]: I0320 13:49:07.627063 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3","Type":"ContainerStarted","Data":"e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4"} Mar 20 13:49:07 crc kubenswrapper[4973]: I0320 13:49:07.627666 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:49:07 crc kubenswrapper[4973]: I0320 13:49:07.653559 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.865306619 podStartE2EDuration="6.653540415s" podCreationTimestamp="2026-03-20 13:49:01 +0000 UTC" firstStartedPulling="2026-03-20 13:49:02.783550803 +0000 UTC m=+1663.527220547" lastFinishedPulling="2026-03-20 13:49:06.571784609 +0000 UTC m=+1667.315454343" observedRunningTime="2026-03-20 13:49:07.645276159 +0000 UTC m=+1668.388945903" watchObservedRunningTime="2026-03-20 13:49:07.653540415 +0000 UTC m=+1668.397210159" Mar 20 
13:49:08 crc kubenswrapper[4973]: I0320 13:49:08.787632 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 20 13:49:08 crc kubenswrapper[4973]: I0320 13:49:08.953264 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:49:08 crc kubenswrapper[4973]: E0320 13:49:08.953545 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 13:49:09 crc kubenswrapper[4973]: I0320 13:49:09.944610 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:49:09 crc kubenswrapper[4973]: I0320 13:49:09.944673 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:49:10 crc kubenswrapper[4973]: I0320 13:49:10.001689 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:49:10 crc kubenswrapper[4973]: I0320 13:49:10.001747 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:49:10 crc kubenswrapper[4973]: I0320 13:49:10.910652 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 13:49:10 crc kubenswrapper[4973]: I0320 13:49:10.941406 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 13:49:10 crc kubenswrapper[4973]: I0320 13:49:10.957565 4973 prober.go:107] "Probe failed" probeType="Startup" 
pod="openstack/nova-metadata-0" podUID="5c1d1009-6e9d-4ad7-88d4-b00ecba35a63" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.6:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:49:10 crc kubenswrapper[4973]: I0320 13:49:10.957608 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5c1d1009-6e9d-4ad7-88d4-b00ecba35a63" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.6:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:49:11 crc kubenswrapper[4973]: I0320 13:49:11.083596 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.7:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:49:11 crc kubenswrapper[4973]: I0320 13:49:11.083598 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.7:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 13:49:11 crc kubenswrapper[4973]: I0320 13:49:11.718052 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 13:49:17 crc kubenswrapper[4973]: I0320 13:49:17.943876 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:49:17 crc kubenswrapper[4973]: I0320 13:49:17.945456 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 20 13:49:18 crc kubenswrapper[4973]: I0320 13:49:18.002123 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Mar 20 13:49:18 crc kubenswrapper[4973]: I0320 13:49:18.002292 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 20 13:49:18 crc kubenswrapper[4973]: I0320 13:49:18.217891 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-q7czk"] Mar 20 13:49:18 crc kubenswrapper[4973]: I0320 13:49:18.221145 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q7czk" Mar 20 13:49:18 crc kubenswrapper[4973]: I0320 13:49:18.231629 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q7czk"] Mar 20 13:49:18 crc kubenswrapper[4973]: I0320 13:49:18.336096 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24wr2\" (UniqueName: \"kubernetes.io/projected/07cb9303-0be6-4603-b71d-718a79aa18c3-kube-api-access-24wr2\") pod \"certified-operators-q7czk\" (UID: \"07cb9303-0be6-4603-b71d-718a79aa18c3\") " pod="openshift-marketplace/certified-operators-q7czk" Mar 20 13:49:18 crc kubenswrapper[4973]: I0320 13:49:18.336205 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07cb9303-0be6-4603-b71d-718a79aa18c3-utilities\") pod \"certified-operators-q7czk\" (UID: \"07cb9303-0be6-4603-b71d-718a79aa18c3\") " pod="openshift-marketplace/certified-operators-q7czk" Mar 20 13:49:18 crc kubenswrapper[4973]: I0320 13:49:18.336230 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07cb9303-0be6-4603-b71d-718a79aa18c3-catalog-content\") pod \"certified-operators-q7czk\" (UID: \"07cb9303-0be6-4603-b71d-718a79aa18c3\") " pod="openshift-marketplace/certified-operators-q7czk" Mar 20 13:49:18 crc 
kubenswrapper[4973]: I0320 13:49:18.440366 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24wr2\" (UniqueName: \"kubernetes.io/projected/07cb9303-0be6-4603-b71d-718a79aa18c3-kube-api-access-24wr2\") pod \"certified-operators-q7czk\" (UID: \"07cb9303-0be6-4603-b71d-718a79aa18c3\") " pod="openshift-marketplace/certified-operators-q7czk" Mar 20 13:49:18 crc kubenswrapper[4973]: I0320 13:49:18.440809 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07cb9303-0be6-4603-b71d-718a79aa18c3-utilities\") pod \"certified-operators-q7czk\" (UID: \"07cb9303-0be6-4603-b71d-718a79aa18c3\") " pod="openshift-marketplace/certified-operators-q7czk" Mar 20 13:49:18 crc kubenswrapper[4973]: I0320 13:49:18.440881 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07cb9303-0be6-4603-b71d-718a79aa18c3-catalog-content\") pod \"certified-operators-q7czk\" (UID: \"07cb9303-0be6-4603-b71d-718a79aa18c3\") " pod="openshift-marketplace/certified-operators-q7czk" Mar 20 13:49:18 crc kubenswrapper[4973]: I0320 13:49:18.441451 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07cb9303-0be6-4603-b71d-718a79aa18c3-utilities\") pod \"certified-operators-q7czk\" (UID: \"07cb9303-0be6-4603-b71d-718a79aa18c3\") " pod="openshift-marketplace/certified-operators-q7czk" Mar 20 13:49:18 crc kubenswrapper[4973]: I0320 13:49:18.441821 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07cb9303-0be6-4603-b71d-718a79aa18c3-catalog-content\") pod \"certified-operators-q7czk\" (UID: \"07cb9303-0be6-4603-b71d-718a79aa18c3\") " pod="openshift-marketplace/certified-operators-q7czk" Mar 20 13:49:18 crc kubenswrapper[4973]: I0320 13:49:18.460505 
4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24wr2\" (UniqueName: \"kubernetes.io/projected/07cb9303-0be6-4603-b71d-718a79aa18c3-kube-api-access-24wr2\") pod \"certified-operators-q7czk\" (UID: \"07cb9303-0be6-4603-b71d-718a79aa18c3\") " pod="openshift-marketplace/certified-operators-q7czk" Mar 20 13:49:18 crc kubenswrapper[4973]: I0320 13:49:18.542593 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q7czk" Mar 20 13:49:19 crc kubenswrapper[4973]: I0320 13:49:19.144199 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-q7czk"] Mar 20 13:49:19 crc kubenswrapper[4973]: I0320 13:49:19.763173 4973 generic.go:334] "Generic (PLEG): container finished" podID="07cb9303-0be6-4603-b71d-718a79aa18c3" containerID="58d3f836ad8a882cc9d11e7d74573b65f6106122957a53908fa6f943141b4d23" exitCode=0 Mar 20 13:49:19 crc kubenswrapper[4973]: I0320 13:49:19.763259 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7czk" event={"ID":"07cb9303-0be6-4603-b71d-718a79aa18c3","Type":"ContainerDied","Data":"58d3f836ad8a882cc9d11e7d74573b65f6106122957a53908fa6f943141b4d23"} Mar 20 13:49:19 crc kubenswrapper[4973]: I0320 13:49:19.763462 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7czk" event={"ID":"07cb9303-0be6-4603-b71d-718a79aa18c3","Type":"ContainerStarted","Data":"bd6b44b1a08ef64d92200a76ba746a1a822f3f1cae4e34447d83d224bc9cb71e"} Mar 20 13:49:19 crc kubenswrapper[4973]: I0320 13:49:19.975898 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:49:19 crc kubenswrapper[4973]: I0320 13:49:19.976500 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 20 13:49:19 crc kubenswrapper[4973]: I0320 
13:49:19.984230 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:49:19 crc kubenswrapper[4973]: I0320 13:49:19.992529 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.008654 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.009119 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.011689 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.725966 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.784102 4973 generic.go:334] "Generic (PLEG): container finished" podID="126a68e5-2c8b-4341-bf31-7d760b77cf8b" containerID="5e3ba363e3fc4d5912da83cd5aaf3c80923804667f2ddc04a7ff054d4972c2bb" exitCode=137 Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.784185 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.784196 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"126a68e5-2c8b-4341-bf31-7d760b77cf8b","Type":"ContainerDied","Data":"5e3ba363e3fc4d5912da83cd5aaf3c80923804667f2ddc04a7ff054d4972c2bb"} Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.784234 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"126a68e5-2c8b-4341-bf31-7d760b77cf8b","Type":"ContainerDied","Data":"bd3102ee204df8c1c0ccab746ba3f987ea262bbd2e3bebdeb233062593886112"} Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.784254 4973 scope.go:117] "RemoveContainer" containerID="5e3ba363e3fc4d5912da83cd5aaf3c80923804667f2ddc04a7ff054d4972c2bb" Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.791088 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.806489 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h57jm\" (UniqueName: \"kubernetes.io/projected/126a68e5-2c8b-4341-bf31-7d760b77cf8b-kube-api-access-h57jm\") pod \"126a68e5-2c8b-4341-bf31-7d760b77cf8b\" (UID: \"126a68e5-2c8b-4341-bf31-7d760b77cf8b\") " Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.806668 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126a68e5-2c8b-4341-bf31-7d760b77cf8b-combined-ca-bundle\") pod \"126a68e5-2c8b-4341-bf31-7d760b77cf8b\" (UID: \"126a68e5-2c8b-4341-bf31-7d760b77cf8b\") " Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.806726 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126a68e5-2c8b-4341-bf31-7d760b77cf8b-config-data\") 
pod \"126a68e5-2c8b-4341-bf31-7d760b77cf8b\" (UID: \"126a68e5-2c8b-4341-bf31-7d760b77cf8b\") " Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.832263 4973 scope.go:117] "RemoveContainer" containerID="5e3ba363e3fc4d5912da83cd5aaf3c80923804667f2ddc04a7ff054d4972c2bb" Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.840679 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/126a68e5-2c8b-4341-bf31-7d760b77cf8b-kube-api-access-h57jm" (OuterVolumeSpecName: "kube-api-access-h57jm") pod "126a68e5-2c8b-4341-bf31-7d760b77cf8b" (UID: "126a68e5-2c8b-4341-bf31-7d760b77cf8b"). InnerVolumeSpecName "kube-api-access-h57jm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4973]: E0320 13:49:20.842874 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e3ba363e3fc4d5912da83cd5aaf3c80923804667f2ddc04a7ff054d4972c2bb\": container with ID starting with 5e3ba363e3fc4d5912da83cd5aaf3c80923804667f2ddc04a7ff054d4972c2bb not found: ID does not exist" containerID="5e3ba363e3fc4d5912da83cd5aaf3c80923804667f2ddc04a7ff054d4972c2bb" Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.842934 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e3ba363e3fc4d5912da83cd5aaf3c80923804667f2ddc04a7ff054d4972c2bb"} err="failed to get container status \"5e3ba363e3fc4d5912da83cd5aaf3c80923804667f2ddc04a7ff054d4972c2bb\": rpc error: code = NotFound desc = could not find container \"5e3ba363e3fc4d5912da83cd5aaf3c80923804667f2ddc04a7ff054d4972c2bb\": container with ID starting with 5e3ba363e3fc4d5912da83cd5aaf3c80923804667f2ddc04a7ff054d4972c2bb not found: ID does not exist" Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.855242 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/126a68e5-2c8b-4341-bf31-7d760b77cf8b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "126a68e5-2c8b-4341-bf31-7d760b77cf8b" (UID: "126a68e5-2c8b-4341-bf31-7d760b77cf8b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.855499 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/126a68e5-2c8b-4341-bf31-7d760b77cf8b-config-data" (OuterVolumeSpecName: "config-data") pod "126a68e5-2c8b-4341-bf31-7d760b77cf8b" (UID: "126a68e5-2c8b-4341-bf31-7d760b77cf8b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.923213 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h57jm\" (UniqueName: \"kubernetes.io/projected/126a68e5-2c8b-4341-bf31-7d760b77cf8b-kube-api-access-h57jm\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.923483 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/126a68e5-2c8b-4341-bf31-7d760b77cf8b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:20 crc kubenswrapper[4973]: I0320 13:49:20.923494 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/126a68e5-2c8b-4341-bf31-7d760b77cf8b-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.031906 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-nt6ht"] Mar 20 13:49:21 crc kubenswrapper[4973]: E0320 13:49:21.032652 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="126a68e5-2c8b-4341-bf31-7d760b77cf8b" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.032679 4973 
state_mem.go:107] "Deleted CPUSet assignment" podUID="126a68e5-2c8b-4341-bf31-7d760b77cf8b" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.033404 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="126a68e5-2c8b-4341-bf31-7d760b77cf8b" containerName="nova-cell1-novncproxy-novncproxy" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.035762 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.055901 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-nt6ht"] Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.135011 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-ovsdbserver-nb\") pod \"dnsmasq-dns-79b5d74c8c-nt6ht\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.135064 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t88q\" (UniqueName: \"kubernetes.io/projected/56685fe5-5182-46f0-84f8-c8a40d42a3d2-kube-api-access-9t88q\") pod \"dnsmasq-dns-79b5d74c8c-nt6ht\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.135110 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-dns-swift-storage-0\") pod \"dnsmasq-dns-79b5d74c8c-nt6ht\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: 
I0320 13:49:21.135278 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-dns-svc\") pod \"dnsmasq-dns-79b5d74c8c-nt6ht\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.135463 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-ovsdbserver-sb\") pod \"dnsmasq-dns-79b5d74c8c-nt6ht\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.135517 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-config\") pod \"dnsmasq-dns-79b5d74c8c-nt6ht\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.165396 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.191955 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.207954 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.209683 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.217705 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.217928 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.218143 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.227973 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.238355 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-ovsdbserver-nb\") pod \"dnsmasq-dns-79b5d74c8c-nt6ht\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.238868 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t88q\" (UniqueName: \"kubernetes.io/projected/56685fe5-5182-46f0-84f8-c8a40d42a3d2-kube-api-access-9t88q\") pod \"dnsmasq-dns-79b5d74c8c-nt6ht\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.238896 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-dns-swift-storage-0\") pod \"dnsmasq-dns-79b5d74c8c-nt6ht\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: 
I0320 13:49:21.238947 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-dns-svc\") pod \"dnsmasq-dns-79b5d74c8c-nt6ht\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.239851 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-ovsdbserver-sb\") pod \"dnsmasq-dns-79b5d74c8c-nt6ht\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.239882 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-ovsdbserver-nb\") pod \"dnsmasq-dns-79b5d74c8c-nt6ht\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.239936 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-config\") pod \"dnsmasq-dns-79b5d74c8c-nt6ht\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.240206 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-dns-svc\") pod \"dnsmasq-dns-79b5d74c8c-nt6ht\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.240632 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-ovsdbserver-sb\") pod \"dnsmasq-dns-79b5d74c8c-nt6ht\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.240905 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-config\") pod \"dnsmasq-dns-79b5d74c8c-nt6ht\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.241308 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-dns-swift-storage-0\") pod \"dnsmasq-dns-79b5d74c8c-nt6ht\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.261829 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t88q\" (UniqueName: \"kubernetes.io/projected/56685fe5-5182-46f0-84f8-c8a40d42a3d2-kube-api-access-9t88q\") pod \"dnsmasq-dns-79b5d74c8c-nt6ht\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.342830 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2f8x\" (UniqueName: \"kubernetes.io/projected/48646158-e77c-4710-b05b-030e7ff1dfbe-kube-api-access-q2f8x\") pod \"nova-cell1-novncproxy-0\" (UID: \"48646158-e77c-4710-b05b-030e7ff1dfbe\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.342914 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/48646158-e77c-4710-b05b-030e7ff1dfbe-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"48646158-e77c-4710-b05b-030e7ff1dfbe\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.342967 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/48646158-e77c-4710-b05b-030e7ff1dfbe-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"48646158-e77c-4710-b05b-030e7ff1dfbe\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.343143 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/48646158-e77c-4710-b05b-030e7ff1dfbe-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"48646158-e77c-4710-b05b-030e7ff1dfbe\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.343329 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48646158-e77c-4710-b05b-030e7ff1dfbe-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"48646158-e77c-4710-b05b-030e7ff1dfbe\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.382489 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.445391 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48646158-e77c-4710-b05b-030e7ff1dfbe-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"48646158-e77c-4710-b05b-030e7ff1dfbe\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.445901 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2f8x\" (UniqueName: \"kubernetes.io/projected/48646158-e77c-4710-b05b-030e7ff1dfbe-kube-api-access-q2f8x\") pod \"nova-cell1-novncproxy-0\" (UID: \"48646158-e77c-4710-b05b-030e7ff1dfbe\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.445968 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48646158-e77c-4710-b05b-030e7ff1dfbe-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"48646158-e77c-4710-b05b-030e7ff1dfbe\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.446015 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/48646158-e77c-4710-b05b-030e7ff1dfbe-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"48646158-e77c-4710-b05b-030e7ff1dfbe\") " pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.446059 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/48646158-e77c-4710-b05b-030e7ff1dfbe-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"48646158-e77c-4710-b05b-030e7ff1dfbe\") " pod="openstack/nova-cell1-novncproxy-0" 
Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.453403 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/48646158-e77c-4710-b05b-030e7ff1dfbe-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"48646158-e77c-4710-b05b-030e7ff1dfbe\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.454862 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/48646158-e77c-4710-b05b-030e7ff1dfbe-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"48646158-e77c-4710-b05b-030e7ff1dfbe\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.454914 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/48646158-e77c-4710-b05b-030e7ff1dfbe-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"48646158-e77c-4710-b05b-030e7ff1dfbe\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.460491 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/48646158-e77c-4710-b05b-030e7ff1dfbe-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"48646158-e77c-4710-b05b-030e7ff1dfbe\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.464793 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2f8x\" (UniqueName: \"kubernetes.io/projected/48646158-e77c-4710-b05b-030e7ff1dfbe-kube-api-access-q2f8x\") pod \"nova-cell1-novncproxy-0\" (UID: \"48646158-e77c-4710-b05b-030e7ff1dfbe\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.638240 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 20 13:49:21 crc kubenswrapper[4973]: I0320 13:49:21.833981 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7czk" event={"ID":"07cb9303-0be6-4603-b71d-718a79aa18c3","Type":"ContainerStarted","Data":"0ac33e587c2ea99ce0197f2c3bdbde8140f2b76bacd035f7a0cac81f18eb9493"}
Mar 20 13:49:22 crc kubenswrapper[4973]: I0320 13:49:22.005511 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="126a68e5-2c8b-4341-bf31-7d760b77cf8b" path="/var/lib/kubelet/pods/126a68e5-2c8b-4341-bf31-7d760b77cf8b/volumes"
Mar 20 13:49:22 crc kubenswrapper[4973]: I0320 13:49:22.011205 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-nt6ht"]
Mar 20 13:49:22 crc kubenswrapper[4973]: I0320 13:49:22.282562 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 20 13:49:22 crc kubenswrapper[4973]: I0320 13:49:22.872244 4973 generic.go:334] "Generic (PLEG): container finished" podID="56685fe5-5182-46f0-84f8-c8a40d42a3d2" containerID="043ff4cea4f53ab126f52fec6d086b0adc436b76c84625aa64e5e1faa839a350" exitCode=0
Mar 20 13:49:22 crc kubenswrapper[4973]: I0320 13:49:22.872637 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" event={"ID":"56685fe5-5182-46f0-84f8-c8a40d42a3d2","Type":"ContainerDied","Data":"043ff4cea4f53ab126f52fec6d086b0adc436b76c84625aa64e5e1faa839a350"}
Mar 20 13:49:22 crc kubenswrapper[4973]: I0320 13:49:22.872683 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" event={"ID":"56685fe5-5182-46f0-84f8-c8a40d42a3d2","Type":"ContainerStarted","Data":"c3df0d88aa31b4738d70ef5fd86b4e819ca4b66a528f195e75d1d27720c7862d"}
Mar 20 13:49:22 crc kubenswrapper[4973]: I0320 13:49:22.885237 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"48646158-e77c-4710-b05b-030e7ff1dfbe","Type":"ContainerStarted","Data":"f09f9f6fee89c1412c4de7c439de90fa0948585888a74abd39634a3224f9be82"}
Mar 20 13:49:22 crc kubenswrapper[4973]: I0320 13:49:22.885327 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"48646158-e77c-4710-b05b-030e7ff1dfbe","Type":"ContainerStarted","Data":"e813e7674fbfc1cc3de0dd9b763d96a40842314de7f0ceecd0e2185bf501eec2"}
Mar 20 13:49:22 crc kubenswrapper[4973]: I0320 13:49:22.948886 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=1.948860932 podStartE2EDuration="1.948860932s" podCreationTimestamp="2026-03-20 13:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:22.931646562 +0000 UTC m=+1683.675316306" watchObservedRunningTime="2026-03-20 13:49:22.948860932 +0000 UTC m=+1683.692530686"
Mar 20 13:49:23 crc kubenswrapper[4973]: I0320 13:49:23.660251 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 20 13:49:23 crc kubenswrapper[4973]: I0320 13:49:23.895191 4973 generic.go:334] "Generic (PLEG): container finished" podID="07cb9303-0be6-4603-b71d-718a79aa18c3" containerID="0ac33e587c2ea99ce0197f2c3bdbde8140f2b76bacd035f7a0cac81f18eb9493" exitCode=0
Mar 20 13:49:23 crc kubenswrapper[4973]: I0320 13:49:23.895320 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7czk" event={"ID":"07cb9303-0be6-4603-b71d-718a79aa18c3","Type":"ContainerDied","Data":"0ac33e587c2ea99ce0197f2c3bdbde8140f2b76bacd035f7a0cac81f18eb9493"}
Mar 20 13:49:23 crc kubenswrapper[4973]: I0320 13:49:23.900381 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" event={"ID":"56685fe5-5182-46f0-84f8-c8a40d42a3d2","Type":"ContainerStarted","Data":"c2866e2a87b3d4355519c2a6554318792c9a3df4fb6b3a0e2a24fb7c651776b1"}
Mar 20 13:49:23 crc kubenswrapper[4973]: I0320 13:49:23.900551 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht"
Mar 20 13:49:23 crc kubenswrapper[4973]: I0320 13:49:23.900696 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05" containerName="nova-api-log" containerID="cri-o://5ad9bbce7c8f1f0ef5b02aaf29c6093ae73f35ddbeccacd26894369c485098ec" gracePeriod=30
Mar 20 13:49:23 crc kubenswrapper[4973]: I0320 13:49:23.900753 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05" containerName="nova-api-api" containerID="cri-o://6c8812e52715412db13ba8f6a0ba63eb6bfe9c306b51a4cbedb775c86856bff0" gracePeriod=30
Mar 20 13:49:23 crc kubenswrapper[4973]: I0320 13:49:23.950833 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e"
Mar 20 13:49:23 crc kubenswrapper[4973]: E0320 13:49:23.951140 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 13:49:23 crc kubenswrapper[4973]: I0320 13:49:23.952758 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" podStartSLOduration=3.952737561 podStartE2EDuration="3.952737561s" podCreationTimestamp="2026-03-20 13:49:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:23.946384278 +0000 UTC m=+1684.690054022" watchObservedRunningTime="2026-03-20 13:49:23.952737561 +0000 UTC m=+1684.696407305"
Mar 20 13:49:24 crc kubenswrapper[4973]: I0320 13:49:24.440581 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Mar 20 13:49:24 crc kubenswrapper[4973]: I0320 13:49:24.440947 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerName="ceilometer-central-agent" containerID="cri-o://b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707" gracePeriod=30
Mar 20 13:49:24 crc kubenswrapper[4973]: I0320 13:49:24.441026 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerName="proxy-httpd" containerID="cri-o://e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4" gracePeriod=30
Mar 20 13:49:24 crc kubenswrapper[4973]: I0320 13:49:24.441070 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerName="ceilometer-notification-agent" containerID="cri-o://e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32" gracePeriod=30
Mar 20 13:49:24 crc kubenswrapper[4973]: I0320 13:49:24.441029 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerName="sg-core" containerID="cri-o://595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47" gracePeriod=30
Mar 20 13:49:24 crc kubenswrapper[4973]: I0320 13:49:24.487916 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.9:3000/\": EOF"
Mar 20 13:49:24 crc kubenswrapper[4973]: I0320 13:49:24.925315 4973 generic.go:334] "Generic (PLEG): container finished" podID="ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05" containerID="5ad9bbce7c8f1f0ef5b02aaf29c6093ae73f35ddbeccacd26894369c485098ec" exitCode=143
Mar 20 13:49:24 crc kubenswrapper[4973]: I0320 13:49:24.937560 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05","Type":"ContainerDied","Data":"5ad9bbce7c8f1f0ef5b02aaf29c6093ae73f35ddbeccacd26894369c485098ec"}
Mar 20 13:49:24 crc kubenswrapper[4973]: I0320 13:49:24.946565 4973 generic.go:334] "Generic (PLEG): container finished" podID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerID="db5a6de6757dc77775f70481978bf1c57bd29b38c754670dbd9fdd66b2dff901" exitCode=137
Mar 20 13:49:24 crc kubenswrapper[4973]: I0320 13:49:24.946656 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46","Type":"ContainerDied","Data":"db5a6de6757dc77775f70481978bf1c57bd29b38c754670dbd9fdd66b2dff901"}
Mar 20 13:49:24 crc kubenswrapper[4973]: I0320 13:49:24.959374 4973 generic.go:334] "Generic (PLEG): container finished" podID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerID="595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47" exitCode=2
Mar 20 13:49:24 crc kubenswrapper[4973]: I0320 13:49:24.962734 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3","Type":"ContainerDied","Data":"595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47"}
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.253110 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.390474 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-config-data\") pod \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\" (UID: \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\") "
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.390653 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-combined-ca-bundle\") pod \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\" (UID: \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\") "
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.391637 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-scripts\") pod \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\" (UID: \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\") "
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.391929 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m9wlg\" (UniqueName: \"kubernetes.io/projected/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-kube-api-access-m9wlg\") pod \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\" (UID: \"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46\") "
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.396033 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-scripts" (OuterVolumeSpecName: "scripts") pod "6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" (UID: "6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.399670 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-kube-api-access-m9wlg" (OuterVolumeSpecName: "kube-api-access-m9wlg") pod "6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" (UID: "6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46"). InnerVolumeSpecName "kube-api-access-m9wlg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.496517 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.496562 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m9wlg\" (UniqueName: \"kubernetes.io/projected/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-kube-api-access-m9wlg\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.641656 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" (UID: "6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.682303 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.702831 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.760388 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-config-data" (OuterVolumeSpecName: "config-data") pod "6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" (UID: "6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.803771 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-scripts\") pod \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") "
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.803908 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-config-data\") pod \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") "
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.803942 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-log-httpd\") pod \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") "
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.804017 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-sg-core-conf-yaml\") pod \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") "
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.804129 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-combined-ca-bundle\") pod \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") "
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.804157 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-run-httpd\") pod \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") "
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.804206 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgbr2\" (UniqueName: \"kubernetes.io/projected/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-kube-api-access-bgbr2\") pod \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\" (UID: \"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3\") "
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.804886 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.806744 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" (UID: "31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.807328 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" (UID: "31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.813324 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-kube-api-access-bgbr2" (OuterVolumeSpecName: "kube-api-access-bgbr2") pod "31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" (UID: "31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3"). InnerVolumeSpecName "kube-api-access-bgbr2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.820880 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-scripts" (OuterVolumeSpecName: "scripts") pod "31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" (UID: "31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.857532 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" (UID: "31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.907190 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-scripts\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.907220 4973 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-log-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.907229 4973 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.907238 4973 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-run-httpd\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.907246 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgbr2\" (UniqueName: \"kubernetes.io/projected/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-kube-api-access-bgbr2\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.948926 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" (UID: "31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.973152 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7czk" event={"ID":"07cb9303-0be6-4603-b71d-718a79aa18c3","Type":"ContainerStarted","Data":"1b433332a960e33766a761f3b2c26caee6571903a2ad69a4dcd3e42e05f7db2c"}
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.980059 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46","Type":"ContainerDied","Data":"d8c51111f45110bd9bb6e173933059ee874bebf25ee9434448e432cf552f91a9"}
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.980107 4973 scope.go:117] "RemoveContainer" containerID="db5a6de6757dc77775f70481978bf1c57bd29b38c754670dbd9fdd66b2dff901"
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.980258 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.994434 4973 generic.go:334] "Generic (PLEG): container finished" podID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerID="e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4" exitCode=0
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.994500 4973 generic.go:334] "Generic (PLEG): container finished" podID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerID="e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32" exitCode=0
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.994513 4973 generic.go:334] "Generic (PLEG): container finished" podID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerID="b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707" exitCode=0
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.994518 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3","Type":"ContainerDied","Data":"e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4"}
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.994571 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3","Type":"ContainerDied","Data":"e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32"}
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.994584 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3","Type":"ContainerDied","Data":"b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707"}
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.994593 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3","Type":"ContainerDied","Data":"f980e8bb814b7a0931b65a8cb165aa6ddfb6a2ff4c836df5f9223609025aa90f"}
Mar 20 13:49:25 crc kubenswrapper[4973]: I0320 13:49:25.994511 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.010142 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.010455 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-q7czk" podStartSLOduration=2.468345743 podStartE2EDuration="8.010417152s" podCreationTimestamp="2026-03-20 13:49:18 +0000 UTC" firstStartedPulling="2026-03-20 13:49:19.766958756 +0000 UTC m=+1680.510628500" lastFinishedPulling="2026-03-20 13:49:25.309030165 +0000 UTC m=+1686.052699909" observedRunningTime="2026-03-20 13:49:25.994143917 +0000 UTC m=+1686.737813671" watchObservedRunningTime="2026-03-20 13:49:26.010417152 +0000 UTC m=+1686.754086896"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.028764 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-config-data" (OuterVolumeSpecName: "config-data") pod "31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" (UID: "31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.058024 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"]
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.063058 4973 scope.go:117] "RemoveContainer" containerID="e18d7336844c3f073b75c20896cdc0d5a8453949f430ae1546a7eb68910ebe8f"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.077496 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"]
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.113034 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.114118 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"]
Mar 20 13:49:26 crc kubenswrapper[4973]: E0320 13:49:26.114706 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerName="aodh-listener"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.114724 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerName="aodh-listener"
Mar 20 13:49:26 crc kubenswrapper[4973]: E0320 13:49:26.114754 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerName="ceilometer-notification-agent"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.114761 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerName="ceilometer-notification-agent"
Mar 20 13:49:26 crc kubenswrapper[4973]: E0320 13:49:26.114772 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerName="aodh-notifier"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.114778 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerName="aodh-notifier"
Mar 20 13:49:26 crc kubenswrapper[4973]: E0320 13:49:26.114799 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerName="sg-core"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.114807 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerName="sg-core"
Mar 20 13:49:26 crc kubenswrapper[4973]: E0320 13:49:26.114814 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerName="proxy-httpd"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.114822 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerName="proxy-httpd"
Mar 20 13:49:26 crc kubenswrapper[4973]: E0320 13:49:26.114842 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerName="aodh-api"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.114847 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerName="aodh-api"
Mar 20 13:49:26 crc kubenswrapper[4973]: E0320 13:49:26.114855 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerName="ceilometer-central-agent"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.114860 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerName="ceilometer-central-agent"
Mar 20 13:49:26 crc kubenswrapper[4973]: E0320 13:49:26.114875 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerName="aodh-evaluator"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.114913 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerName="aodh-evaluator"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.115122 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerName="ceilometer-central-agent"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.115135 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerName="aodh-api"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.115151 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerName="sg-core"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.115161 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerName="aodh-listener"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.115172 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerName="ceilometer-notification-agent"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.115183 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" containerName="proxy-httpd"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.115194 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerName="aodh-notifier"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.115205 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" containerName="aodh-evaluator"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.117271 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.120261 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-x7tj8"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.120607 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.120822 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.121022 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.125793 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.133950 4973 scope.go:117] "RemoveContainer" containerID="db5af94fafe96b7e1e64b783aa04c6c31da69cb7d2ba7f14e326d73f7848dbf0"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.155166 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"]
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.172604 4973 scope.go:117] "RemoveContainer" containerID="b98f68d8ba06a959ce32737a7dec8b2c4bf1c9941812fc35ceae360bae8902e7"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.197752 4973 scope.go:117] "RemoveContainer" containerID="e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.215827 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dppmk\" (UniqueName: \"kubernetes.io/projected/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-kube-api-access-dppmk\") pod \"aodh-0\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " pod="openstack/aodh-0"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.215890 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-scripts\") pod \"aodh-0\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " pod="openstack/aodh-0"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.216045 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-config-data\") pod \"aodh-0\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " pod="openstack/aodh-0"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.216260 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " pod="openstack/aodh-0"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.216386 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-public-tls-certs\") pod \"aodh-0\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " pod="openstack/aodh-0"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.216479 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-internal-tls-certs\") pod \"aodh-0\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " pod="openstack/aodh-0"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.229232 4973 scope.go:117] "RemoveContainer" containerID="595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.253207 4973 scope.go:117] "RemoveContainer" containerID="e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.274091 4973 scope.go:117] "RemoveContainer" containerID="b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.297678 4973 scope.go:117] "RemoveContainer" containerID="e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4"
Mar 20 13:49:26 crc kubenswrapper[4973]: E0320 13:49:26.299641 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4\": container with ID starting with e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4 not found: ID does not exist" containerID="e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.299684 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4"} err="failed to get container status \"e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4\": rpc error: code = NotFound desc = could not find container \"e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4\": container with ID starting with e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4 not found: ID does not exist"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.299709 4973 scope.go:117] "RemoveContainer" containerID="595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47"
Mar 20 13:49:26 crc kubenswrapper[4973]: E0320 13:49:26.300196 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47\": container with ID starting with 595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47 not found: ID does not exist" containerID="595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.300234 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47"} err="failed to get container status \"595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47\": rpc error: code = NotFound desc = could not find container \"595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47\": container with ID starting with 595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47 not found: ID does not exist"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.300292 4973 scope.go:117] "RemoveContainer" containerID="e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32"
Mar 20 13:49:26 crc kubenswrapper[4973]: E0320 13:49:26.300552 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32\": container with ID starting with e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32 not found: ID does not exist" containerID="e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32"
Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.300571 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32"} err="failed to get container status \"e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32\": rpc error: code = NotFound desc = could not find container \"e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32\": container with ID starting with e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32 not found: ID does not
exist" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.300587 4973 scope.go:117] "RemoveContainer" containerID="b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707" Mar 20 13:49:26 crc kubenswrapper[4973]: E0320 13:49:26.300778 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707\": container with ID starting with b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707 not found: ID does not exist" containerID="b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.300794 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707"} err="failed to get container status \"b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707\": rpc error: code = NotFound desc = could not find container \"b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707\": container with ID starting with b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707 not found: ID does not exist" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.300807 4973 scope.go:117] "RemoveContainer" containerID="e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.301041 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4"} err="failed to get container status \"e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4\": rpc error: code = NotFound desc = could not find container \"e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4\": container with ID starting with e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4 not found: ID 
does not exist" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.301065 4973 scope.go:117] "RemoveContainer" containerID="595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.301383 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47"} err="failed to get container status \"595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47\": rpc error: code = NotFound desc = could not find container \"595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47\": container with ID starting with 595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47 not found: ID does not exist" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.301442 4973 scope.go:117] "RemoveContainer" containerID="e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.301760 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32"} err="failed to get container status \"e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32\": rpc error: code = NotFound desc = could not find container \"e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32\": container with ID starting with e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32 not found: ID does not exist" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.301782 4973 scope.go:117] "RemoveContainer" containerID="b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.302038 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707"} err="failed to get container 
status \"b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707\": rpc error: code = NotFound desc = could not find container \"b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707\": container with ID starting with b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707 not found: ID does not exist" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.302059 4973 scope.go:117] "RemoveContainer" containerID="e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.302448 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4"} err="failed to get container status \"e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4\": rpc error: code = NotFound desc = could not find container \"e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4\": container with ID starting with e18c22861fc939cf08d257bb298f915be70b256288ef86a5562790313a7de3f4 not found: ID does not exist" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.302475 4973 scope.go:117] "RemoveContainer" containerID="595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.302840 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47"} err="failed to get container status \"595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47\": rpc error: code = NotFound desc = could not find container \"595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47\": container with ID starting with 595892f958d5780238fb7cb0bc9b80f2125230df7da9404cb715895c23054e47 not found: ID does not exist" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.302862 4973 scope.go:117] "RemoveContainer" 
containerID="e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.303111 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32"} err="failed to get container status \"e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32\": rpc error: code = NotFound desc = could not find container \"e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32\": container with ID starting with e595e040c808c6ff1272f2e1711d8738290168c21ec237122d957555b4c6ee32 not found: ID does not exist" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.303132 4973 scope.go:117] "RemoveContainer" containerID="b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.303485 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707"} err="failed to get container status \"b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707\": rpc error: code = NotFound desc = could not find container \"b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707\": container with ID starting with b1823c54226a43a64ad87a2d7a4f4b1da16b5abed254b036cf6677a156280707 not found: ID does not exist" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.319193 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " pod="openstack/aodh-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.319258 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-public-tls-certs\") pod \"aodh-0\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " pod="openstack/aodh-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.319316 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-internal-tls-certs\") pod \"aodh-0\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " pod="openstack/aodh-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.319428 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dppmk\" (UniqueName: \"kubernetes.io/projected/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-kube-api-access-dppmk\") pod \"aodh-0\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " pod="openstack/aodh-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.319458 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-scripts\") pod \"aodh-0\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " pod="openstack/aodh-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.319507 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-config-data\") pod \"aodh-0\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " pod="openstack/aodh-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.322820 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-scripts\") pod \"aodh-0\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " pod="openstack/aodh-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.322842 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-internal-tls-certs\") pod \"aodh-0\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " pod="openstack/aodh-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.323157 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-combined-ca-bundle\") pod \"aodh-0\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " pod="openstack/aodh-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.323214 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-public-tls-certs\") pod \"aodh-0\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " pod="openstack/aodh-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.326841 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-config-data\") pod \"aodh-0\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " pod="openstack/aodh-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.337490 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dppmk\" (UniqueName: \"kubernetes.io/projected/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-kube-api-access-dppmk\") pod \"aodh-0\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " pod="openstack/aodh-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.456941 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.493423 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.511649 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.528543 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.531623 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.536494 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.536542 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.548066 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.636856 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649ff891-dde4-449f-bfed-c7029e328ebe-log-httpd\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.636924 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-config-data\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.637073 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649ff891-dde4-449f-bfed-c7029e328ebe-run-httpd\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.637121 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-scripts\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.637141 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.637199 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2r5xc\" (UniqueName: \"kubernetes.io/projected/649ff891-dde4-449f-bfed-c7029e328ebe-kube-api-access-2r5xc\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.637244 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.639501 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 
13:49:26.742069 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2r5xc\" (UniqueName: \"kubernetes.io/projected/649ff891-dde4-449f-bfed-c7029e328ebe-kube-api-access-2r5xc\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.742756 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.751470 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649ff891-dde4-449f-bfed-c7029e328ebe-log-httpd\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.751512 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-config-data\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.751681 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649ff891-dde4-449f-bfed-c7029e328ebe-run-httpd\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.751725 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-scripts\") pod \"ceilometer-0\" (UID: 
\"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.751742 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.752643 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649ff891-dde4-449f-bfed-c7029e328ebe-log-httpd\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.756788 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649ff891-dde4-449f-bfed-c7029e328ebe-run-httpd\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.757726 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.759374 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.762187 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-config-data\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.766278 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2r5xc\" (UniqueName: \"kubernetes.io/projected/649ff891-dde4-449f-bfed-c7029e328ebe-kube-api-access-2r5xc\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.767626 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-scripts\") pod \"ceilometer-0\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " pod="openstack/ceilometer-0" Mar 20 13:49:26 crc kubenswrapper[4973]: I0320 13:49:26.965975 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:49:27 crc kubenswrapper[4973]: I0320 13:49:27.166011 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 20 13:49:27 crc kubenswrapper[4973]: I0320 13:49:27.470214 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:49:27 crc kubenswrapper[4973]: I0320 13:49:27.626746 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:49:27 crc kubenswrapper[4973]: I0320 13:49:27.967063 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3" path="/var/lib/kubelet/pods/31eee5f2-b5c0-4ba8-8668-4f7f89b7e1b3/volumes" Mar 20 13:49:27 crc kubenswrapper[4973]: I0320 13:49:27.968273 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46" path="/var/lib/kubelet/pods/6cf232ba-9dbc-4c57-bbb0-5e3a5d8a0d46/volumes" Mar 20 13:49:27 crc 
kubenswrapper[4973]: I0320 13:49:27.998555 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.038008 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"649ff891-dde4-449f-bfed-c7029e328ebe","Type":"ContainerStarted","Data":"841dae495ceccc7b6c845b4178b2ec21d9e70d3fc58a572de652f7c4373dec60"} Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.042763 4973 generic.go:334] "Generic (PLEG): container finished" podID="ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05" containerID="6c8812e52715412db13ba8f6a0ba63eb6bfe9c306b51a4cbedb775c86856bff0" exitCode=0 Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.042831 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05","Type":"ContainerDied","Data":"6c8812e52715412db13ba8f6a0ba63eb6bfe9c306b51a4cbedb775c86856bff0"} Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.042915 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05","Type":"ContainerDied","Data":"32f5be99d9e354f6657f58f16da93c14cf4af07b269b3b6cf720ab9a4ceb4343"} Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.042942 4973 scope.go:117] "RemoveContainer" containerID="6c8812e52715412db13ba8f6a0ba63eb6bfe9c306b51a4cbedb775c86856bff0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.043101 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.046202 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6","Type":"ContainerStarted","Data":"1f5e3550820929c0cb49a37397e03409f9a72524e09dd349767ec75d6838e767"} Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.112162 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-config-data\") pod \"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\" (UID: \"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\") " Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.112409 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-combined-ca-bundle\") pod \"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\" (UID: \"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\") " Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.112456 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-logs\") pod \"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\" (UID: \"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\") " Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.112643 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9nbl\" (UniqueName: \"kubernetes.io/projected/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-kube-api-access-d9nbl\") pod \"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\" (UID: \"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05\") " Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.114464 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-logs" (OuterVolumeSpecName: "logs") pod 
"ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05" (UID: "ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.125686 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-kube-api-access-d9nbl" (OuterVolumeSpecName: "kube-api-access-d9nbl") pod "ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05" (UID: "ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05"). InnerVolumeSpecName "kube-api-access-d9nbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.149472 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-config-data" (OuterVolumeSpecName: "config-data") pod "ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05" (UID: "ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.160547 4973 scope.go:117] "RemoveContainer" containerID="5ad9bbce7c8f1f0ef5b02aaf29c6093ae73f35ddbeccacd26894369c485098ec" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.164134 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05" (UID: "ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.218509 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.218550 4973 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.218564 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9nbl\" (UniqueName: \"kubernetes.io/projected/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-kube-api-access-d9nbl\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.218579 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.243134 4973 scope.go:117] "RemoveContainer" containerID="6c8812e52715412db13ba8f6a0ba63eb6bfe9c306b51a4cbedb775c86856bff0" Mar 20 13:49:28 crc kubenswrapper[4973]: E0320 13:49:28.244239 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c8812e52715412db13ba8f6a0ba63eb6bfe9c306b51a4cbedb775c86856bff0\": container with ID starting with 6c8812e52715412db13ba8f6a0ba63eb6bfe9c306b51a4cbedb775c86856bff0 not found: ID does not exist" containerID="6c8812e52715412db13ba8f6a0ba63eb6bfe9c306b51a4cbedb775c86856bff0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.244273 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c8812e52715412db13ba8f6a0ba63eb6bfe9c306b51a4cbedb775c86856bff0"} 
err="failed to get container status \"6c8812e52715412db13ba8f6a0ba63eb6bfe9c306b51a4cbedb775c86856bff0\": rpc error: code = NotFound desc = could not find container \"6c8812e52715412db13ba8f6a0ba63eb6bfe9c306b51a4cbedb775c86856bff0\": container with ID starting with 6c8812e52715412db13ba8f6a0ba63eb6bfe9c306b51a4cbedb775c86856bff0 not found: ID does not exist" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.244294 4973 scope.go:117] "RemoveContainer" containerID="5ad9bbce7c8f1f0ef5b02aaf29c6093ae73f35ddbeccacd26894369c485098ec" Mar 20 13:49:28 crc kubenswrapper[4973]: E0320 13:49:28.244825 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ad9bbce7c8f1f0ef5b02aaf29c6093ae73f35ddbeccacd26894369c485098ec\": container with ID starting with 5ad9bbce7c8f1f0ef5b02aaf29c6093ae73f35ddbeccacd26894369c485098ec not found: ID does not exist" containerID="5ad9bbce7c8f1f0ef5b02aaf29c6093ae73f35ddbeccacd26894369c485098ec" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.244848 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ad9bbce7c8f1f0ef5b02aaf29c6093ae73f35ddbeccacd26894369c485098ec"} err="failed to get container status \"5ad9bbce7c8f1f0ef5b02aaf29c6093ae73f35ddbeccacd26894369c485098ec\": rpc error: code = NotFound desc = could not find container \"5ad9bbce7c8f1f0ef5b02aaf29c6093ae73f35ddbeccacd26894369c485098ec\": container with ID starting with 5ad9bbce7c8f1f0ef5b02aaf29c6093ae73f35ddbeccacd26894369c485098ec not found: ID does not exist" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.546976 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-q7czk" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.547016 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-q7czk" Mar 20 13:49:28 crc 
kubenswrapper[4973]: I0320 13:49:28.600897 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-q7czk" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.653619 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.669263 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.686552 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:49:28 crc kubenswrapper[4973]: E0320 13:49:28.687154 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05" containerName="nova-api-api" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.687172 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05" containerName="nova-api-api" Mar 20 13:49:28 crc kubenswrapper[4973]: E0320 13:49:28.687218 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05" containerName="nova-api-log" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.687226 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05" containerName="nova-api-log" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.687509 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05" containerName="nova-api-api" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.687535 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05" containerName="nova-api-log" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.689605 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.694711 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.695469 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.697689 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.747707 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.834590 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " pod="openstack/nova-api-0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.834643 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqncn\" (UniqueName: \"kubernetes.io/projected/4af3161a-157d-4346-ae60-2b68ff6ba9c0-kube-api-access-jqncn\") pod \"nova-api-0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " pod="openstack/nova-api-0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.834717 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " pod="openstack/nova-api-0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.834891 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-public-tls-certs\") pod \"nova-api-0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " pod="openstack/nova-api-0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.835089 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-config-data\") pod \"nova-api-0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " pod="openstack/nova-api-0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.835238 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4af3161a-157d-4346-ae60-2b68ff6ba9c0-logs\") pod \"nova-api-0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " pod="openstack/nova-api-0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.938607 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4af3161a-157d-4346-ae60-2b68ff6ba9c0-logs\") pod \"nova-api-0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " pod="openstack/nova-api-0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.938812 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " pod="openstack/nova-api-0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.938851 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqncn\" (UniqueName: \"kubernetes.io/projected/4af3161a-157d-4346-ae60-2b68ff6ba9c0-kube-api-access-jqncn\") pod \"nova-api-0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " pod="openstack/nova-api-0" Mar 20 13:49:28 
crc kubenswrapper[4973]: I0320 13:49:28.938895 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " pod="openstack/nova-api-0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.938946 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-public-tls-certs\") pod \"nova-api-0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " pod="openstack/nova-api-0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.939035 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-config-data\") pod \"nova-api-0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " pod="openstack/nova-api-0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.939300 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4af3161a-157d-4346-ae60-2b68ff6ba9c0-logs\") pod \"nova-api-0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " pod="openstack/nova-api-0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.948442 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-public-tls-certs\") pod \"nova-api-0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " pod="openstack/nova-api-0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.948503 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-config-data\") pod \"nova-api-0\" (UID: 
\"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " pod="openstack/nova-api-0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.948452 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " pod="openstack/nova-api-0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.949033 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " pod="openstack/nova-api-0" Mar 20 13:49:28 crc kubenswrapper[4973]: I0320 13:49:28.958004 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqncn\" (UniqueName: \"kubernetes.io/projected/4af3161a-157d-4346-ae60-2b68ff6ba9c0-kube-api-access-jqncn\") pod \"nova-api-0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " pod="openstack/nova-api-0" Mar 20 13:49:29 crc kubenswrapper[4973]: I0320 13:49:29.019108 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:49:29 crc kubenswrapper[4973]: I0320 13:49:29.066739 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6","Type":"ContainerStarted","Data":"0f6a751e8f966556681f7f43b659c9eb663f2754d4df283526610a95879c0d5a"} Mar 20 13:49:29 crc kubenswrapper[4973]: I0320 13:49:29.069133 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"649ff891-dde4-449f-bfed-c7029e328ebe","Type":"ContainerStarted","Data":"5d2b87fdaad05b05fdd6092e28e115ce487b6fbc98e93c9cc6f27216d12dd6d6"} Mar 20 13:49:29 crc kubenswrapper[4973]: I0320 13:49:29.647982 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:49:29 crc kubenswrapper[4973]: I0320 13:49:29.980593 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05" path="/var/lib/kubelet/pods/ee7b84f5-994c-4a8b-b7f3-a1ee6fad4d05/volumes" Mar 20 13:49:30 crc kubenswrapper[4973]: I0320 13:49:30.091857 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"649ff891-dde4-449f-bfed-c7029e328ebe","Type":"ContainerStarted","Data":"50ae74679fb80dbbe8c19df5a2b2a0c406e1c6b2cf3ef1d17ce9ad59f15e5c32"} Mar 20 13:49:30 crc kubenswrapper[4973]: I0320 13:49:30.097954 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6","Type":"ContainerStarted","Data":"c8eac8c778b02c5ab14e1bfc1ce1c5ecce5659d774e5631b2eb49413a548c16d"} Mar 20 13:49:30 crc kubenswrapper[4973]: I0320 13:49:30.107468 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4af3161a-157d-4346-ae60-2b68ff6ba9c0","Type":"ContainerStarted","Data":"ed27ff557459353fe05ad461bc9c5a248234f50f573851ce37fca9184f285058"} Mar 20 13:49:30 crc kubenswrapper[4973]: I0320 13:49:30.107519 4973 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4af3161a-157d-4346-ae60-2b68ff6ba9c0","Type":"ContainerStarted","Data":"ecf87b347dc2689afd2727b81e3254c1c83143e487e4c664d7399fc053816d07"} Mar 20 13:49:31 crc kubenswrapper[4973]: I0320 13:49:31.153031 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6","Type":"ContainerStarted","Data":"fef3925f94ce619168301d5782a0cd7e4d4781173dd9337f0878b67b117cb10a"} Mar 20 13:49:31 crc kubenswrapper[4973]: I0320 13:49:31.153413 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6","Type":"ContainerStarted","Data":"b75848de0126f937d24f5fb5efcc89558ec1b45db617456de33c25662975c984"} Mar 20 13:49:31 crc kubenswrapper[4973]: I0320 13:49:31.155909 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4af3161a-157d-4346-ae60-2b68ff6ba9c0","Type":"ContainerStarted","Data":"ce8a52152c265175a293d8665793b2e420125b4eed40254332bcdb56359132d5"} Mar 20 13:49:31 crc kubenswrapper[4973]: I0320 13:49:31.167866 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"649ff891-dde4-449f-bfed-c7029e328ebe","Type":"ContainerStarted","Data":"2dbbc6b3fce072faa069fd6a1167b2d76692d3e3764674c4225b3ed7ac9a3875"} Mar 20 13:49:31 crc kubenswrapper[4973]: I0320 13:49:31.182184 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.638118438 podStartE2EDuration="5.182167166s" podCreationTimestamp="2026-03-20 13:49:26 +0000 UTC" firstStartedPulling="2026-03-20 13:49:27.186499034 +0000 UTC m=+1687.930168768" lastFinishedPulling="2026-03-20 13:49:30.730547752 +0000 UTC m=+1691.474217496" observedRunningTime="2026-03-20 13:49:31.181296653 +0000 UTC m=+1691.924966417" watchObservedRunningTime="2026-03-20 13:49:31.182167166 +0000 UTC 
m=+1691.925836910" Mar 20 13:49:31 crc kubenswrapper[4973]: I0320 13:49:31.222949 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.2229311 podStartE2EDuration="3.2229311s" podCreationTimestamp="2026-03-20 13:49:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:31.211359273 +0000 UTC m=+1691.955029017" watchObservedRunningTime="2026-03-20 13:49:31.2229311 +0000 UTC m=+1691.966600844" Mar 20 13:49:31 crc kubenswrapper[4973]: I0320 13:49:31.384561 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:49:31 crc kubenswrapper[4973]: I0320 13:49:31.473226 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-jh6ck"] Mar 20 13:49:31 crc kubenswrapper[4973]: I0320 13:49:31.473525 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" podUID="7369f281-2d8f-4609-b027-d5efa15e5567" containerName="dnsmasq-dns" containerID="cri-o://4c68a8e2b0d801372785724d8bbcb0d7b739bfe828ffdce539d96659261e782b" gracePeriod=10 Mar 20 13:49:31 crc kubenswrapper[4973]: I0320 13:49:31.639446 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:49:31 crc kubenswrapper[4973]: I0320 13:49:31.682956 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.137918 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.185500 4973 generic.go:334] "Generic (PLEG): container finished" podID="7369f281-2d8f-4609-b027-d5efa15e5567" containerID="4c68a8e2b0d801372785724d8bbcb0d7b739bfe828ffdce539d96659261e782b" exitCode=0 Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.185582 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.185667 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" event={"ID":"7369f281-2d8f-4609-b027-d5efa15e5567","Type":"ContainerDied","Data":"4c68a8e2b0d801372785724d8bbcb0d7b739bfe828ffdce539d96659261e782b"} Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.185754 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5fbc4d444f-jh6ck" event={"ID":"7369f281-2d8f-4609-b027-d5efa15e5567","Type":"ContainerDied","Data":"e8cf3b3679f01de833657f63a0d5f04df9c77bf17775cc353454d8d4b18f3cfa"} Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.185782 4973 scope.go:117] "RemoveContainer" containerID="4c68a8e2b0d801372785724d8bbcb0d7b739bfe828ffdce539d96659261e782b" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.217941 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.277671 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-dns-svc\") pod \"7369f281-2d8f-4609-b027-d5efa15e5567\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.277733 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-ovsdbserver-nb\") pod \"7369f281-2d8f-4609-b027-d5efa15e5567\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.277914 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-dns-swift-storage-0\") pod \"7369f281-2d8f-4609-b027-d5efa15e5567\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.277945 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-ovsdbserver-sb\") pod \"7369f281-2d8f-4609-b027-d5efa15e5567\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.278019 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wl4bz\" (UniqueName: \"kubernetes.io/projected/7369f281-2d8f-4609-b027-d5efa15e5567-kube-api-access-wl4bz\") pod \"7369f281-2d8f-4609-b027-d5efa15e5567\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.278079 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-config\") pod \"7369f281-2d8f-4609-b027-d5efa15e5567\" (UID: \"7369f281-2d8f-4609-b027-d5efa15e5567\") " Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.279459 4973 scope.go:117] "RemoveContainer" containerID="493fb2840749d3c1b41b27aeb579cd78973c04df87c08d5d149a143a7b082093" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.314566 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/7369f281-2d8f-4609-b027-d5efa15e5567-kube-api-access-wl4bz" (OuterVolumeSpecName: "kube-api-access-wl4bz") pod "7369f281-2d8f-4609-b027-d5efa15e5567" (UID: "7369f281-2d8f-4609-b027-d5efa15e5567"). InnerVolumeSpecName "kube-api-access-wl4bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.381371 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wl4bz\" (UniqueName: \"kubernetes.io/projected/7369f281-2d8f-4609-b027-d5efa15e5567-kube-api-access-wl4bz\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.513440 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-7tq9v"] Mar 20 13:49:32 crc kubenswrapper[4973]: E0320 13:49:32.514312 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7369f281-2d8f-4609-b027-d5efa15e5567" containerName="init" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.514365 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="7369f281-2d8f-4609-b027-d5efa15e5567" containerName="init" Mar 20 13:49:32 crc kubenswrapper[4973]: E0320 13:49:32.514400 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7369f281-2d8f-4609-b027-d5efa15e5567" containerName="dnsmasq-dns" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.514409 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="7369f281-2d8f-4609-b027-d5efa15e5567" containerName="dnsmasq-dns" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.514880 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="7369f281-2d8f-4609-b027-d5efa15e5567" containerName="dnsmasq-dns" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.516406 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7tq9v" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.524170 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.524475 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.533660 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7369f281-2d8f-4609-b027-d5efa15e5567" (UID: "7369f281-2d8f-4609-b027-d5efa15e5567"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.545070 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7369f281-2d8f-4609-b027-d5efa15e5567" (UID: "7369f281-2d8f-4609-b027-d5efa15e5567"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.561784 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7tq9v"] Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.566077 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-config" (OuterVolumeSpecName: "config") pod "7369f281-2d8f-4609-b027-d5efa15e5567" (UID: "7369f281-2d8f-4609-b027-d5efa15e5567"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.568517 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7369f281-2d8f-4609-b027-d5efa15e5567" (UID: "7369f281-2d8f-4609-b027-d5efa15e5567"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.582693 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7369f281-2d8f-4609-b027-d5efa15e5567" (UID: "7369f281-2d8f-4609-b027-d5efa15e5567"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.605862 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87d2fb90-f8cc-4942-b86b-457b21b9790d-scripts\") pod \"nova-cell1-cell-mapping-7tq9v\" (UID: \"87d2fb90-f8cc-4942-b86b-457b21b9790d\") " pod="openstack/nova-cell1-cell-mapping-7tq9v" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.605935 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87d2fb90-f8cc-4942-b86b-457b21b9790d-config-data\") pod \"nova-cell1-cell-mapping-7tq9v\" (UID: \"87d2fb90-f8cc-4942-b86b-457b21b9790d\") " pod="openstack/nova-cell1-cell-mapping-7tq9v" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.606028 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/87d2fb90-f8cc-4942-b86b-457b21b9790d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7tq9v\" (UID: \"87d2fb90-f8cc-4942-b86b-457b21b9790d\") " pod="openstack/nova-cell1-cell-mapping-7tq9v" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.606073 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb6j9\" (UniqueName: \"kubernetes.io/projected/87d2fb90-f8cc-4942-b86b-457b21b9790d-kube-api-access-wb6j9\") pod \"nova-cell1-cell-mapping-7tq9v\" (UID: \"87d2fb90-f8cc-4942-b86b-457b21b9790d\") " pod="openstack/nova-cell1-cell-mapping-7tq9v" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.606251 4973 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.606262 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.606275 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.606283 4973 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.606291 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7369f281-2d8f-4609-b027-d5efa15e5567-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:32 crc 
kubenswrapper[4973]: I0320 13:49:32.709279 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87d2fb90-f8cc-4942-b86b-457b21b9790d-scripts\") pod \"nova-cell1-cell-mapping-7tq9v\" (UID: \"87d2fb90-f8cc-4942-b86b-457b21b9790d\") " pod="openstack/nova-cell1-cell-mapping-7tq9v" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.709371 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87d2fb90-f8cc-4942-b86b-457b21b9790d-config-data\") pod \"nova-cell1-cell-mapping-7tq9v\" (UID: \"87d2fb90-f8cc-4942-b86b-457b21b9790d\") " pod="openstack/nova-cell1-cell-mapping-7tq9v" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.709460 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87d2fb90-f8cc-4942-b86b-457b21b9790d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7tq9v\" (UID: \"87d2fb90-f8cc-4942-b86b-457b21b9790d\") " pod="openstack/nova-cell1-cell-mapping-7tq9v" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.709488 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wb6j9\" (UniqueName: \"kubernetes.io/projected/87d2fb90-f8cc-4942-b86b-457b21b9790d-kube-api-access-wb6j9\") pod \"nova-cell1-cell-mapping-7tq9v\" (UID: \"87d2fb90-f8cc-4942-b86b-457b21b9790d\") " pod="openstack/nova-cell1-cell-mapping-7tq9v" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.720244 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87d2fb90-f8cc-4942-b86b-457b21b9790d-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-7tq9v\" (UID: \"87d2fb90-f8cc-4942-b86b-457b21b9790d\") " pod="openstack/nova-cell1-cell-mapping-7tq9v" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.720889 4973 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87d2fb90-f8cc-4942-b86b-457b21b9790d-config-data\") pod \"nova-cell1-cell-mapping-7tq9v\" (UID: \"87d2fb90-f8cc-4942-b86b-457b21b9790d\") " pod="openstack/nova-cell1-cell-mapping-7tq9v" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.721450 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87d2fb90-f8cc-4942-b86b-457b21b9790d-scripts\") pod \"nova-cell1-cell-mapping-7tq9v\" (UID: \"87d2fb90-f8cc-4942-b86b-457b21b9790d\") " pod="openstack/nova-cell1-cell-mapping-7tq9v" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.733903 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb6j9\" (UniqueName: \"kubernetes.io/projected/87d2fb90-f8cc-4942-b86b-457b21b9790d-kube-api-access-wb6j9\") pod \"nova-cell1-cell-mapping-7tq9v\" (UID: \"87d2fb90-f8cc-4942-b86b-457b21b9790d\") " pod="openstack/nova-cell1-cell-mapping-7tq9v" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.758140 4973 scope.go:117] "RemoveContainer" containerID="4c68a8e2b0d801372785724d8bbcb0d7b739bfe828ffdce539d96659261e782b" Mar 20 13:49:32 crc kubenswrapper[4973]: E0320 13:49:32.760165 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c68a8e2b0d801372785724d8bbcb0d7b739bfe828ffdce539d96659261e782b\": container with ID starting with 4c68a8e2b0d801372785724d8bbcb0d7b739bfe828ffdce539d96659261e782b not found: ID does not exist" containerID="4c68a8e2b0d801372785724d8bbcb0d7b739bfe828ffdce539d96659261e782b" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.760403 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c68a8e2b0d801372785724d8bbcb0d7b739bfe828ffdce539d96659261e782b"} err="failed to get container status 
\"4c68a8e2b0d801372785724d8bbcb0d7b739bfe828ffdce539d96659261e782b\": rpc error: code = NotFound desc = could not find container \"4c68a8e2b0d801372785724d8bbcb0d7b739bfe828ffdce539d96659261e782b\": container with ID starting with 4c68a8e2b0d801372785724d8bbcb0d7b739bfe828ffdce539d96659261e782b not found: ID does not exist" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.760438 4973 scope.go:117] "RemoveContainer" containerID="493fb2840749d3c1b41b27aeb579cd78973c04df87c08d5d149a143a7b082093" Mar 20 13:49:32 crc kubenswrapper[4973]: E0320 13:49:32.760902 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"493fb2840749d3c1b41b27aeb579cd78973c04df87c08d5d149a143a7b082093\": container with ID starting with 493fb2840749d3c1b41b27aeb579cd78973c04df87c08d5d149a143a7b082093 not found: ID does not exist" containerID="493fb2840749d3c1b41b27aeb579cd78973c04df87c08d5d149a143a7b082093" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.760965 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"493fb2840749d3c1b41b27aeb579cd78973c04df87c08d5d149a143a7b082093"} err="failed to get container status \"493fb2840749d3c1b41b27aeb579cd78973c04df87c08d5d149a143a7b082093\": rpc error: code = NotFound desc = could not find container \"493fb2840749d3c1b41b27aeb579cd78973c04df87c08d5d149a143a7b082093\": container with ID starting with 493fb2840749d3c1b41b27aeb579cd78973c04df87c08d5d149a143a7b082093 not found: ID does not exist" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.769664 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7tq9v" Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.867323 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-jh6ck"] Mar 20 13:49:32 crc kubenswrapper[4973]: I0320 13:49:32.904567 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5fbc4d444f-jh6ck"] Mar 20 13:49:33 crc kubenswrapper[4973]: I0320 13:49:33.206846 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"649ff891-dde4-449f-bfed-c7029e328ebe","Type":"ContainerStarted","Data":"0695857d35eb78ba2a16a9ecf72d45eac97dc31efae5c3e930ee86d0ca62de63"} Mar 20 13:49:33 crc kubenswrapper[4973]: I0320 13:49:33.207463 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="649ff891-dde4-449f-bfed-c7029e328ebe" containerName="ceilometer-central-agent" containerID="cri-o://5d2b87fdaad05b05fdd6092e28e115ce487b6fbc98e93c9cc6f27216d12dd6d6" gracePeriod=30 Mar 20 13:49:33 crc kubenswrapper[4973]: I0320 13:49:33.207628 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:49:33 crc kubenswrapper[4973]: I0320 13:49:33.207658 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="649ff891-dde4-449f-bfed-c7029e328ebe" containerName="proxy-httpd" containerID="cri-o://0695857d35eb78ba2a16a9ecf72d45eac97dc31efae5c3e930ee86d0ca62de63" gracePeriod=30 Mar 20 13:49:33 crc kubenswrapper[4973]: I0320 13:49:33.207747 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="649ff891-dde4-449f-bfed-c7029e328ebe" containerName="sg-core" containerID="cri-o://2dbbc6b3fce072faa069fd6a1167b2d76692d3e3764674c4225b3ed7ac9a3875" gracePeriod=30 Mar 20 13:49:33 crc kubenswrapper[4973]: I0320 13:49:33.207851 4973 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openstack/ceilometer-0" podUID="649ff891-dde4-449f-bfed-c7029e328ebe" containerName="ceilometer-notification-agent" containerID="cri-o://50ae74679fb80dbbe8c19df5a2b2a0c406e1c6b2cf3ef1d17ce9ad59f15e5c32" gracePeriod=30 Mar 20 13:49:33 crc kubenswrapper[4973]: I0320 13:49:33.235142 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.289611653 podStartE2EDuration="7.235119998s" podCreationTimestamp="2026-03-20 13:49:26 +0000 UTC" firstStartedPulling="2026-03-20 13:49:27.674565304 +0000 UTC m=+1688.418235048" lastFinishedPulling="2026-03-20 13:49:32.620073659 +0000 UTC m=+1693.363743393" observedRunningTime="2026-03-20 13:49:33.231397196 +0000 UTC m=+1693.975066940" watchObservedRunningTime="2026-03-20 13:49:33.235119998 +0000 UTC m=+1693.978789742" Mar 20 13:49:33 crc kubenswrapper[4973]: I0320 13:49:33.445402 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-7tq9v"] Mar 20 13:49:33 crc kubenswrapper[4973]: W0320 13:49:33.451671 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87d2fb90_f8cc_4942_b86b_457b21b9790d.slice/crio-60973e90d0c0260854672df8c69b961f579c7924cc0b11de43210b81ddddd80b WatchSource:0}: Error finding container 60973e90d0c0260854672df8c69b961f579c7924cc0b11de43210b81ddddd80b: Status 404 returned error can't find the container with id 60973e90d0c0260854672df8c69b961f579c7924cc0b11de43210b81ddddd80b Mar 20 13:49:33 crc kubenswrapper[4973]: I0320 13:49:33.965625 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7369f281-2d8f-4609-b027-d5efa15e5567" path="/var/lib/kubelet/pods/7369f281-2d8f-4609-b027-d5efa15e5567/volumes" Mar 20 13:49:34 crc kubenswrapper[4973]: I0320 13:49:34.243450 4973 generic.go:334] "Generic (PLEG): container finished" podID="649ff891-dde4-449f-bfed-c7029e328ebe" 
containerID="2dbbc6b3fce072faa069fd6a1167b2d76692d3e3764674c4225b3ed7ac9a3875" exitCode=2 Mar 20 13:49:34 crc kubenswrapper[4973]: I0320 13:49:34.243820 4973 generic.go:334] "Generic (PLEG): container finished" podID="649ff891-dde4-449f-bfed-c7029e328ebe" containerID="50ae74679fb80dbbe8c19df5a2b2a0c406e1c6b2cf3ef1d17ce9ad59f15e5c32" exitCode=0 Mar 20 13:49:34 crc kubenswrapper[4973]: I0320 13:49:34.243520 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"649ff891-dde4-449f-bfed-c7029e328ebe","Type":"ContainerDied","Data":"2dbbc6b3fce072faa069fd6a1167b2d76692d3e3764674c4225b3ed7ac9a3875"} Mar 20 13:49:34 crc kubenswrapper[4973]: I0320 13:49:34.243872 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"649ff891-dde4-449f-bfed-c7029e328ebe","Type":"ContainerDied","Data":"50ae74679fb80dbbe8c19df5a2b2a0c406e1c6b2cf3ef1d17ce9ad59f15e5c32"} Mar 20 13:49:34 crc kubenswrapper[4973]: I0320 13:49:34.253441 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7tq9v" event={"ID":"87d2fb90-f8cc-4942-b86b-457b21b9790d","Type":"ContainerStarted","Data":"ad9e3dedb9addd4bbdb9a1ca6d267f476bbd2183f98c5b6606156a79252a673a"} Mar 20 13:49:34 crc kubenswrapper[4973]: I0320 13:49:34.253495 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7tq9v" event={"ID":"87d2fb90-f8cc-4942-b86b-457b21b9790d","Type":"ContainerStarted","Data":"60973e90d0c0260854672df8c69b961f579c7924cc0b11de43210b81ddddd80b"} Mar 20 13:49:34 crc kubenswrapper[4973]: I0320 13:49:34.281139 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-7tq9v" podStartSLOduration=2.281113607 podStartE2EDuration="2.281113607s" podCreationTimestamp="2026-03-20 13:49:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 13:49:34.270981771 +0000 UTC m=+1695.014651535" watchObservedRunningTime="2026-03-20 13:49:34.281113607 +0000 UTC m=+1695.024783351" Mar 20 13:49:34 crc kubenswrapper[4973]: I0320 13:49:34.950713 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:49:34 crc kubenswrapper[4973]: E0320 13:49:34.951056 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 13:49:38 crc kubenswrapper[4973]: I0320 13:49:38.298660 4973 generic.go:334] "Generic (PLEG): container finished" podID="649ff891-dde4-449f-bfed-c7029e328ebe" containerID="5d2b87fdaad05b05fdd6092e28e115ce487b6fbc98e93c9cc6f27216d12dd6d6" exitCode=0 Mar 20 13:49:38 crc kubenswrapper[4973]: I0320 13:49:38.298744 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"649ff891-dde4-449f-bfed-c7029e328ebe","Type":"ContainerDied","Data":"5d2b87fdaad05b05fdd6092e28e115ce487b6fbc98e93c9cc6f27216d12dd6d6"} Mar 20 13:49:38 crc kubenswrapper[4973]: I0320 13:49:38.592292 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-q7czk" Mar 20 13:49:38 crc kubenswrapper[4973]: I0320 13:49:38.653538 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q7czk"] Mar 20 13:49:39 crc kubenswrapper[4973]: I0320 13:49:39.020382 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:49:39 crc kubenswrapper[4973]: I0320 13:49:39.020967 4973 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:49:39 crc kubenswrapper[4973]: I0320 13:49:39.309588 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-q7czk" podUID="07cb9303-0be6-4603-b71d-718a79aa18c3" containerName="registry-server" containerID="cri-o://1b433332a960e33766a761f3b2c26caee6571903a2ad69a4dcd3e42e05f7db2c" gracePeriod=2 Mar 20 13:49:39 crc kubenswrapper[4973]: I0320 13:49:39.878083 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-q7czk" Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.015067 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24wr2\" (UniqueName: \"kubernetes.io/projected/07cb9303-0be6-4603-b71d-718a79aa18c3-kube-api-access-24wr2\") pod \"07cb9303-0be6-4603-b71d-718a79aa18c3\" (UID: \"07cb9303-0be6-4603-b71d-718a79aa18c3\") " Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.015208 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07cb9303-0be6-4603-b71d-718a79aa18c3-catalog-content\") pod \"07cb9303-0be6-4603-b71d-718a79aa18c3\" (UID: \"07cb9303-0be6-4603-b71d-718a79aa18c3\") " Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.015617 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07cb9303-0be6-4603-b71d-718a79aa18c3-utilities\") pod \"07cb9303-0be6-4603-b71d-718a79aa18c3\" (UID: \"07cb9303-0be6-4603-b71d-718a79aa18c3\") " Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.016086 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07cb9303-0be6-4603-b71d-718a79aa18c3-utilities" (OuterVolumeSpecName: "utilities") pod 
"07cb9303-0be6-4603-b71d-718a79aa18c3" (UID: "07cb9303-0be6-4603-b71d-718a79aa18c3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.016630 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/07cb9303-0be6-4603-b71d-718a79aa18c3-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.021742 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07cb9303-0be6-4603-b71d-718a79aa18c3-kube-api-access-24wr2" (OuterVolumeSpecName: "kube-api-access-24wr2") pod "07cb9303-0be6-4603-b71d-718a79aa18c3" (UID: "07cb9303-0be6-4603-b71d-718a79aa18c3"). InnerVolumeSpecName "kube-api-access-24wr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.037580 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4af3161a-157d-4346-ae60-2b68ff6ba9c0" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.15:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.037841 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4af3161a-157d-4346-ae60-2b68ff6ba9c0" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.15:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.075223 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/07cb9303-0be6-4603-b71d-718a79aa18c3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "07cb9303-0be6-4603-b71d-718a79aa18c3" (UID: "07cb9303-0be6-4603-b71d-718a79aa18c3"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.119035 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-24wr2\" (UniqueName: \"kubernetes.io/projected/07cb9303-0be6-4603-b71d-718a79aa18c3-kube-api-access-24wr2\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.119067 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/07cb9303-0be6-4603-b71d-718a79aa18c3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.321043 4973 generic.go:334] "Generic (PLEG): container finished" podID="07cb9303-0be6-4603-b71d-718a79aa18c3" containerID="1b433332a960e33766a761f3b2c26caee6571903a2ad69a4dcd3e42e05f7db2c" exitCode=0 Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.321120 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7czk" event={"ID":"07cb9303-0be6-4603-b71d-718a79aa18c3","Type":"ContainerDied","Data":"1b433332a960e33766a761f3b2c26caee6571903a2ad69a4dcd3e42e05f7db2c"} Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.321154 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-q7czk" event={"ID":"07cb9303-0be6-4603-b71d-718a79aa18c3","Type":"ContainerDied","Data":"bd6b44b1a08ef64d92200a76ba746a1a822f3f1cae4e34447d83d224bc9cb71e"} Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.321175 4973 scope.go:117] "RemoveContainer" containerID="1b433332a960e33766a761f3b2c26caee6571903a2ad69a4dcd3e42e05f7db2c" Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.321368 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-q7czk" Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.331102 4973 generic.go:334] "Generic (PLEG): container finished" podID="87d2fb90-f8cc-4942-b86b-457b21b9790d" containerID="ad9e3dedb9addd4bbdb9a1ca6d267f476bbd2183f98c5b6606156a79252a673a" exitCode=0 Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.331142 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7tq9v" event={"ID":"87d2fb90-f8cc-4942-b86b-457b21b9790d","Type":"ContainerDied","Data":"ad9e3dedb9addd4bbdb9a1ca6d267f476bbd2183f98c5b6606156a79252a673a"} Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.371683 4973 scope.go:117] "RemoveContainer" containerID="0ac33e587c2ea99ce0197f2c3bdbde8140f2b76bacd035f7a0cac81f18eb9493" Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.399422 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-q7czk"] Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.411762 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-q7czk"] Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.416752 4973 scope.go:117] "RemoveContainer" containerID="58d3f836ad8a882cc9d11e7d74573b65f6106122957a53908fa6f943141b4d23" Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.479730 4973 scope.go:117] "RemoveContainer" containerID="1b433332a960e33766a761f3b2c26caee6571903a2ad69a4dcd3e42e05f7db2c" Mar 20 13:49:40 crc kubenswrapper[4973]: E0320 13:49:40.480093 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b433332a960e33766a761f3b2c26caee6571903a2ad69a4dcd3e42e05f7db2c\": container with ID starting with 1b433332a960e33766a761f3b2c26caee6571903a2ad69a4dcd3e42e05f7db2c not found: ID does not exist" containerID="1b433332a960e33766a761f3b2c26caee6571903a2ad69a4dcd3e42e05f7db2c" Mar 
20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.480123 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b433332a960e33766a761f3b2c26caee6571903a2ad69a4dcd3e42e05f7db2c"} err="failed to get container status \"1b433332a960e33766a761f3b2c26caee6571903a2ad69a4dcd3e42e05f7db2c\": rpc error: code = NotFound desc = could not find container \"1b433332a960e33766a761f3b2c26caee6571903a2ad69a4dcd3e42e05f7db2c\": container with ID starting with 1b433332a960e33766a761f3b2c26caee6571903a2ad69a4dcd3e42e05f7db2c not found: ID does not exist" Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.480144 4973 scope.go:117] "RemoveContainer" containerID="0ac33e587c2ea99ce0197f2c3bdbde8140f2b76bacd035f7a0cac81f18eb9493" Mar 20 13:49:40 crc kubenswrapper[4973]: E0320 13:49:40.480468 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ac33e587c2ea99ce0197f2c3bdbde8140f2b76bacd035f7a0cac81f18eb9493\": container with ID starting with 0ac33e587c2ea99ce0197f2c3bdbde8140f2b76bacd035f7a0cac81f18eb9493 not found: ID does not exist" containerID="0ac33e587c2ea99ce0197f2c3bdbde8140f2b76bacd035f7a0cac81f18eb9493" Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.480490 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ac33e587c2ea99ce0197f2c3bdbde8140f2b76bacd035f7a0cac81f18eb9493"} err="failed to get container status \"0ac33e587c2ea99ce0197f2c3bdbde8140f2b76bacd035f7a0cac81f18eb9493\": rpc error: code = NotFound desc = could not find container \"0ac33e587c2ea99ce0197f2c3bdbde8140f2b76bacd035f7a0cac81f18eb9493\": container with ID starting with 0ac33e587c2ea99ce0197f2c3bdbde8140f2b76bacd035f7a0cac81f18eb9493 not found: ID does not exist" Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.480505 4973 scope.go:117] "RemoveContainer" 
containerID="58d3f836ad8a882cc9d11e7d74573b65f6106122957a53908fa6f943141b4d23" Mar 20 13:49:40 crc kubenswrapper[4973]: E0320 13:49:40.480753 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58d3f836ad8a882cc9d11e7d74573b65f6106122957a53908fa6f943141b4d23\": container with ID starting with 58d3f836ad8a882cc9d11e7d74573b65f6106122957a53908fa6f943141b4d23 not found: ID does not exist" containerID="58d3f836ad8a882cc9d11e7d74573b65f6106122957a53908fa6f943141b4d23" Mar 20 13:49:40 crc kubenswrapper[4973]: I0320 13:49:40.480780 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d3f836ad8a882cc9d11e7d74573b65f6106122957a53908fa6f943141b4d23"} err="failed to get container status \"58d3f836ad8a882cc9d11e7d74573b65f6106122957a53908fa6f943141b4d23\": rpc error: code = NotFound desc = could not find container \"58d3f836ad8a882cc9d11e7d74573b65f6106122957a53908fa6f943141b4d23\": container with ID starting with 58d3f836ad8a882cc9d11e7d74573b65f6106122957a53908fa6f943141b4d23 not found: ID does not exist" Mar 20 13:49:41 crc kubenswrapper[4973]: I0320 13:49:41.789605 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7tq9v" Mar 20 13:49:41 crc kubenswrapper[4973]: I0320 13:49:41.868956 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87d2fb90-f8cc-4942-b86b-457b21b9790d-config-data\") pod \"87d2fb90-f8cc-4942-b86b-457b21b9790d\" (UID: \"87d2fb90-f8cc-4942-b86b-457b21b9790d\") " Mar 20 13:49:41 crc kubenswrapper[4973]: I0320 13:49:41.869029 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87d2fb90-f8cc-4942-b86b-457b21b9790d-combined-ca-bundle\") pod \"87d2fb90-f8cc-4942-b86b-457b21b9790d\" (UID: \"87d2fb90-f8cc-4942-b86b-457b21b9790d\") " Mar 20 13:49:41 crc kubenswrapper[4973]: I0320 13:49:41.869251 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87d2fb90-f8cc-4942-b86b-457b21b9790d-scripts\") pod \"87d2fb90-f8cc-4942-b86b-457b21b9790d\" (UID: \"87d2fb90-f8cc-4942-b86b-457b21b9790d\") " Mar 20 13:49:41 crc kubenswrapper[4973]: I0320 13:49:41.869466 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wb6j9\" (UniqueName: \"kubernetes.io/projected/87d2fb90-f8cc-4942-b86b-457b21b9790d-kube-api-access-wb6j9\") pod \"87d2fb90-f8cc-4942-b86b-457b21b9790d\" (UID: \"87d2fb90-f8cc-4942-b86b-457b21b9790d\") " Mar 20 13:49:41 crc kubenswrapper[4973]: I0320 13:49:41.875277 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87d2fb90-f8cc-4942-b86b-457b21b9790d-kube-api-access-wb6j9" (OuterVolumeSpecName: "kube-api-access-wb6j9") pod "87d2fb90-f8cc-4942-b86b-457b21b9790d" (UID: "87d2fb90-f8cc-4942-b86b-457b21b9790d"). InnerVolumeSpecName "kube-api-access-wb6j9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:41 crc kubenswrapper[4973]: I0320 13:49:41.875438 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87d2fb90-f8cc-4942-b86b-457b21b9790d-scripts" (OuterVolumeSpecName: "scripts") pod "87d2fb90-f8cc-4942-b86b-457b21b9790d" (UID: "87d2fb90-f8cc-4942-b86b-457b21b9790d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:41 crc kubenswrapper[4973]: I0320 13:49:41.904993 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87d2fb90-f8cc-4942-b86b-457b21b9790d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87d2fb90-f8cc-4942-b86b-457b21b9790d" (UID: "87d2fb90-f8cc-4942-b86b-457b21b9790d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:41 crc kubenswrapper[4973]: I0320 13:49:41.906464 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87d2fb90-f8cc-4942-b86b-457b21b9790d-config-data" (OuterVolumeSpecName: "config-data") pod "87d2fb90-f8cc-4942-b86b-457b21b9790d" (UID: "87d2fb90-f8cc-4942-b86b-457b21b9790d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:41 crc kubenswrapper[4973]: I0320 13:49:41.964097 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07cb9303-0be6-4603-b71d-718a79aa18c3" path="/var/lib/kubelet/pods/07cb9303-0be6-4603-b71d-718a79aa18c3/volumes" Mar 20 13:49:41 crc kubenswrapper[4973]: I0320 13:49:41.972147 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87d2fb90-f8cc-4942-b86b-457b21b9790d-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:41 crc kubenswrapper[4973]: I0320 13:49:41.972389 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wb6j9\" (UniqueName: \"kubernetes.io/projected/87d2fb90-f8cc-4942-b86b-457b21b9790d-kube-api-access-wb6j9\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:41 crc kubenswrapper[4973]: I0320 13:49:41.972466 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87d2fb90-f8cc-4942-b86b-457b21b9790d-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:41 crc kubenswrapper[4973]: I0320 13:49:41.972531 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87d2fb90-f8cc-4942-b86b-457b21b9790d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:42 crc kubenswrapper[4973]: I0320 13:49:42.354431 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-7tq9v" event={"ID":"87d2fb90-f8cc-4942-b86b-457b21b9790d","Type":"ContainerDied","Data":"60973e90d0c0260854672df8c69b961f579c7924cc0b11de43210b81ddddd80b"} Mar 20 13:49:42 crc kubenswrapper[4973]: I0320 13:49:42.354793 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="60973e90d0c0260854672df8c69b961f579c7924cc0b11de43210b81ddddd80b" Mar 20 13:49:42 crc kubenswrapper[4973]: I0320 13:49:42.354491 4973 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-7tq9v" Mar 20 13:49:42 crc kubenswrapper[4973]: I0320 13:49:42.650461 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:49:42 crc kubenswrapper[4973]: I0320 13:49:42.650698 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ed74ec77-093e-49ba-97d4-4a84588a85d7" containerName="nova-scheduler-scheduler" containerID="cri-o://2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466" gracePeriod=30 Mar 20 13:49:42 crc kubenswrapper[4973]: I0320 13:49:42.668853 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:49:42 crc kubenswrapper[4973]: I0320 13:49:42.669142 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4af3161a-157d-4346-ae60-2b68ff6ba9c0" containerName="nova-api-log" containerID="cri-o://ed27ff557459353fe05ad461bc9c5a248234f50f573851ce37fca9184f285058" gracePeriod=30 Mar 20 13:49:42 crc kubenswrapper[4973]: I0320 13:49:42.669229 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="4af3161a-157d-4346-ae60-2b68ff6ba9c0" containerName="nova-api-api" containerID="cri-o://ce8a52152c265175a293d8665793b2e420125b4eed40254332bcdb56359132d5" gracePeriod=30 Mar 20 13:49:42 crc kubenswrapper[4973]: I0320 13:49:42.690867 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:49:42 crc kubenswrapper[4973]: I0320 13:49:42.691200 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5c1d1009-6e9d-4ad7-88d4-b00ecba35a63" containerName="nova-metadata-metadata" containerID="cri-o://28407e65fe1ec51655ffbd273f2bc22852ce76b206055770b09aaa9005715f0c" gracePeriod=30 Mar 20 13:49:42 crc kubenswrapper[4973]: I0320 
13:49:42.691265 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="5c1d1009-6e9d-4ad7-88d4-b00ecba35a63" containerName="nova-metadata-log" containerID="cri-o://e4c641f2198e8c988cc0ae5db424af35dc12916631b4160f4d62820ecbf30bbb" gracePeriod=30 Mar 20 13:49:43 crc kubenswrapper[4973]: I0320 13:49:43.369659 4973 generic.go:334] "Generic (PLEG): container finished" podID="4af3161a-157d-4346-ae60-2b68ff6ba9c0" containerID="ed27ff557459353fe05ad461bc9c5a248234f50f573851ce37fca9184f285058" exitCode=143 Mar 20 13:49:43 crc kubenswrapper[4973]: I0320 13:49:43.369746 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4af3161a-157d-4346-ae60-2b68ff6ba9c0","Type":"ContainerDied","Data":"ed27ff557459353fe05ad461bc9c5a248234f50f573851ce37fca9184f285058"} Mar 20 13:49:43 crc kubenswrapper[4973]: I0320 13:49:43.373283 4973 generic.go:334] "Generic (PLEG): container finished" podID="5c1d1009-6e9d-4ad7-88d4-b00ecba35a63" containerID="e4c641f2198e8c988cc0ae5db424af35dc12916631b4160f4d62820ecbf30bbb" exitCode=143 Mar 20 13:49:43 crc kubenswrapper[4973]: I0320 13:49:43.373332 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63","Type":"ContainerDied","Data":"e4c641f2198e8c988cc0ae5db424af35dc12916631b4160f4d62820ecbf30bbb"} Mar 20 13:49:45 crc kubenswrapper[4973]: E0320 13:49:45.913983 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466 is running failed: container process not found" containerID="2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:49:45 crc kubenswrapper[4973]: E0320 13:49:45.915871 4973 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466 is running failed: container process not found" containerID="2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:49:45 crc kubenswrapper[4973]: E0320 13:49:45.916180 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466 is running failed: container process not found" containerID="2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 20 13:49:45 crc kubenswrapper[4973]: E0320 13:49:45.916224 4973 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="ed74ec77-093e-49ba-97d4-4a84588a85d7" containerName="nova-scheduler-scheduler" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.147724 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.251141 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed74ec77-093e-49ba-97d4-4a84588a85d7-combined-ca-bundle\") pod \"ed74ec77-093e-49ba-97d4-4a84588a85d7\" (UID: \"ed74ec77-093e-49ba-97d4-4a84588a85d7\") " Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.251365 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2pbw\" (UniqueName: \"kubernetes.io/projected/ed74ec77-093e-49ba-97d4-4a84588a85d7-kube-api-access-d2pbw\") pod \"ed74ec77-093e-49ba-97d4-4a84588a85d7\" (UID: \"ed74ec77-093e-49ba-97d4-4a84588a85d7\") " Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.251501 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed74ec77-093e-49ba-97d4-4a84588a85d7-config-data\") pod \"ed74ec77-093e-49ba-97d4-4a84588a85d7\" (UID: \"ed74ec77-093e-49ba-97d4-4a84588a85d7\") " Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.305518 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed74ec77-093e-49ba-97d4-4a84588a85d7-kube-api-access-d2pbw" (OuterVolumeSpecName: "kube-api-access-d2pbw") pod "ed74ec77-093e-49ba-97d4-4a84588a85d7" (UID: "ed74ec77-093e-49ba-97d4-4a84588a85d7"). InnerVolumeSpecName "kube-api-access-d2pbw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.351520 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed74ec77-093e-49ba-97d4-4a84588a85d7-config-data" (OuterVolumeSpecName: "config-data") pod "ed74ec77-093e-49ba-97d4-4a84588a85d7" (UID: "ed74ec77-093e-49ba-97d4-4a84588a85d7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.354925 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed74ec77-093e-49ba-97d4-4a84588a85d7-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.354962 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2pbw\" (UniqueName: \"kubernetes.io/projected/ed74ec77-093e-49ba-97d4-4a84588a85d7-kube-api-access-d2pbw\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.365663 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed74ec77-093e-49ba-97d4-4a84588a85d7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed74ec77-093e-49ba-97d4-4a84588a85d7" (UID: "ed74ec77-093e-49ba-97d4-4a84588a85d7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.416657 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.425700 4973 generic.go:334] "Generic (PLEG): container finished" podID="ed74ec77-093e-49ba-97d4-4a84588a85d7" containerID="2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466" exitCode=0 Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.426209 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ed74ec77-093e-49ba-97d4-4a84588a85d7","Type":"ContainerDied","Data":"2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466"} Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.426323 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ed74ec77-093e-49ba-97d4-4a84588a85d7","Type":"ContainerDied","Data":"6ba6a9d8bf722dc219fba47a80601d10f716954b98c0d712e678b0f5fa46c343"} Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.426434 4973 scope.go:117] "RemoveContainer" containerID="2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.426634 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.447065 4973 generic.go:334] "Generic (PLEG): container finished" podID="5c1d1009-6e9d-4ad7-88d4-b00ecba35a63" containerID="28407e65fe1ec51655ffbd273f2bc22852ce76b206055770b09aaa9005715f0c" exitCode=0 Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.447179 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63","Type":"ContainerDied","Data":"28407e65fe1ec51655ffbd273f2bc22852ce76b206055770b09aaa9005715f0c"} Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.454783 4973 generic.go:334] "Generic (PLEG): container finished" podID="4af3161a-157d-4346-ae60-2b68ff6ba9c0" containerID="ce8a52152c265175a293d8665793b2e420125b4eed40254332bcdb56359132d5" exitCode=0 Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.454830 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4af3161a-157d-4346-ae60-2b68ff6ba9c0","Type":"ContainerDied","Data":"ce8a52152c265175a293d8665793b2e420125b4eed40254332bcdb56359132d5"} Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.454859 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4af3161a-157d-4346-ae60-2b68ff6ba9c0","Type":"ContainerDied","Data":"ecf87b347dc2689afd2727b81e3254c1c83143e487e4c664d7399fc053816d07"} Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.455786 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.456738 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4af3161a-157d-4346-ae60-2b68ff6ba9c0-logs\") pod \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.456824 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqncn\" (UniqueName: \"kubernetes.io/projected/4af3161a-157d-4346-ae60-2b68ff6ba9c0-kube-api-access-jqncn\") pod \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.456995 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-public-tls-certs\") pod \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.457040 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-combined-ca-bundle\") pod \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.457162 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-config-data\") pod \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.457255 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-internal-tls-certs\") pod \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\" (UID: \"4af3161a-157d-4346-ae60-2b68ff6ba9c0\") " Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.457965 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed74ec77-093e-49ba-97d4-4a84588a85d7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.461220 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4af3161a-157d-4346-ae60-2b68ff6ba9c0-logs" (OuterVolumeSpecName: "logs") pod "4af3161a-157d-4346-ae60-2b68ff6ba9c0" (UID: "4af3161a-157d-4346-ae60-2b68ff6ba9c0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.469025 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4af3161a-157d-4346-ae60-2b68ff6ba9c0-kube-api-access-jqncn" (OuterVolumeSpecName: "kube-api-access-jqncn") pod "4af3161a-157d-4346-ae60-2b68ff6ba9c0" (UID: "4af3161a-157d-4346-ae60-2b68ff6ba9c0"). InnerVolumeSpecName "kube-api-access-jqncn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.511020 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.529938 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-config-data" (OuterVolumeSpecName: "config-data") pod "4af3161a-157d-4346-ae60-2b68ff6ba9c0" (UID: "4af3161a-157d-4346-ae60-2b68ff6ba9c0"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.530136 4973 scope.go:117] "RemoveContainer" containerID="2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466" Mar 20 13:49:46 crc kubenswrapper[4973]: E0320 13:49:46.531856 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466\": container with ID starting with 2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466 not found: ID does not exist" containerID="2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.531968 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466"} err="failed to get container status \"2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466\": rpc error: code = NotFound desc = could not find container \"2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466\": container with ID starting with 2b4e390e5ee6861f903f24dfe75cfffa82edce8f2efeecc029efe7ba0d788466 not found: ID does not exist" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.532061 4973 scope.go:117] "RemoveContainer" containerID="ce8a52152c265175a293d8665793b2e420125b4eed40254332bcdb56359132d5" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.541601 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.559798 4973 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4af3161a-157d-4346-ae60-2b68ff6ba9c0-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.560747 4973 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-jqncn\" (UniqueName: \"kubernetes.io/projected/4af3161a-157d-4346-ae60-2b68ff6ba9c0-kube-api-access-jqncn\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.560844 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.566303 4973 scope.go:117] "RemoveContainer" containerID="ed27ff557459353fe05ad461bc9c5a248234f50f573851ce37fca9184f285058" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.569578 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:49:46 crc kubenswrapper[4973]: E0320 13:49:46.570215 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07cb9303-0be6-4603-b71d-718a79aa18c3" containerName="extract-content" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.570253 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="07cb9303-0be6-4603-b71d-718a79aa18c3" containerName="extract-content" Mar 20 13:49:46 crc kubenswrapper[4973]: E0320 13:49:46.570277 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed74ec77-093e-49ba-97d4-4a84588a85d7" containerName="nova-scheduler-scheduler" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.570286 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed74ec77-093e-49ba-97d4-4a84588a85d7" containerName="nova-scheduler-scheduler" Mar 20 13:49:46 crc kubenswrapper[4973]: E0320 13:49:46.570297 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07cb9303-0be6-4603-b71d-718a79aa18c3" containerName="extract-utilities" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.570306 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="07cb9303-0be6-4603-b71d-718a79aa18c3" containerName="extract-utilities" Mar 20 13:49:46 crc 
kubenswrapper[4973]: E0320 13:49:46.570321 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87d2fb90-f8cc-4942-b86b-457b21b9790d" containerName="nova-manage" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.570329 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="87d2fb90-f8cc-4942-b86b-457b21b9790d" containerName="nova-manage" Mar 20 13:49:46 crc kubenswrapper[4973]: E0320 13:49:46.570368 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af3161a-157d-4346-ae60-2b68ff6ba9c0" containerName="nova-api-api" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.570376 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af3161a-157d-4346-ae60-2b68ff6ba9c0" containerName="nova-api-api" Mar 20 13:49:46 crc kubenswrapper[4973]: E0320 13:49:46.570401 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="07cb9303-0be6-4603-b71d-718a79aa18c3" containerName="registry-server" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.570410 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="07cb9303-0be6-4603-b71d-718a79aa18c3" containerName="registry-server" Mar 20 13:49:46 crc kubenswrapper[4973]: E0320 13:49:46.570429 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4af3161a-157d-4346-ae60-2b68ff6ba9c0" containerName="nova-api-log" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.570437 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4af3161a-157d-4346-ae60-2b68ff6ba9c0" containerName="nova-api-log" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.570762 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed74ec77-093e-49ba-97d4-4a84588a85d7" containerName="nova-scheduler-scheduler" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.570797 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af3161a-157d-4346-ae60-2b68ff6ba9c0" containerName="nova-api-log" Mar 20 13:49:46 crc 
kubenswrapper[4973]: I0320 13:49:46.570816 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="4af3161a-157d-4346-ae60-2b68ff6ba9c0" containerName="nova-api-api" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.570836 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="87d2fb90-f8cc-4942-b86b-457b21b9790d" containerName="nova-manage" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.570855 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="07cb9303-0be6-4603-b71d-718a79aa18c3" containerName="registry-server" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.571911 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.573308 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.573994 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4af3161a-157d-4346-ae60-2b68ff6ba9c0" (UID: "4af3161a-157d-4346-ae60-2b68ff6ba9c0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.575469 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.585219 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.590958 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "4af3161a-157d-4346-ae60-2b68ff6ba9c0" (UID: "4af3161a-157d-4346-ae60-2b68ff6ba9c0"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.602091 4973 scope.go:117] "RemoveContainer" containerID="ce8a52152c265175a293d8665793b2e420125b4eed40254332bcdb56359132d5" Mar 20 13:49:46 crc kubenswrapper[4973]: E0320 13:49:46.602596 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce8a52152c265175a293d8665793b2e420125b4eed40254332bcdb56359132d5\": container with ID starting with ce8a52152c265175a293d8665793b2e420125b4eed40254332bcdb56359132d5 not found: ID does not exist" containerID="ce8a52152c265175a293d8665793b2e420125b4eed40254332bcdb56359132d5" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.602639 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce8a52152c265175a293d8665793b2e420125b4eed40254332bcdb56359132d5"} err="failed to get container status \"ce8a52152c265175a293d8665793b2e420125b4eed40254332bcdb56359132d5\": rpc error: code = NotFound desc = could not find container \"ce8a52152c265175a293d8665793b2e420125b4eed40254332bcdb56359132d5\": container with ID starting with 
ce8a52152c265175a293d8665793b2e420125b4eed40254332bcdb56359132d5 not found: ID does not exist" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.602670 4973 scope.go:117] "RemoveContainer" containerID="ed27ff557459353fe05ad461bc9c5a248234f50f573851ce37fca9184f285058" Mar 20 13:49:46 crc kubenswrapper[4973]: E0320 13:49:46.603039 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed27ff557459353fe05ad461bc9c5a248234f50f573851ce37fca9184f285058\": container with ID starting with ed27ff557459353fe05ad461bc9c5a248234f50f573851ce37fca9184f285058 not found: ID does not exist" containerID="ed27ff557459353fe05ad461bc9c5a248234f50f573851ce37fca9184f285058" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.603160 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed27ff557459353fe05ad461bc9c5a248234f50f573851ce37fca9184f285058"} err="failed to get container status \"ed27ff557459353fe05ad461bc9c5a248234f50f573851ce37fca9184f285058\": rpc error: code = NotFound desc = could not find container \"ed27ff557459353fe05ad461bc9c5a248234f50f573851ce37fca9184f285058\": container with ID starting with ed27ff557459353fe05ad461bc9c5a248234f50f573851ce37fca9184f285058 not found: ID does not exist" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.630869 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "4af3161a-157d-4346-ae60-2b68ff6ba9c0" (UID: "4af3161a-157d-4346-ae60-2b68ff6ba9c0"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.665328 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj7hc\" (UniqueName: \"kubernetes.io/projected/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-kube-api-access-rj7hc\") pod \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.665798 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-nova-metadata-tls-certs\") pod \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.666105 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-config-data\") pod \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.666302 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-logs\") pod \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.666622 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-combined-ca-bundle\") pod \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.666794 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-logs" (OuterVolumeSpecName: "logs") pod "5c1d1009-6e9d-4ad7-88d4-b00ecba35a63" (UID: "5c1d1009-6e9d-4ad7-88d4-b00ecba35a63"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.667534 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e31feddc-5c5c-4520-bd2e-52ab71a2e318-config-data\") pod \"nova-scheduler-0\" (UID: \"e31feddc-5c5c-4520-bd2e-52ab71a2e318\") " pod="openstack/nova-scheduler-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.667796 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkfr9\" (UniqueName: \"kubernetes.io/projected/e31feddc-5c5c-4520-bd2e-52ab71a2e318-kube-api-access-kkfr9\") pod \"nova-scheduler-0\" (UID: \"e31feddc-5c5c-4520-bd2e-52ab71a2e318\") " pod="openstack/nova-scheduler-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.668032 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e31feddc-5c5c-4520-bd2e-52ab71a2e318-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e31feddc-5c5c-4520-bd2e-52ab71a2e318\") " pod="openstack/nova-scheduler-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.668287 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-kube-api-access-rj7hc" (OuterVolumeSpecName: "kube-api-access-rj7hc") pod "5c1d1009-6e9d-4ad7-88d4-b00ecba35a63" (UID: "5c1d1009-6e9d-4ad7-88d4-b00ecba35a63"). InnerVolumeSpecName "kube-api-access-rj7hc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.668555 4973 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.668639 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj7hc\" (UniqueName: \"kubernetes.io/projected/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-kube-api-access-rj7hc\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.668720 4973 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.668794 4973 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-logs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.668914 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4af3161a-157d-4346-ae60-2b68ff6ba9c0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.708655 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-config-data" (OuterVolumeSpecName: "config-data") pod "5c1d1009-6e9d-4ad7-88d4-b00ecba35a63" (UID: "5c1d1009-6e9d-4ad7-88d4-b00ecba35a63"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4973]: E0320 13:49:46.724289 4973 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded74ec77_093e_49ba_97d4_4a84588a85d7.slice/crio-6ba6a9d8bf722dc219fba47a80601d10f716954b98c0d712e678b0f5fa46c343\": RecentStats: unable to find data in memory cache]" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.746827 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5c1d1009-6e9d-4ad7-88d4-b00ecba35a63" (UID: "5c1d1009-6e9d-4ad7-88d4-b00ecba35a63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.770237 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5c1d1009-6e9d-4ad7-88d4-b00ecba35a63" (UID: "5c1d1009-6e9d-4ad7-88d4-b00ecba35a63"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.771098 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-nova-metadata-tls-certs\") pod \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\" (UID: \"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63\") " Mar 20 13:49:46 crc kubenswrapper[4973]: W0320 13:49:46.771837 4973 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63/volumes/kubernetes.io~secret/nova-metadata-tls-certs Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.771851 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "5c1d1009-6e9d-4ad7-88d4-b00ecba35a63" (UID: "5c1d1009-6e9d-4ad7-88d4-b00ecba35a63"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.772116 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e31feddc-5c5c-4520-bd2e-52ab71a2e318-config-data\") pod \"nova-scheduler-0\" (UID: \"e31feddc-5c5c-4520-bd2e-52ab71a2e318\") " pod="openstack/nova-scheduler-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.772151 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkfr9\" (UniqueName: \"kubernetes.io/projected/e31feddc-5c5c-4520-bd2e-52ab71a2e318-kube-api-access-kkfr9\") pod \"nova-scheduler-0\" (UID: \"e31feddc-5c5c-4520-bd2e-52ab71a2e318\") " pod="openstack/nova-scheduler-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.772205 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e31feddc-5c5c-4520-bd2e-52ab71a2e318-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e31feddc-5c5c-4520-bd2e-52ab71a2e318\") " pod="openstack/nova-scheduler-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.772269 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.772284 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.772293 4973 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:49:46 crc 
kubenswrapper[4973]: I0320 13:49:46.776053 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e31feddc-5c5c-4520-bd2e-52ab71a2e318-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e31feddc-5c5c-4520-bd2e-52ab71a2e318\") " pod="openstack/nova-scheduler-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.778059 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e31feddc-5c5c-4520-bd2e-52ab71a2e318-config-data\") pod \"nova-scheduler-0\" (UID: \"e31feddc-5c5c-4520-bd2e-52ab71a2e318\") " pod="openstack/nova-scheduler-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.788565 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkfr9\" (UniqueName: \"kubernetes.io/projected/e31feddc-5c5c-4520-bd2e-52ab71a2e318-kube-api-access-kkfr9\") pod \"nova-scheduler-0\" (UID: \"e31feddc-5c5c-4520-bd2e-52ab71a2e318\") " pod="openstack/nova-scheduler-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.897207 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.918560 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.945188 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.958910 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 20 13:49:46 crc kubenswrapper[4973]: E0320 13:49:46.959534 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c1d1009-6e9d-4ad7-88d4-b00ecba35a63" containerName="nova-metadata-metadata" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.959546 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c1d1009-6e9d-4ad7-88d4-b00ecba35a63" containerName="nova-metadata-metadata" Mar 20 13:49:46 crc kubenswrapper[4973]: E0320 13:49:46.959567 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c1d1009-6e9d-4ad7-88d4-b00ecba35a63" containerName="nova-metadata-log" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.959574 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c1d1009-6e9d-4ad7-88d4-b00ecba35a63" containerName="nova-metadata-log" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.959836 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c1d1009-6e9d-4ad7-88d4-b00ecba35a63" containerName="nova-metadata-metadata" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.959856 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c1d1009-6e9d-4ad7-88d4-b00ecba35a63" containerName="nova-metadata-log" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.961283 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.965797 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.965895 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.965904 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.978120 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83d90a8e-827e-4720-9a8f-307311e2a6d9-public-tls-certs\") pod \"nova-api-0\" (UID: \"83d90a8e-827e-4720-9a8f-307311e2a6d9\") " pod="openstack/nova-api-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.978181 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d90a8e-827e-4720-9a8f-307311e2a6d9-config-data\") pod \"nova-api-0\" (UID: \"83d90a8e-827e-4720-9a8f-307311e2a6d9\") " pod="openstack/nova-api-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.978560 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8ncm\" (UniqueName: \"kubernetes.io/projected/83d90a8e-827e-4720-9a8f-307311e2a6d9-kube-api-access-x8ncm\") pod \"nova-api-0\" (UID: \"83d90a8e-827e-4720-9a8f-307311e2a6d9\") " pod="openstack/nova-api-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.978628 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83d90a8e-827e-4720-9a8f-307311e2a6d9-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"83d90a8e-827e-4720-9a8f-307311e2a6d9\") " pod="openstack/nova-api-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.978834 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d90a8e-827e-4720-9a8f-307311e2a6d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"83d90a8e-827e-4720-9a8f-307311e2a6d9\") " pod="openstack/nova-api-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.978897 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83d90a8e-827e-4720-9a8f-307311e2a6d9-logs\") pod \"nova-api-0\" (UID: \"83d90a8e-827e-4720-9a8f-307311e2a6d9\") " pod="openstack/nova-api-0" Mar 20 13:49:46 crc kubenswrapper[4973]: I0320 13:49:46.985694 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.081249 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d90a8e-827e-4720-9a8f-307311e2a6d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"83d90a8e-827e-4720-9a8f-307311e2a6d9\") " pod="openstack/nova-api-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.081637 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83d90a8e-827e-4720-9a8f-307311e2a6d9-logs\") pod \"nova-api-0\" (UID: \"83d90a8e-827e-4720-9a8f-307311e2a6d9\") " pod="openstack/nova-api-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.081866 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83d90a8e-827e-4720-9a8f-307311e2a6d9-public-tls-certs\") pod \"nova-api-0\" (UID: \"83d90a8e-827e-4720-9a8f-307311e2a6d9\") " pod="openstack/nova-api-0" Mar 20 
13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.081908 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d90a8e-827e-4720-9a8f-307311e2a6d9-config-data\") pod \"nova-api-0\" (UID: \"83d90a8e-827e-4720-9a8f-307311e2a6d9\") " pod="openstack/nova-api-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.082105 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8ncm\" (UniqueName: \"kubernetes.io/projected/83d90a8e-827e-4720-9a8f-307311e2a6d9-kube-api-access-x8ncm\") pod \"nova-api-0\" (UID: \"83d90a8e-827e-4720-9a8f-307311e2a6d9\") " pod="openstack/nova-api-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.082140 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83d90a8e-827e-4720-9a8f-307311e2a6d9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"83d90a8e-827e-4720-9a8f-307311e2a6d9\") " pod="openstack/nova-api-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.082672 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83d90a8e-827e-4720-9a8f-307311e2a6d9-logs\") pod \"nova-api-0\" (UID: \"83d90a8e-827e-4720-9a8f-307311e2a6d9\") " pod="openstack/nova-api-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.091085 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83d90a8e-827e-4720-9a8f-307311e2a6d9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"83d90a8e-827e-4720-9a8f-307311e2a6d9\") " pod="openstack/nova-api-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.091133 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83d90a8e-827e-4720-9a8f-307311e2a6d9-config-data\") pod \"nova-api-0\" 
(UID: \"83d90a8e-827e-4720-9a8f-307311e2a6d9\") " pod="openstack/nova-api-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.094186 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83d90a8e-827e-4720-9a8f-307311e2a6d9-internal-tls-certs\") pod \"nova-api-0\" (UID: \"83d90a8e-827e-4720-9a8f-307311e2a6d9\") " pod="openstack/nova-api-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.100177 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83d90a8e-827e-4720-9a8f-307311e2a6d9-public-tls-certs\") pod \"nova-api-0\" (UID: \"83d90a8e-827e-4720-9a8f-307311e2a6d9\") " pod="openstack/nova-api-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.110236 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8ncm\" (UniqueName: \"kubernetes.io/projected/83d90a8e-827e-4720-9a8f-307311e2a6d9-kube-api-access-x8ncm\") pod \"nova-api-0\" (UID: \"83d90a8e-827e-4720-9a8f-307311e2a6d9\") " pod="openstack/nova-api-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.305725 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.430139 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 20 13:49:47 crc kubenswrapper[4973]: W0320 13:49:47.436718 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode31feddc_5c5c_4520_bd2e_52ab71a2e318.slice/crio-fabf98e556cd29cef7e4f67bcc560ded7cc2493fc431f64f00eac9d97cc30afe WatchSource:0}: Error finding container fabf98e556cd29cef7e4f67bcc560ded7cc2493fc431f64f00eac9d97cc30afe: Status 404 returned error can't find the container with id fabf98e556cd29cef7e4f67bcc560ded7cc2493fc431f64f00eac9d97cc30afe Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.489313 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5c1d1009-6e9d-4ad7-88d4-b00ecba35a63","Type":"ContainerDied","Data":"c168e121ad495c90a32d5a31986a32aa1fdc13730ab6cc9c456a51d0da88fc66"} Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.489459 4973 scope.go:117] "RemoveContainer" containerID="28407e65fe1ec51655ffbd273f2bc22852ce76b206055770b09aaa9005715f0c" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.489618 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.494918 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e31feddc-5c5c-4520-bd2e-52ab71a2e318","Type":"ContainerStarted","Data":"fabf98e556cd29cef7e4f67bcc560ded7cc2493fc431f64f00eac9d97cc30afe"} Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.559621 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.582906 4973 scope.go:117] "RemoveContainer" containerID="e4c641f2198e8c988cc0ae5db424af35dc12916631b4160f4d62820ecbf30bbb" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.583076 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.600498 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.602785 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.609142 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.609537 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.629018 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.698436 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da8c0543-63bd-4817-90c7-8cc02e6ddd5d-config-data\") pod \"nova-metadata-0\" (UID: \"da8c0543-63bd-4817-90c7-8cc02e6ddd5d\") " pod="openstack/nova-metadata-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.698500 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/da8c0543-63bd-4817-90c7-8cc02e6ddd5d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"da8c0543-63bd-4817-90c7-8cc02e6ddd5d\") " pod="openstack/nova-metadata-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.698532 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da8c0543-63bd-4817-90c7-8cc02e6ddd5d-logs\") pod \"nova-metadata-0\" (UID: \"da8c0543-63bd-4817-90c7-8cc02e6ddd5d\") " pod="openstack/nova-metadata-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.698558 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rf2n\" (UniqueName: \"kubernetes.io/projected/da8c0543-63bd-4817-90c7-8cc02e6ddd5d-kube-api-access-7rf2n\") pod \"nova-metadata-0\" 
(UID: \"da8c0543-63bd-4817-90c7-8cc02e6ddd5d\") " pod="openstack/nova-metadata-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.698838 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da8c0543-63bd-4817-90c7-8cc02e6ddd5d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"da8c0543-63bd-4817-90c7-8cc02e6ddd5d\") " pod="openstack/nova-metadata-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.804514 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da8c0543-63bd-4817-90c7-8cc02e6ddd5d-config-data\") pod \"nova-metadata-0\" (UID: \"da8c0543-63bd-4817-90c7-8cc02e6ddd5d\") " pod="openstack/nova-metadata-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.804582 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/da8c0543-63bd-4817-90c7-8cc02e6ddd5d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"da8c0543-63bd-4817-90c7-8cc02e6ddd5d\") " pod="openstack/nova-metadata-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.804628 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da8c0543-63bd-4817-90c7-8cc02e6ddd5d-logs\") pod \"nova-metadata-0\" (UID: \"da8c0543-63bd-4817-90c7-8cc02e6ddd5d\") " pod="openstack/nova-metadata-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.804659 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rf2n\" (UniqueName: \"kubernetes.io/projected/da8c0543-63bd-4817-90c7-8cc02e6ddd5d-kube-api-access-7rf2n\") pod \"nova-metadata-0\" (UID: \"da8c0543-63bd-4817-90c7-8cc02e6ddd5d\") " pod="openstack/nova-metadata-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.809978 4973 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da8c0543-63bd-4817-90c7-8cc02e6ddd5d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"da8c0543-63bd-4817-90c7-8cc02e6ddd5d\") " pod="openstack/nova-metadata-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.811896 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.812455 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/da8c0543-63bd-4817-90c7-8cc02e6ddd5d-logs\") pod \"nova-metadata-0\" (UID: \"da8c0543-63bd-4817-90c7-8cc02e6ddd5d\") " pod="openstack/nova-metadata-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.816045 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da8c0543-63bd-4817-90c7-8cc02e6ddd5d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"da8c0543-63bd-4817-90c7-8cc02e6ddd5d\") " pod="openstack/nova-metadata-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.817056 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da8c0543-63bd-4817-90c7-8cc02e6ddd5d-config-data\") pod \"nova-metadata-0\" (UID: \"da8c0543-63bd-4817-90c7-8cc02e6ddd5d\") " pod="openstack/nova-metadata-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.843246 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rf2n\" (UniqueName: \"kubernetes.io/projected/da8c0543-63bd-4817-90c7-8cc02e6ddd5d-kube-api-access-7rf2n\") pod \"nova-metadata-0\" (UID: \"da8c0543-63bd-4817-90c7-8cc02e6ddd5d\") " pod="openstack/nova-metadata-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.858191 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/da8c0543-63bd-4817-90c7-8cc02e6ddd5d-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"da8c0543-63bd-4817-90c7-8cc02e6ddd5d\") " pod="openstack/nova-metadata-0" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.965757 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4af3161a-157d-4346-ae60-2b68ff6ba9c0" path="/var/lib/kubelet/pods/4af3161a-157d-4346-ae60-2b68ff6ba9c0/volumes" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.966709 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c1d1009-6e9d-4ad7-88d4-b00ecba35a63" path="/var/lib/kubelet/pods/5c1d1009-6e9d-4ad7-88d4-b00ecba35a63/volumes" Mar 20 13:49:47 crc kubenswrapper[4973]: I0320 13:49:47.967292 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed74ec77-093e-49ba-97d4-4a84588a85d7" path="/var/lib/kubelet/pods/ed74ec77-093e-49ba-97d4-4a84588a85d7/volumes" Mar 20 13:49:48 crc kubenswrapper[4973]: I0320 13:49:48.025987 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 20 13:49:48 crc kubenswrapper[4973]: I0320 13:49:48.490327 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 20 13:49:48 crc kubenswrapper[4973]: I0320 13:49:48.510319 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e31feddc-5c5c-4520-bd2e-52ab71a2e318","Type":"ContainerStarted","Data":"8b4f5ea18a0eb3d318c53833821f5a9b2c5e29e94724d424490605541758e09d"} Mar 20 13:49:48 crc kubenswrapper[4973]: I0320 13:49:48.513652 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83d90a8e-827e-4720-9a8f-307311e2a6d9","Type":"ContainerStarted","Data":"b65566ba8b115c4099e1a0e5afa21a72503bbe759da6799bd45352af60f6d6dc"} Mar 20 13:49:48 crc kubenswrapper[4973]: I0320 13:49:48.513682 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83d90a8e-827e-4720-9a8f-307311e2a6d9","Type":"ContainerStarted","Data":"560bd547f7c63fc2093b0186dfde2d5211db37b3444951b78df5f9a63b6f4e97"} Mar 20 13:49:48 crc kubenswrapper[4973]: I0320 13:49:48.513691 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83d90a8e-827e-4720-9a8f-307311e2a6d9","Type":"ContainerStarted","Data":"9f54eb8bd4f54ca7d6b0bf56f3d8c1a9ba1f359c8bb664e8db074e8fc539f653"} Mar 20 13:49:48 crc kubenswrapper[4973]: I0320 13:49:48.516823 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"da8c0543-63bd-4817-90c7-8cc02e6ddd5d","Type":"ContainerStarted","Data":"752022d8d154bd746b7ad79cc5fe9ad180b02920834c935d09b1cb61225748ff"} Mar 20 13:49:48 crc kubenswrapper[4973]: I0320 13:49:48.561522 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.561503443 podStartE2EDuration="2.561503443s" podCreationTimestamp="2026-03-20 13:49:46 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:48.533783216 +0000 UTC m=+1709.277452960" watchObservedRunningTime="2026-03-20 13:49:48.561503443 +0000 UTC m=+1709.305173187" Mar 20 13:49:48 crc kubenswrapper[4973]: I0320 13:49:48.579279 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.579251298 podStartE2EDuration="2.579251298s" podCreationTimestamp="2026-03-20 13:49:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:48.553948747 +0000 UTC m=+1709.297618491" watchObservedRunningTime="2026-03-20 13:49:48.579251298 +0000 UTC m=+1709.322921042" Mar 20 13:49:49 crc kubenswrapper[4973]: I0320 13:49:49.529503 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"da8c0543-63bd-4817-90c7-8cc02e6ddd5d","Type":"ContainerStarted","Data":"5e2dda5123009b640cf63c8722696ae7e96355c611d538d1fe965ee464db84dc"} Mar 20 13:49:49 crc kubenswrapper[4973]: I0320 13:49:49.530020 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"da8c0543-63bd-4817-90c7-8cc02e6ddd5d","Type":"ContainerStarted","Data":"4da8713e1cba4d672e33e700f43175e3f97cbbee5b8ca1b46b2648f77ceb9a16"} Mar 20 13:49:49 crc kubenswrapper[4973]: I0320 13:49:49.548814 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.548795519 podStartE2EDuration="2.548795519s" podCreationTimestamp="2026-03-20 13:49:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:49:49.546459824 +0000 UTC m=+1710.290129568" watchObservedRunningTime="2026-03-20 13:49:49.548795519 +0000 UTC m=+1710.292465263" Mar 20 13:49:49 
crc kubenswrapper[4973]: I0320 13:49:49.960396 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:49:49 crc kubenswrapper[4973]: E0320 13:49:49.960720 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 13:49:50 crc kubenswrapper[4973]: I0320 13:49:50.816005 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lz6zd"] Mar 20 13:49:50 crc kubenswrapper[4973]: I0320 13:49:50.833852 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lz6zd" Mar 20 13:49:50 crc kubenswrapper[4973]: I0320 13:49:50.843191 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lz6zd"] Mar 20 13:49:50 crc kubenswrapper[4973]: I0320 13:49:50.927630 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d554d58c-75bd-43a4-9886-ebfc513fdc6d-catalog-content\") pod \"community-operators-lz6zd\" (UID: \"d554d58c-75bd-43a4-9886-ebfc513fdc6d\") " pod="openshift-marketplace/community-operators-lz6zd" Mar 20 13:49:50 crc kubenswrapper[4973]: I0320 13:49:50.928273 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5lrq\" (UniqueName: \"kubernetes.io/projected/d554d58c-75bd-43a4-9886-ebfc513fdc6d-kube-api-access-d5lrq\") pod \"community-operators-lz6zd\" (UID: \"d554d58c-75bd-43a4-9886-ebfc513fdc6d\") " 
pod="openshift-marketplace/community-operators-lz6zd" Mar 20 13:49:50 crc kubenswrapper[4973]: I0320 13:49:50.928451 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d554d58c-75bd-43a4-9886-ebfc513fdc6d-utilities\") pod \"community-operators-lz6zd\" (UID: \"d554d58c-75bd-43a4-9886-ebfc513fdc6d\") " pod="openshift-marketplace/community-operators-lz6zd" Mar 20 13:49:51 crc kubenswrapper[4973]: I0320 13:49:51.030658 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d554d58c-75bd-43a4-9886-ebfc513fdc6d-catalog-content\") pod \"community-operators-lz6zd\" (UID: \"d554d58c-75bd-43a4-9886-ebfc513fdc6d\") " pod="openshift-marketplace/community-operators-lz6zd" Mar 20 13:49:51 crc kubenswrapper[4973]: I0320 13:49:51.031031 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5lrq\" (UniqueName: \"kubernetes.io/projected/d554d58c-75bd-43a4-9886-ebfc513fdc6d-kube-api-access-d5lrq\") pod \"community-operators-lz6zd\" (UID: \"d554d58c-75bd-43a4-9886-ebfc513fdc6d\") " pod="openshift-marketplace/community-operators-lz6zd" Mar 20 13:49:51 crc kubenswrapper[4973]: I0320 13:49:51.031081 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d554d58c-75bd-43a4-9886-ebfc513fdc6d-utilities\") pod \"community-operators-lz6zd\" (UID: \"d554d58c-75bd-43a4-9886-ebfc513fdc6d\") " pod="openshift-marketplace/community-operators-lz6zd" Mar 20 13:49:51 crc kubenswrapper[4973]: I0320 13:49:51.031145 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d554d58c-75bd-43a4-9886-ebfc513fdc6d-catalog-content\") pod \"community-operators-lz6zd\" (UID: \"d554d58c-75bd-43a4-9886-ebfc513fdc6d\") " 
pod="openshift-marketplace/community-operators-lz6zd" Mar 20 13:49:51 crc kubenswrapper[4973]: I0320 13:49:51.031689 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d554d58c-75bd-43a4-9886-ebfc513fdc6d-utilities\") pod \"community-operators-lz6zd\" (UID: \"d554d58c-75bd-43a4-9886-ebfc513fdc6d\") " pod="openshift-marketplace/community-operators-lz6zd" Mar 20 13:49:51 crc kubenswrapper[4973]: I0320 13:49:51.055400 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5lrq\" (UniqueName: \"kubernetes.io/projected/d554d58c-75bd-43a4-9886-ebfc513fdc6d-kube-api-access-d5lrq\") pod \"community-operators-lz6zd\" (UID: \"d554d58c-75bd-43a4-9886-ebfc513fdc6d\") " pod="openshift-marketplace/community-operators-lz6zd" Mar 20 13:49:51 crc kubenswrapper[4973]: I0320 13:49:51.177320 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lz6zd" Mar 20 13:49:51 crc kubenswrapper[4973]: I0320 13:49:51.658715 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lz6zd"] Mar 20 13:49:51 crc kubenswrapper[4973]: I0320 13:49:51.897476 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 20 13:49:52 crc kubenswrapper[4973]: I0320 13:49:52.570744 4973 generic.go:334] "Generic (PLEG): container finished" podID="d554d58c-75bd-43a4-9886-ebfc513fdc6d" containerID="984b504fdbea2df3f47e86c7f58a8001f5d05e2d9b4cba928eea9c7a0ba07f70" exitCode=0 Mar 20 13:49:52 crc kubenswrapper[4973]: I0320 13:49:52.570811 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lz6zd" event={"ID":"d554d58c-75bd-43a4-9886-ebfc513fdc6d","Type":"ContainerDied","Data":"984b504fdbea2df3f47e86c7f58a8001f5d05e2d9b4cba928eea9c7a0ba07f70"} Mar 20 13:49:52 crc kubenswrapper[4973]: I0320 
13:49:52.571114 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lz6zd" event={"ID":"d554d58c-75bd-43a4-9886-ebfc513fdc6d","Type":"ContainerStarted","Data":"6ce91e756a507f3ee76e0781ae09e86487e6d88f9c34f3bd90c4228aa98f5029"} Mar 20 13:49:54 crc kubenswrapper[4973]: I0320 13:49:54.593259 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lz6zd" event={"ID":"d554d58c-75bd-43a4-9886-ebfc513fdc6d","Type":"ContainerStarted","Data":"865417e8754f40f5735010e3366d2c3e4a441d8db3331677e0e1cc14587c26d3"} Mar 20 13:49:56 crc kubenswrapper[4973]: I0320 13:49:56.897571 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 20 13:49:56 crc kubenswrapper[4973]: I0320 13:49:56.934211 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 20 13:49:57 crc kubenswrapper[4973]: I0320 13:49:57.077734 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="649ff891-dde4-449f-bfed-c7029e328ebe" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 503" Mar 20 13:49:57 crc kubenswrapper[4973]: I0320 13:49:57.306290 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:49:57 crc kubenswrapper[4973]: I0320 13:49:57.306546 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 20 13:49:57 crc kubenswrapper[4973]: I0320 13:49:57.676237 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 20 13:49:58 crc kubenswrapper[4973]: I0320 13:49:58.026376 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:49:58 crc kubenswrapper[4973]: I0320 13:49:58.026802 4973 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 20 13:49:58 crc kubenswrapper[4973]: I0320 13:49:58.321558 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="83d90a8e-827e-4720-9a8f-307311e2a6d9" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.1.18:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:49:58 crc kubenswrapper[4973]: I0320 13:49:58.321644 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="83d90a8e-827e-4720-9a8f-307311e2a6d9" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.1.18:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:49:59 crc kubenswrapper[4973]: I0320 13:49:59.041630 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="da8c0543-63bd-4817-90c7-8cc02e6ddd5d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.1.19:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:49:59 crc kubenswrapper[4973]: I0320 13:49:59.041630 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="da8c0543-63bd-4817-90c7-8cc02e6ddd5d" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.1.19:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 13:49:59 crc kubenswrapper[4973]: I0320 13:49:59.652449 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lz6zd" event={"ID":"d554d58c-75bd-43a4-9886-ebfc513fdc6d","Type":"ContainerDied","Data":"865417e8754f40f5735010e3366d2c3e4a441d8db3331677e0e1cc14587c26d3"} Mar 20 13:49:59 crc kubenswrapper[4973]: I0320 13:49:59.652451 4973 generic.go:334] "Generic (PLEG): container 
finished" podID="d554d58c-75bd-43a4-9886-ebfc513fdc6d" containerID="865417e8754f40f5735010e3366d2c3e4a441d8db3331677e0e1cc14587c26d3" exitCode=0 Mar 20 13:50:00 crc kubenswrapper[4973]: I0320 13:50:00.136993 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566910-hm49w"] Mar 20 13:50:00 crc kubenswrapper[4973]: I0320 13:50:00.138730 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-hm49w" Mar 20 13:50:00 crc kubenswrapper[4973]: I0320 13:50:00.141223 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:50:00 crc kubenswrapper[4973]: I0320 13:50:00.141837 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 13:50:00 crc kubenswrapper[4973]: I0320 13:50:00.142107 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:50:00 crc kubenswrapper[4973]: I0320 13:50:00.161401 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-hm49w"] Mar 20 13:50:00 crc kubenswrapper[4973]: I0320 13:50:00.282785 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htknj\" (UniqueName: \"kubernetes.io/projected/867c1f31-b30c-48ac-bd37-b433d68230b7-kube-api-access-htknj\") pod \"auto-csr-approver-29566910-hm49w\" (UID: \"867c1f31-b30c-48ac-bd37-b433d68230b7\") " pod="openshift-infra/auto-csr-approver-29566910-hm49w" Mar 20 13:50:00 crc kubenswrapper[4973]: I0320 13:50:00.385860 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htknj\" (UniqueName: \"kubernetes.io/projected/867c1f31-b30c-48ac-bd37-b433d68230b7-kube-api-access-htknj\") pod \"auto-csr-approver-29566910-hm49w\" (UID: 
\"867c1f31-b30c-48ac-bd37-b433d68230b7\") " pod="openshift-infra/auto-csr-approver-29566910-hm49w" Mar 20 13:50:00 crc kubenswrapper[4973]: I0320 13:50:00.406613 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htknj\" (UniqueName: \"kubernetes.io/projected/867c1f31-b30c-48ac-bd37-b433d68230b7-kube-api-access-htknj\") pod \"auto-csr-approver-29566910-hm49w\" (UID: \"867c1f31-b30c-48ac-bd37-b433d68230b7\") " pod="openshift-infra/auto-csr-approver-29566910-hm49w" Mar 20 13:50:00 crc kubenswrapper[4973]: I0320 13:50:00.463629 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-hm49w" Mar 20 13:50:01 crc kubenswrapper[4973]: W0320 13:50:01.012566 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod867c1f31_b30c_48ac_bd37_b433d68230b7.slice/crio-b303710cf1fc03adf7a24790f12ba523b25d6be59d7f40a3721363175fbee73d WatchSource:0}: Error finding container b303710cf1fc03adf7a24790f12ba523b25d6be59d7f40a3721363175fbee73d: Status 404 returned error can't find the container with id b303710cf1fc03adf7a24790f12ba523b25d6be59d7f40a3721363175fbee73d Mar 20 13:50:01 crc kubenswrapper[4973]: I0320 13:50:01.017001 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-hm49w"] Mar 20 13:50:01 crc kubenswrapper[4973]: I0320 13:50:01.679454 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566910-hm49w" event={"ID":"867c1f31-b30c-48ac-bd37-b433d68230b7","Type":"ContainerStarted","Data":"b303710cf1fc03adf7a24790f12ba523b25d6be59d7f40a3721363175fbee73d"} Mar 20 13:50:01 crc kubenswrapper[4973]: I0320 13:50:01.682605 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lz6zd" 
event={"ID":"d554d58c-75bd-43a4-9886-ebfc513fdc6d","Type":"ContainerStarted","Data":"1058b3587be83c93c4298e6d4c4e8515b84e7d1a1a9fc26ab7be2df3398cad02"} Mar 20 13:50:01 crc kubenswrapper[4973]: I0320 13:50:01.703753 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lz6zd" podStartSLOduration=3.273741527 podStartE2EDuration="11.703736343s" podCreationTimestamp="2026-03-20 13:49:50 +0000 UTC" firstStartedPulling="2026-03-20 13:49:52.572806073 +0000 UTC m=+1713.316475817" lastFinishedPulling="2026-03-20 13:50:01.002800889 +0000 UTC m=+1721.746470633" observedRunningTime="2026-03-20 13:50:01.698539752 +0000 UTC m=+1722.442209496" watchObservedRunningTime="2026-03-20 13:50:01.703736343 +0000 UTC m=+1722.447406087" Mar 20 13:50:02 crc kubenswrapper[4973]: I0320 13:50:02.951000 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:50:02 crc kubenswrapper[4973]: E0320 13:50:02.952964 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.694431 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.751677 4973 generic.go:334] "Generic (PLEG): container finished" podID="867c1f31-b30c-48ac-bd37-b433d68230b7" containerID="77558af0734689165986f3348fb19eb96f9d02e0d0a7e2e3a8896ef17a83c032" exitCode=0 Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.752289 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566910-hm49w" event={"ID":"867c1f31-b30c-48ac-bd37-b433d68230b7","Type":"ContainerDied","Data":"77558af0734689165986f3348fb19eb96f9d02e0d0a7e2e3a8896ef17a83c032"} Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.760839 4973 generic.go:334] "Generic (PLEG): container finished" podID="649ff891-dde4-449f-bfed-c7029e328ebe" containerID="0695857d35eb78ba2a16a9ecf72d45eac97dc31efae5c3e930ee86d0ca62de63" exitCode=137 Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.760898 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"649ff891-dde4-449f-bfed-c7029e328ebe","Type":"ContainerDied","Data":"0695857d35eb78ba2a16a9ecf72d45eac97dc31efae5c3e930ee86d0ca62de63"} Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.760950 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"649ff891-dde4-449f-bfed-c7029e328ebe","Type":"ContainerDied","Data":"841dae495ceccc7b6c845b4178b2ec21d9e70d3fc58a572de652f7c4373dec60"} Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.760972 4973 scope.go:117] "RemoveContainer" containerID="0695857d35eb78ba2a16a9ecf72d45eac97dc31efae5c3e930ee86d0ca62de63" Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.761256 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.796711 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2r5xc\" (UniqueName: \"kubernetes.io/projected/649ff891-dde4-449f-bfed-c7029e328ebe-kube-api-access-2r5xc\") pod \"649ff891-dde4-449f-bfed-c7029e328ebe\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.796906 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649ff891-dde4-449f-bfed-c7029e328ebe-log-httpd\") pod \"649ff891-dde4-449f-bfed-c7029e328ebe\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.797061 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-sg-core-conf-yaml\") pod \"649ff891-dde4-449f-bfed-c7029e328ebe\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.797132 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-combined-ca-bundle\") pod \"649ff891-dde4-449f-bfed-c7029e328ebe\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.797388 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/649ff891-dde4-449f-bfed-c7029e328ebe-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "649ff891-dde4-449f-bfed-c7029e328ebe" (UID: "649ff891-dde4-449f-bfed-c7029e328ebe"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.797549 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649ff891-dde4-449f-bfed-c7029e328ebe-run-httpd\") pod \"649ff891-dde4-449f-bfed-c7029e328ebe\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.797671 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-config-data\") pod \"649ff891-dde4-449f-bfed-c7029e328ebe\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.797823 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-scripts\") pod \"649ff891-dde4-449f-bfed-c7029e328ebe\" (UID: \"649ff891-dde4-449f-bfed-c7029e328ebe\") " Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.798105 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/649ff891-dde4-449f-bfed-c7029e328ebe-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "649ff891-dde4-449f-bfed-c7029e328ebe" (UID: "649ff891-dde4-449f-bfed-c7029e328ebe"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.799854 4973 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649ff891-dde4-449f-bfed-c7029e328ebe-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.799895 4973 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/649ff891-dde4-449f-bfed-c7029e328ebe-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.809661 4973 scope.go:117] "RemoveContainer" containerID="2dbbc6b3fce072faa069fd6a1167b2d76692d3e3764674c4225b3ed7ac9a3875" Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.820152 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/649ff891-dde4-449f-bfed-c7029e328ebe-kube-api-access-2r5xc" (OuterVolumeSpecName: "kube-api-access-2r5xc") pod "649ff891-dde4-449f-bfed-c7029e328ebe" (UID: "649ff891-dde4-449f-bfed-c7029e328ebe"). InnerVolumeSpecName "kube-api-access-2r5xc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.820448 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-scripts" (OuterVolumeSpecName: "scripts") pod "649ff891-dde4-449f-bfed-c7029e328ebe" (UID: "649ff891-dde4-449f-bfed-c7029e328ebe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.838897 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "649ff891-dde4-449f-bfed-c7029e328ebe" (UID: "649ff891-dde4-449f-bfed-c7029e328ebe"). 
InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.902143 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2r5xc\" (UniqueName: \"kubernetes.io/projected/649ff891-dde4-449f-bfed-c7029e328ebe-kube-api-access-2r5xc\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.902178 4973 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.902189 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.922708 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "649ff891-dde4-449f-bfed-c7029e328ebe" (UID: "649ff891-dde4-449f-bfed-c7029e328ebe"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:03 crc kubenswrapper[4973]: I0320 13:50:03.923772 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-config-data" (OuterVolumeSpecName: "config-data") pod "649ff891-dde4-449f-bfed-c7029e328ebe" (UID: "649ff891-dde4-449f-bfed-c7029e328ebe"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.004249 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.004286 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/649ff891-dde4-449f-bfed-c7029e328ebe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.025852 4973 scope.go:117] "RemoveContainer" containerID="50ae74679fb80dbbe8c19df5a2b2a0c406e1c6b2cf3ef1d17ce9ad59f15e5c32" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.048271 4973 scope.go:117] "RemoveContainer" containerID="5d2b87fdaad05b05fdd6092e28e115ce487b6fbc98e93c9cc6f27216d12dd6d6" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.075584 4973 scope.go:117] "RemoveContainer" containerID="0695857d35eb78ba2a16a9ecf72d45eac97dc31efae5c3e930ee86d0ca62de63" Mar 20 13:50:04 crc kubenswrapper[4973]: E0320 13:50:04.078366 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0695857d35eb78ba2a16a9ecf72d45eac97dc31efae5c3e930ee86d0ca62de63\": container with ID starting with 0695857d35eb78ba2a16a9ecf72d45eac97dc31efae5c3e930ee86d0ca62de63 not found: ID does not exist" containerID="0695857d35eb78ba2a16a9ecf72d45eac97dc31efae5c3e930ee86d0ca62de63" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.078582 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0695857d35eb78ba2a16a9ecf72d45eac97dc31efae5c3e930ee86d0ca62de63"} err="failed to get container status \"0695857d35eb78ba2a16a9ecf72d45eac97dc31efae5c3e930ee86d0ca62de63\": rpc error: code = NotFound desc = could not find container 
\"0695857d35eb78ba2a16a9ecf72d45eac97dc31efae5c3e930ee86d0ca62de63\": container with ID starting with 0695857d35eb78ba2a16a9ecf72d45eac97dc31efae5c3e930ee86d0ca62de63 not found: ID does not exist" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.078715 4973 scope.go:117] "RemoveContainer" containerID="2dbbc6b3fce072faa069fd6a1167b2d76692d3e3764674c4225b3ed7ac9a3875" Mar 20 13:50:04 crc kubenswrapper[4973]: E0320 13:50:04.081310 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dbbc6b3fce072faa069fd6a1167b2d76692d3e3764674c4225b3ed7ac9a3875\": container with ID starting with 2dbbc6b3fce072faa069fd6a1167b2d76692d3e3764674c4225b3ed7ac9a3875 not found: ID does not exist" containerID="2dbbc6b3fce072faa069fd6a1167b2d76692d3e3764674c4225b3ed7ac9a3875" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.081383 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dbbc6b3fce072faa069fd6a1167b2d76692d3e3764674c4225b3ed7ac9a3875"} err="failed to get container status \"2dbbc6b3fce072faa069fd6a1167b2d76692d3e3764674c4225b3ed7ac9a3875\": rpc error: code = NotFound desc = could not find container \"2dbbc6b3fce072faa069fd6a1167b2d76692d3e3764674c4225b3ed7ac9a3875\": container with ID starting with 2dbbc6b3fce072faa069fd6a1167b2d76692d3e3764674c4225b3ed7ac9a3875 not found: ID does not exist" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.081417 4973 scope.go:117] "RemoveContainer" containerID="50ae74679fb80dbbe8c19df5a2b2a0c406e1c6b2cf3ef1d17ce9ad59f15e5c32" Mar 20 13:50:04 crc kubenswrapper[4973]: E0320 13:50:04.082741 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50ae74679fb80dbbe8c19df5a2b2a0c406e1c6b2cf3ef1d17ce9ad59f15e5c32\": container with ID starting with 50ae74679fb80dbbe8c19df5a2b2a0c406e1c6b2cf3ef1d17ce9ad59f15e5c32 not found: ID does not exist" 
containerID="50ae74679fb80dbbe8c19df5a2b2a0c406e1c6b2cf3ef1d17ce9ad59f15e5c32" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.082770 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50ae74679fb80dbbe8c19df5a2b2a0c406e1c6b2cf3ef1d17ce9ad59f15e5c32"} err="failed to get container status \"50ae74679fb80dbbe8c19df5a2b2a0c406e1c6b2cf3ef1d17ce9ad59f15e5c32\": rpc error: code = NotFound desc = could not find container \"50ae74679fb80dbbe8c19df5a2b2a0c406e1c6b2cf3ef1d17ce9ad59f15e5c32\": container with ID starting with 50ae74679fb80dbbe8c19df5a2b2a0c406e1c6b2cf3ef1d17ce9ad59f15e5c32 not found: ID does not exist" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.082792 4973 scope.go:117] "RemoveContainer" containerID="5d2b87fdaad05b05fdd6092e28e115ce487b6fbc98e93c9cc6f27216d12dd6d6" Mar 20 13:50:04 crc kubenswrapper[4973]: E0320 13:50:04.087647 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d2b87fdaad05b05fdd6092e28e115ce487b6fbc98e93c9cc6f27216d12dd6d6\": container with ID starting with 5d2b87fdaad05b05fdd6092e28e115ce487b6fbc98e93c9cc6f27216d12dd6d6 not found: ID does not exist" containerID="5d2b87fdaad05b05fdd6092e28e115ce487b6fbc98e93c9cc6f27216d12dd6d6" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.087718 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d2b87fdaad05b05fdd6092e28e115ce487b6fbc98e93c9cc6f27216d12dd6d6"} err="failed to get container status \"5d2b87fdaad05b05fdd6092e28e115ce487b6fbc98e93c9cc6f27216d12dd6d6\": rpc error: code = NotFound desc = could not find container \"5d2b87fdaad05b05fdd6092e28e115ce487b6fbc98e93c9cc6f27216d12dd6d6\": container with ID starting with 5d2b87fdaad05b05fdd6092e28e115ce487b6fbc98e93c9cc6f27216d12dd6d6 not found: ID does not exist" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.097661 4973 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.124194 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.139400 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:04 crc kubenswrapper[4973]: E0320 13:50:04.140134 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649ff891-dde4-449f-bfed-c7029e328ebe" containerName="proxy-httpd" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.140161 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="649ff891-dde4-449f-bfed-c7029e328ebe" containerName="proxy-httpd" Mar 20 13:50:04 crc kubenswrapper[4973]: E0320 13:50:04.140186 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649ff891-dde4-449f-bfed-c7029e328ebe" containerName="ceilometer-notification-agent" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.140195 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="649ff891-dde4-449f-bfed-c7029e328ebe" containerName="ceilometer-notification-agent" Mar 20 13:50:04 crc kubenswrapper[4973]: E0320 13:50:04.140233 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649ff891-dde4-449f-bfed-c7029e328ebe" containerName="ceilometer-central-agent" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.140244 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="649ff891-dde4-449f-bfed-c7029e328ebe" containerName="ceilometer-central-agent" Mar 20 13:50:04 crc kubenswrapper[4973]: E0320 13:50:04.140281 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="649ff891-dde4-449f-bfed-c7029e328ebe" containerName="sg-core" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.140290 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="649ff891-dde4-449f-bfed-c7029e328ebe" containerName="sg-core" Mar 20 13:50:04 crc 
kubenswrapper[4973]: I0320 13:50:04.140656 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="649ff891-dde4-449f-bfed-c7029e328ebe" containerName="ceilometer-central-agent" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.140676 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="649ff891-dde4-449f-bfed-c7029e328ebe" containerName="ceilometer-notification-agent" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.140699 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="649ff891-dde4-449f-bfed-c7029e328ebe" containerName="sg-core" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.140719 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="649ff891-dde4-449f-bfed-c7029e328ebe" containerName="proxy-httpd" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.144006 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.151366 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.151393 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.152277 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.309645 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-log-httpd\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.310012 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-config-data\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.310060 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.310194 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-scripts\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.310263 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-run-httpd\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.310301 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.310349 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5gfxr\" (UniqueName: \"kubernetes.io/projected/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-kube-api-access-5gfxr\") pod \"ceilometer-0\" (UID: 
\"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.412687 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.413059 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-scripts\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.413167 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-run-httpd\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.413423 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.413588 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5gfxr\" (UniqueName: \"kubernetes.io/projected/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-kube-api-access-5gfxr\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.413729 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-log-httpd\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.413848 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-config-data\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.415003 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-run-httpd\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.415365 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-log-httpd\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.418622 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.419049 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-scripts\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.419635 4973 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-config-data\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.421642 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.439696 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5gfxr\" (UniqueName: \"kubernetes.io/projected/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-kube-api-access-5gfxr\") pod \"ceilometer-0\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.488368 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:50:04 crc kubenswrapper[4973]: I0320 13:50:04.973038 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:04 crc kubenswrapper[4973]: W0320 13:50:04.991465 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde7005fa_d06d_4ffd_a96b_29a7e6e60b7c.slice/crio-4896f92b57a3f275130350661abdde491906e21cf39f727c5efbf69c1e945904 WatchSource:0}: Error finding container 4896f92b57a3f275130350661abdde491906e21cf39f727c5efbf69c1e945904: Status 404 returned error can't find the container with id 4896f92b57a3f275130350661abdde491906e21cf39f727c5efbf69c1e945904 Mar 20 13:50:05 crc kubenswrapper[4973]: I0320 13:50:05.122013 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-hm49w"
Mar 20 13:50:05 crc kubenswrapper[4973]: I0320 13:50:05.235043 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htknj\" (UniqueName: \"kubernetes.io/projected/867c1f31-b30c-48ac-bd37-b433d68230b7-kube-api-access-htknj\") pod \"867c1f31-b30c-48ac-bd37-b433d68230b7\" (UID: \"867c1f31-b30c-48ac-bd37-b433d68230b7\") "
Mar 20 13:50:05 crc kubenswrapper[4973]: I0320 13:50:05.240822 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/867c1f31-b30c-48ac-bd37-b433d68230b7-kube-api-access-htknj" (OuterVolumeSpecName: "kube-api-access-htknj") pod "867c1f31-b30c-48ac-bd37-b433d68230b7" (UID: "867c1f31-b30c-48ac-bd37-b433d68230b7"). InnerVolumeSpecName "kube-api-access-htknj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:50:05 crc kubenswrapper[4973]: I0320 13:50:05.307772 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 13:50:05 crc kubenswrapper[4973]: I0320 13:50:05.308036 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Mar 20 13:50:05 crc kubenswrapper[4973]: I0320 13:50:05.338675 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htknj\" (UniqueName: \"kubernetes.io/projected/867c1f31-b30c-48ac-bd37-b433d68230b7-kube-api-access-htknj\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:05 crc kubenswrapper[4973]: I0320 13:50:05.794975 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c","Type":"ContainerStarted","Data":"05b321175f15055bcec98b2fcfcf7cb43535840c55f5ef31c8ab8eb384f57610"}
Mar 20 13:50:05 crc kubenswrapper[4973]: I0320 13:50:05.795022 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c","Type":"ContainerStarted","Data":"4896f92b57a3f275130350661abdde491906e21cf39f727c5efbf69c1e945904"}
Mar 20 13:50:05 crc kubenswrapper[4973]: I0320 13:50:05.797922 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566910-hm49w" event={"ID":"867c1f31-b30c-48ac-bd37-b433d68230b7","Type":"ContainerDied","Data":"b303710cf1fc03adf7a24790f12ba523b25d6be59d7f40a3721363175fbee73d"}
Mar 20 13:50:05 crc kubenswrapper[4973]: I0320 13:50:05.797962 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b303710cf1fc03adf7a24790f12ba523b25d6be59d7f40a3721363175fbee73d"
Mar 20 13:50:05 crc kubenswrapper[4973]: I0320 13:50:05.797994 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566910-hm49w"
Mar 20 13:50:05 crc kubenswrapper[4973]: I0320 13:50:05.964309 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="649ff891-dde4-449f-bfed-c7029e328ebe" path="/var/lib/kubelet/pods/649ff891-dde4-449f-bfed-c7029e328ebe/volumes"
Mar 20 13:50:06 crc kubenswrapper[4973]: I0320 13:50:06.027160 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 20 13:50:06 crc kubenswrapper[4973]: I0320 13:50:06.027223 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 20 13:50:06 crc kubenswrapper[4973]: I0320 13:50:06.194605 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-l8zrn"]
Mar 20 13:50:06 crc kubenswrapper[4973]: I0320 13:50:06.207022 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566904-l8zrn"]
Mar 20 13:50:06 crc kubenswrapper[4973]: I0320 13:50:06.811978 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c","Type":"ContainerStarted","Data":"9b0a88452a86460d0b602a6604950a83aaaf1606ad4bff5b98327c605c82e71d"}
Mar 20 13:50:07 crc kubenswrapper[4973]: I0320 13:50:07.313790 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 13:50:07 crc kubenswrapper[4973]: I0320 13:50:07.323727 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 13:50:07 crc kubenswrapper[4973]: I0320 13:50:07.323875 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Mar 20 13:50:07 crc kubenswrapper[4973]: I0320 13:50:07.839220 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c","Type":"ContainerStarted","Data":"de09e3af5327dc04eefedd91248b75e2ab0442a1dece36859741c8525914ac6a"}
Mar 20 13:50:07 crc kubenswrapper[4973]: I0320 13:50:07.857083 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Mar 20 13:50:07 crc kubenswrapper[4973]: I0320 13:50:07.972047 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0592b321-65ba-4ab3-987b-8384ec9ee7e2" path="/var/lib/kubelet/pods/0592b321-65ba-4ab3-987b-8384ec9ee7e2/volumes"
Mar 20 13:50:08 crc kubenswrapper[4973]: I0320 13:50:08.033793 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 20 13:50:08 crc kubenswrapper[4973]: I0320 13:50:08.034397 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Mar 20 13:50:08 crc kubenswrapper[4973]: I0320 13:50:08.040440 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 20 13:50:08 crc kubenswrapper[4973]: I0320 13:50:08.047835 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Mar 20 13:50:10 crc kubenswrapper[4973]: I0320 13:50:10.871375 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c","Type":"ContainerStarted","Data":"7e02ca038db58c98e31b07e13b6541847c90d7fee3364610d5202b296444932b"}
Mar 20 13:50:10 crc kubenswrapper[4973]: I0320 13:50:10.873784 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Mar 20 13:50:10 crc kubenswrapper[4973]: I0320 13:50:10.902190 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.4463371560000002 podStartE2EDuration="6.902167458s" podCreationTimestamp="2026-03-20 13:50:04 +0000 UTC" firstStartedPulling="2026-03-20 13:50:04.995275594 +0000 UTC m=+1725.738945338" lastFinishedPulling="2026-03-20 13:50:09.451105906 +0000 UTC m=+1730.194775640" observedRunningTime="2026-03-20 13:50:10.896902804 +0000 UTC m=+1731.640572558" watchObservedRunningTime="2026-03-20 13:50:10.902167458 +0000 UTC m=+1731.645837202"
Mar 20 13:50:11 crc kubenswrapper[4973]: I0320 13:50:11.178016 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lz6zd"
Mar 20 13:50:11 crc kubenswrapper[4973]: I0320 13:50:11.178283 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lz6zd"
Mar 20 13:50:11 crc kubenswrapper[4973]: I0320 13:50:11.248170 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lz6zd"
Mar 20 13:50:11 crc kubenswrapper[4973]: I0320 13:50:11.942256 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lz6zd"
Mar 20 13:50:12 crc kubenswrapper[4973]: I0320 13:50:12.004145 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lz6zd"]
Mar 20 13:50:13 crc kubenswrapper[4973]: I0320 13:50:13.901891 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lz6zd" podUID="d554d58c-75bd-43a4-9886-ebfc513fdc6d" containerName="registry-server" containerID="cri-o://1058b3587be83c93c4298e6d4c4e8515b84e7d1a1a9fc26ab7be2df3398cad02" gracePeriod=2
Mar 20 13:50:13 crc kubenswrapper[4973]: I0320 13:50:13.953358 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e"
Mar 20 13:50:13 crc kubenswrapper[4973]: E0320 13:50:13.953681 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 13:50:14 crc kubenswrapper[4973]: I0320 13:50:14.739082 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lz6zd"
Mar 20 13:50:14 crc kubenswrapper[4973]: I0320 13:50:14.916430 4973 generic.go:334] "Generic (PLEG): container finished" podID="d554d58c-75bd-43a4-9886-ebfc513fdc6d" containerID="1058b3587be83c93c4298e6d4c4e8515b84e7d1a1a9fc26ab7be2df3398cad02" exitCode=0
Mar 20 13:50:14 crc kubenswrapper[4973]: I0320 13:50:14.916476 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lz6zd" event={"ID":"d554d58c-75bd-43a4-9886-ebfc513fdc6d","Type":"ContainerDied","Data":"1058b3587be83c93c4298e6d4c4e8515b84e7d1a1a9fc26ab7be2df3398cad02"}
Mar 20 13:50:14 crc kubenswrapper[4973]: I0320 13:50:14.916503 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lz6zd" event={"ID":"d554d58c-75bd-43a4-9886-ebfc513fdc6d","Type":"ContainerDied","Data":"6ce91e756a507f3ee76e0781ae09e86487e6d88f9c34f3bd90c4228aa98f5029"}
Mar 20 13:50:14 crc kubenswrapper[4973]: I0320 13:50:14.916521 4973 scope.go:117] "RemoveContainer" containerID="1058b3587be83c93c4298e6d4c4e8515b84e7d1a1a9fc26ab7be2df3398cad02"
Mar 20 13:50:14 crc kubenswrapper[4973]: I0320 13:50:14.916526 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lz6zd"
Mar 20 13:50:14 crc kubenswrapper[4973]: I0320 13:50:14.933210 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d554d58c-75bd-43a4-9886-ebfc513fdc6d-utilities\") pod \"d554d58c-75bd-43a4-9886-ebfc513fdc6d\" (UID: \"d554d58c-75bd-43a4-9886-ebfc513fdc6d\") "
Mar 20 13:50:14 crc kubenswrapper[4973]: I0320 13:50:14.933434 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d554d58c-75bd-43a4-9886-ebfc513fdc6d-catalog-content\") pod \"d554d58c-75bd-43a4-9886-ebfc513fdc6d\" (UID: \"d554d58c-75bd-43a4-9886-ebfc513fdc6d\") "
Mar 20 13:50:14 crc kubenswrapper[4973]: I0320 13:50:14.933517 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5lrq\" (UniqueName: \"kubernetes.io/projected/d554d58c-75bd-43a4-9886-ebfc513fdc6d-kube-api-access-d5lrq\") pod \"d554d58c-75bd-43a4-9886-ebfc513fdc6d\" (UID: \"d554d58c-75bd-43a4-9886-ebfc513fdc6d\") "
Mar 20 13:50:14 crc kubenswrapper[4973]: I0320 13:50:14.933886 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d554d58c-75bd-43a4-9886-ebfc513fdc6d-utilities" (OuterVolumeSpecName: "utilities") pod "d554d58c-75bd-43a4-9886-ebfc513fdc6d" (UID: "d554d58c-75bd-43a4-9886-ebfc513fdc6d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4973]: I0320 13:50:14.934566 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d554d58c-75bd-43a4-9886-ebfc513fdc6d-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:14 crc kubenswrapper[4973]: I0320 13:50:14.942524 4973 scope.go:117] "RemoveContainer" containerID="865417e8754f40f5735010e3366d2c3e4a441d8db3331677e0e1cc14587c26d3"
Mar 20 13:50:14 crc kubenswrapper[4973]: I0320 13:50:14.953137 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d554d58c-75bd-43a4-9886-ebfc513fdc6d-kube-api-access-d5lrq" (OuterVolumeSpecName: "kube-api-access-d5lrq") pod "d554d58c-75bd-43a4-9886-ebfc513fdc6d" (UID: "d554d58c-75bd-43a4-9886-ebfc513fdc6d"). InnerVolumeSpecName "kube-api-access-d5lrq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:50:14 crc kubenswrapper[4973]: I0320 13:50:14.991813 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d554d58c-75bd-43a4-9886-ebfc513fdc6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d554d58c-75bd-43a4-9886-ebfc513fdc6d" (UID: "d554d58c-75bd-43a4-9886-ebfc513fdc6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:50:15 crc kubenswrapper[4973]: I0320 13:50:15.022847 4973 scope.go:117] "RemoveContainer" containerID="984b504fdbea2df3f47e86c7f58a8001f5d05e2d9b4cba928eea9c7a0ba07f70"
Mar 20 13:50:15 crc kubenswrapper[4973]: I0320 13:50:15.037743 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d554d58c-75bd-43a4-9886-ebfc513fdc6d-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:15 crc kubenswrapper[4973]: I0320 13:50:15.037770 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5lrq\" (UniqueName: \"kubernetes.io/projected/d554d58c-75bd-43a4-9886-ebfc513fdc6d-kube-api-access-d5lrq\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:15 crc kubenswrapper[4973]: I0320 13:50:15.085112 4973 scope.go:117] "RemoveContainer" containerID="1058b3587be83c93c4298e6d4c4e8515b84e7d1a1a9fc26ab7be2df3398cad02"
Mar 20 13:50:15 crc kubenswrapper[4973]: E0320 13:50:15.086687 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1058b3587be83c93c4298e6d4c4e8515b84e7d1a1a9fc26ab7be2df3398cad02\": container with ID starting with 1058b3587be83c93c4298e6d4c4e8515b84e7d1a1a9fc26ab7be2df3398cad02 not found: ID does not exist" containerID="1058b3587be83c93c4298e6d4c4e8515b84e7d1a1a9fc26ab7be2df3398cad02"
Mar 20 13:50:15 crc kubenswrapper[4973]: I0320 13:50:15.086718 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1058b3587be83c93c4298e6d4c4e8515b84e7d1a1a9fc26ab7be2df3398cad02"} err="failed to get container status \"1058b3587be83c93c4298e6d4c4e8515b84e7d1a1a9fc26ab7be2df3398cad02\": rpc error: code = NotFound desc = could not find container \"1058b3587be83c93c4298e6d4c4e8515b84e7d1a1a9fc26ab7be2df3398cad02\": container with ID starting with 1058b3587be83c93c4298e6d4c4e8515b84e7d1a1a9fc26ab7be2df3398cad02 not found: ID does not exist"
Mar 20 13:50:15 crc kubenswrapper[4973]: I0320 13:50:15.086741 4973 scope.go:117] "RemoveContainer" containerID="865417e8754f40f5735010e3366d2c3e4a441d8db3331677e0e1cc14587c26d3"
Mar 20 13:50:15 crc kubenswrapper[4973]: E0320 13:50:15.087255 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"865417e8754f40f5735010e3366d2c3e4a441d8db3331677e0e1cc14587c26d3\": container with ID starting with 865417e8754f40f5735010e3366d2c3e4a441d8db3331677e0e1cc14587c26d3 not found: ID does not exist" containerID="865417e8754f40f5735010e3366d2c3e4a441d8db3331677e0e1cc14587c26d3"
Mar 20 13:50:15 crc kubenswrapper[4973]: I0320 13:50:15.087276 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"865417e8754f40f5735010e3366d2c3e4a441d8db3331677e0e1cc14587c26d3"} err="failed to get container status \"865417e8754f40f5735010e3366d2c3e4a441d8db3331677e0e1cc14587c26d3\": rpc error: code = NotFound desc = could not find container \"865417e8754f40f5735010e3366d2c3e4a441d8db3331677e0e1cc14587c26d3\": container with ID starting with 865417e8754f40f5735010e3366d2c3e4a441d8db3331677e0e1cc14587c26d3 not found: ID does not exist"
Mar 20 13:50:15 crc kubenswrapper[4973]: I0320 13:50:15.087289 4973 scope.go:117] "RemoveContainer" containerID="984b504fdbea2df3f47e86c7f58a8001f5d05e2d9b4cba928eea9c7a0ba07f70"
Mar 20 13:50:15 crc kubenswrapper[4973]: E0320 13:50:15.089880 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"984b504fdbea2df3f47e86c7f58a8001f5d05e2d9b4cba928eea9c7a0ba07f70\": container with ID starting with 984b504fdbea2df3f47e86c7f58a8001f5d05e2d9b4cba928eea9c7a0ba07f70 not found: ID does not exist" containerID="984b504fdbea2df3f47e86c7f58a8001f5d05e2d9b4cba928eea9c7a0ba07f70"
Mar 20 13:50:15 crc kubenswrapper[4973]: I0320 13:50:15.089929 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"984b504fdbea2df3f47e86c7f58a8001f5d05e2d9b4cba928eea9c7a0ba07f70"} err="failed to get container status \"984b504fdbea2df3f47e86c7f58a8001f5d05e2d9b4cba928eea9c7a0ba07f70\": rpc error: code = NotFound desc = could not find container \"984b504fdbea2df3f47e86c7f58a8001f5d05e2d9b4cba928eea9c7a0ba07f70\": container with ID starting with 984b504fdbea2df3f47e86c7f58a8001f5d05e2d9b4cba928eea9c7a0ba07f70 not found: ID does not exist"
Mar 20 13:50:15 crc kubenswrapper[4973]: I0320 13:50:15.250450 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lz6zd"]
Mar 20 13:50:15 crc kubenswrapper[4973]: I0320 13:50:15.262636 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lz6zd"]
Mar 20 13:50:15 crc kubenswrapper[4973]: I0320 13:50:15.967665 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d554d58c-75bd-43a4-9886-ebfc513fdc6d" path="/var/lib/kubelet/pods/d554d58c-75bd-43a4-9886-ebfc513fdc6d/volumes"
Mar 20 13:50:27 crc kubenswrapper[4973]: I0320 13:50:27.950589 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e"
Mar 20 13:50:27 crc kubenswrapper[4973]: E0320 13:50:27.951491 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 13:50:34 crc kubenswrapper[4973]: I0320 13:50:34.497634 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Mar 20 13:50:38 crc kubenswrapper[4973]: I0320 13:50:38.469184 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 13:50:38 crc kubenswrapper[4973]: I0320 13:50:38.470004 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="4d08d0fa-8d0e-412d-b657-3af016e5c0d1" containerName="kube-state-metrics" containerID="cri-o://a0a18060aa59c7fdc5f65165d3080ab833f126993bdd47953b7effb4b67e5261" gracePeriod=30
Mar 20 13:50:38 crc kubenswrapper[4973]: I0320 13:50:38.598190 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"]
Mar 20 13:50:38 crc kubenswrapper[4973]: I0320 13:50:38.598435 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mysqld-exporter-0" podUID="312b29b4-b820-4dcc-bed0-62c42937d544" containerName="mysqld-exporter" containerID="cri-o://c95c1def070c1897894cc1a793da1f6d9c833ffd4b974d392025d896ae849f69" gracePeriod=30
Mar 20 13:50:38 crc kubenswrapper[4973]: I0320 13:50:38.951506 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e"
Mar 20 13:50:38 crc kubenswrapper[4973]: E0320 13:50:38.952091 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.045046 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.156554 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.167982 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzlzm\" (UniqueName: \"kubernetes.io/projected/4d08d0fa-8d0e-412d-b657-3af016e5c0d1-kube-api-access-xzlzm\") pod \"4d08d0fa-8d0e-412d-b657-3af016e5c0d1\" (UID: \"4d08d0fa-8d0e-412d-b657-3af016e5c0d1\") "
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.184560 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d08d0fa-8d0e-412d-b657-3af016e5c0d1-kube-api-access-xzlzm" (OuterVolumeSpecName: "kube-api-access-xzlzm") pod "4d08d0fa-8d0e-412d-b657-3af016e5c0d1" (UID: "4d08d0fa-8d0e-412d-b657-3af016e5c0d1"). InnerVolumeSpecName "kube-api-access-xzlzm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.209497 4973 generic.go:334] "Generic (PLEG): container finished" podID="312b29b4-b820-4dcc-bed0-62c42937d544" containerID="c95c1def070c1897894cc1a793da1f6d9c833ffd4b974d392025d896ae849f69" exitCode=2
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.209580 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"312b29b4-b820-4dcc-bed0-62c42937d544","Type":"ContainerDied","Data":"c95c1def070c1897894cc1a793da1f6d9c833ffd4b974d392025d896ae849f69"}
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.209618 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"312b29b4-b820-4dcc-bed0-62c42937d544","Type":"ContainerDied","Data":"c95ee0e5a0fd2fbd22e6d527fb55aba4f84fdf29621cd292e973b57367547d06"}
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.209641 4973 scope.go:117] "RemoveContainer" containerID="c95c1def070c1897894cc1a793da1f6d9c833ffd4b974d392025d896ae849f69"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.209712 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.214132 4973 generic.go:334] "Generic (PLEG): container finished" podID="4d08d0fa-8d0e-412d-b657-3af016e5c0d1" containerID="a0a18060aa59c7fdc5f65165d3080ab833f126993bdd47953b7effb4b67e5261" exitCode=2
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.214185 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4d08d0fa-8d0e-412d-b657-3af016e5c0d1","Type":"ContainerDied","Data":"a0a18060aa59c7fdc5f65165d3080ab833f126993bdd47953b7effb4b67e5261"}
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.214216 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"4d08d0fa-8d0e-412d-b657-3af016e5c0d1","Type":"ContainerDied","Data":"5113571eba9c33b9c22a30a577be48fca98fb1861dbad0cb54ae52c90bb99230"}
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.214233 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.244760 4973 scope.go:117] "RemoveContainer" containerID="c95c1def070c1897894cc1a793da1f6d9c833ffd4b974d392025d896ae849f69"
Mar 20 13:50:39 crc kubenswrapper[4973]: E0320 13:50:39.245306 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c95c1def070c1897894cc1a793da1f6d9c833ffd4b974d392025d896ae849f69\": container with ID starting with c95c1def070c1897894cc1a793da1f6d9c833ffd4b974d392025d896ae849f69 not found: ID does not exist" containerID="c95c1def070c1897894cc1a793da1f6d9c833ffd4b974d392025d896ae849f69"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.248751 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c95c1def070c1897894cc1a793da1f6d9c833ffd4b974d392025d896ae849f69"} err="failed to get container status \"c95c1def070c1897894cc1a793da1f6d9c833ffd4b974d392025d896ae849f69\": rpc error: code = NotFound desc = could not find container \"c95c1def070c1897894cc1a793da1f6d9c833ffd4b974d392025d896ae849f69\": container with ID starting with c95c1def070c1897894cc1a793da1f6d9c833ffd4b974d392025d896ae849f69 not found: ID does not exist"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.248783 4973 scope.go:117] "RemoveContainer" containerID="a0a18060aa59c7fdc5f65165d3080ab833f126993bdd47953b7effb4b67e5261"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.269714 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6p729\" (UniqueName: \"kubernetes.io/projected/312b29b4-b820-4dcc-bed0-62c42937d544-kube-api-access-6p729\") pod \"312b29b4-b820-4dcc-bed0-62c42937d544\" (UID: \"312b29b4-b820-4dcc-bed0-62c42937d544\") "
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.269932 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312b29b4-b820-4dcc-bed0-62c42937d544-combined-ca-bundle\") pod \"312b29b4-b820-4dcc-bed0-62c42937d544\" (UID: \"312b29b4-b820-4dcc-bed0-62c42937d544\") "
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.270087 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312b29b4-b820-4dcc-bed0-62c42937d544-config-data\") pod \"312b29b4-b820-4dcc-bed0-62c42937d544\" (UID: \"312b29b4-b820-4dcc-bed0-62c42937d544\") "
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.270796 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzlzm\" (UniqueName: \"kubernetes.io/projected/4d08d0fa-8d0e-412d-b657-3af016e5c0d1-kube-api-access-xzlzm\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.275780 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.285685 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/312b29b4-b820-4dcc-bed0-62c42937d544-kube-api-access-6p729" (OuterVolumeSpecName: "kube-api-access-6p729") pod "312b29b4-b820-4dcc-bed0-62c42937d544" (UID: "312b29b4-b820-4dcc-bed0-62c42937d544"). InnerVolumeSpecName "kube-api-access-6p729". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.298023 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.308414 4973 scope.go:117] "RemoveContainer" containerID="a0a18060aa59c7fdc5f65165d3080ab833f126993bdd47953b7effb4b67e5261"
Mar 20 13:50:39 crc kubenswrapper[4973]: E0320 13:50:39.309474 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0a18060aa59c7fdc5f65165d3080ab833f126993bdd47953b7effb4b67e5261\": container with ID starting with a0a18060aa59c7fdc5f65165d3080ab833f126993bdd47953b7effb4b67e5261 not found: ID does not exist" containerID="a0a18060aa59c7fdc5f65165d3080ab833f126993bdd47953b7effb4b67e5261"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.309503 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.309512 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0a18060aa59c7fdc5f65165d3080ab833f126993bdd47953b7effb4b67e5261"} err="failed to get container status \"a0a18060aa59c7fdc5f65165d3080ab833f126993bdd47953b7effb4b67e5261\": rpc error: code = NotFound desc = could not find container \"a0a18060aa59c7fdc5f65165d3080ab833f126993bdd47953b7effb4b67e5261\": container with ID starting with a0a18060aa59c7fdc5f65165d3080ab833f126993bdd47953b7effb4b67e5261 not found: ID does not exist"
Mar 20 13:50:39 crc kubenswrapper[4973]: E0320 13:50:39.310471 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d08d0fa-8d0e-412d-b657-3af016e5c0d1" containerName="kube-state-metrics"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.310492 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d08d0fa-8d0e-412d-b657-3af016e5c0d1" containerName="kube-state-metrics"
Mar 20 13:50:39 crc kubenswrapper[4973]: E0320 13:50:39.310565 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="867c1f31-b30c-48ac-bd37-b433d68230b7" containerName="oc"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.310574 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="867c1f31-b30c-48ac-bd37-b433d68230b7" containerName="oc"
Mar 20 13:50:39 crc kubenswrapper[4973]: E0320 13:50:39.310591 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d554d58c-75bd-43a4-9886-ebfc513fdc6d" containerName="extract-utilities"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.310599 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="d554d58c-75bd-43a4-9886-ebfc513fdc6d" containerName="extract-utilities"
Mar 20 13:50:39 crc kubenswrapper[4973]: E0320 13:50:39.310642 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d554d58c-75bd-43a4-9886-ebfc513fdc6d" containerName="extract-content"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.310650 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="d554d58c-75bd-43a4-9886-ebfc513fdc6d" containerName="extract-content"
Mar 20 13:50:39 crc kubenswrapper[4973]: E0320 13:50:39.310670 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d554d58c-75bd-43a4-9886-ebfc513fdc6d" containerName="registry-server"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.310675 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="d554d58c-75bd-43a4-9886-ebfc513fdc6d" containerName="registry-server"
Mar 20 13:50:39 crc kubenswrapper[4973]: E0320 13:50:39.310685 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="312b29b4-b820-4dcc-bed0-62c42937d544" containerName="mysqld-exporter"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.310691 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="312b29b4-b820-4dcc-bed0-62c42937d544" containerName="mysqld-exporter"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.310964 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="312b29b4-b820-4dcc-bed0-62c42937d544" containerName="mysqld-exporter"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.310980 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="d554d58c-75bd-43a4-9886-ebfc513fdc6d" containerName="registry-server"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.310994 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d08d0fa-8d0e-412d-b657-3af016e5c0d1" containerName="kube-state-metrics"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.311014 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="867c1f31-b30c-48ac-bd37-b433d68230b7" containerName="oc"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.312186 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.317230 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.317435 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.339804 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.358100 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312b29b4-b820-4dcc-bed0-62c42937d544-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "312b29b4-b820-4dcc-bed0-62c42937d544" (UID: "312b29b4-b820-4dcc-bed0-62c42937d544"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.358208 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/312b29b4-b820-4dcc-bed0-62c42937d544-config-data" (OuterVolumeSpecName: "config-data") pod "312b29b4-b820-4dcc-bed0-62c42937d544" (UID: "312b29b4-b820-4dcc-bed0-62c42937d544"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.372732 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7stwj\" (UniqueName: \"kubernetes.io/projected/6a06b002-fe34-40e9-ae6d-54b6a9e7751b-kube-api-access-7stwj\") pod \"kube-state-metrics-0\" (UID: \"6a06b002-fe34-40e9-ae6d-54b6a9e7751b\") " pod="openstack/kube-state-metrics-0"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.372794 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a06b002-fe34-40e9-ae6d-54b6a9e7751b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6a06b002-fe34-40e9-ae6d-54b6a9e7751b\") " pod="openstack/kube-state-metrics-0"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.373178 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a06b002-fe34-40e9-ae6d-54b6a9e7751b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6a06b002-fe34-40e9-ae6d-54b6a9e7751b\") " pod="openstack/kube-state-metrics-0"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.373330 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6a06b002-fe34-40e9-ae6d-54b6a9e7751b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6a06b002-fe34-40e9-ae6d-54b6a9e7751b\") " pod="openstack/kube-state-metrics-0"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.373519 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6p729\" (UniqueName: \"kubernetes.io/projected/312b29b4-b820-4dcc-bed0-62c42937d544-kube-api-access-6p729\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.373544 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/312b29b4-b820-4dcc-bed0-62c42937d544-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.373557 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/312b29b4-b820-4dcc-bed0-62c42937d544-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.475950 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6a06b002-fe34-40e9-ae6d-54b6a9e7751b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6a06b002-fe34-40e9-ae6d-54b6a9e7751b\") " pod="openstack/kube-state-metrics-0"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.476119 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7stwj\" (UniqueName: \"kubernetes.io/projected/6a06b002-fe34-40e9-ae6d-54b6a9e7751b-kube-api-access-7stwj\") pod \"kube-state-metrics-0\" (UID: \"6a06b002-fe34-40e9-ae6d-54b6a9e7751b\") " pod="openstack/kube-state-metrics-0"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.476163 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a06b002-fe34-40e9-ae6d-54b6a9e7751b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6a06b002-fe34-40e9-ae6d-54b6a9e7751b\") " pod="openstack/kube-state-metrics-0"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.476285 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a06b002-fe34-40e9-ae6d-54b6a9e7751b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6a06b002-fe34-40e9-ae6d-54b6a9e7751b\") " pod="openstack/kube-state-metrics-0"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.480120 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/6a06b002-fe34-40e9-ae6d-54b6a9e7751b-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"6a06b002-fe34-40e9-ae6d-54b6a9e7751b\") " pod="openstack/kube-state-metrics-0"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.480332 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a06b002-fe34-40e9-ae6d-54b6a9e7751b-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"6a06b002-fe34-40e9-ae6d-54b6a9e7751b\") " pod="openstack/kube-state-metrics-0"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.482076 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/6a06b002-fe34-40e9-ae6d-54b6a9e7751b-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"6a06b002-fe34-40e9-ae6d-54b6a9e7751b\") " pod="openstack/kube-state-metrics-0"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.493435 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7stwj\" (UniqueName: \"kubernetes.io/projected/6a06b002-fe34-40e9-ae6d-54b6a9e7751b-kube-api-access-7stwj\") pod \"kube-state-metrics-0\" (UID: \"6a06b002-fe34-40e9-ae6d-54b6a9e7751b\") " pod="openstack/kube-state-metrics-0"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.552120 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0"]
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.574174 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0"]
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.589638 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mysqld-exporter-0"]
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.591773 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.595538 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"mysqld-exporter-config-data"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.595633 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-mysqld-exporter-svc"
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.604295 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"]
Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.638144 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.680971 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8fe0389-306c-44f1-9a9b-9ae5907ec1ef-config-data\") pod \"mysqld-exporter-0\" (UID: \"c8fe0389-306c-44f1-9a9b-9ae5907ec1ef\") " pod="openstack/mysqld-exporter-0" Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.681457 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s5b6\" (UniqueName: \"kubernetes.io/projected/c8fe0389-306c-44f1-9a9b-9ae5907ec1ef-kube-api-access-7s5b6\") pod \"mysqld-exporter-0\" (UID: \"c8fe0389-306c-44f1-9a9b-9ae5907ec1ef\") " pod="openstack/mysqld-exporter-0" Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.681524 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8fe0389-306c-44f1-9a9b-9ae5907ec1ef-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"c8fe0389-306c-44f1-9a9b-9ae5907ec1ef\") " pod="openstack/mysqld-exporter-0" Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.681560 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8fe0389-306c-44f1-9a9b-9ae5907ec1ef-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"c8fe0389-306c-44f1-9a9b-9ae5907ec1ef\") " pod="openstack/mysqld-exporter-0" Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.783279 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s5b6\" (UniqueName: \"kubernetes.io/projected/c8fe0389-306c-44f1-9a9b-9ae5907ec1ef-kube-api-access-7s5b6\") pod \"mysqld-exporter-0\" (UID: \"c8fe0389-306c-44f1-9a9b-9ae5907ec1ef\") " 
pod="openstack/mysqld-exporter-0" Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.783375 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8fe0389-306c-44f1-9a9b-9ae5907ec1ef-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"c8fe0389-306c-44f1-9a9b-9ae5907ec1ef\") " pod="openstack/mysqld-exporter-0" Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.783410 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8fe0389-306c-44f1-9a9b-9ae5907ec1ef-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"c8fe0389-306c-44f1-9a9b-9ae5907ec1ef\") " pod="openstack/mysqld-exporter-0" Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.783496 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8fe0389-306c-44f1-9a9b-9ae5907ec1ef-config-data\") pod \"mysqld-exporter-0\" (UID: \"c8fe0389-306c-44f1-9a9b-9ae5907ec1ef\") " pod="openstack/mysqld-exporter-0" Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.789891 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c8fe0389-306c-44f1-9a9b-9ae5907ec1ef-config-data\") pod \"mysqld-exporter-0\" (UID: \"c8fe0389-306c-44f1-9a9b-9ae5907ec1ef\") " pod="openstack/mysqld-exporter-0" Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.790211 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c8fe0389-306c-44f1-9a9b-9ae5907ec1ef-combined-ca-bundle\") pod \"mysqld-exporter-0\" (UID: \"c8fe0389-306c-44f1-9a9b-9ae5907ec1ef\") " pod="openstack/mysqld-exporter-0" Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.791045 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"mysqld-exporter-tls-certs\" (UniqueName: \"kubernetes.io/secret/c8fe0389-306c-44f1-9a9b-9ae5907ec1ef-mysqld-exporter-tls-certs\") pod \"mysqld-exporter-0\" (UID: \"c8fe0389-306c-44f1-9a9b-9ae5907ec1ef\") " pod="openstack/mysqld-exporter-0" Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.807878 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s5b6\" (UniqueName: \"kubernetes.io/projected/c8fe0389-306c-44f1-9a9b-9ae5907ec1ef-kube-api-access-7s5b6\") pod \"mysqld-exporter-0\" (UID: \"c8fe0389-306c-44f1-9a9b-9ae5907ec1ef\") " pod="openstack/mysqld-exporter-0" Mar 20 13:50:39 crc kubenswrapper[4973]: I0320 13:50:39.905710 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mysqld-exporter-0" Mar 20 13:50:40 crc kubenswrapper[4973]: I0320 13:50:40.008051 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="312b29b4-b820-4dcc-bed0-62c42937d544" path="/var/lib/kubelet/pods/312b29b4-b820-4dcc-bed0-62c42937d544/volumes" Mar 20 13:50:40 crc kubenswrapper[4973]: I0320 13:50:40.013448 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d08d0fa-8d0e-412d-b657-3af016e5c0d1" path="/var/lib/kubelet/pods/4d08d0fa-8d0e-412d-b657-3af016e5c0d1/volumes" Mar 20 13:50:40 crc kubenswrapper[4973]: I0320 13:50:40.133396 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 20 13:50:40 crc kubenswrapper[4973]: I0320 13:50:40.141627 4973 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:50:40 crc kubenswrapper[4973]: I0320 13:50:40.233950 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6a06b002-fe34-40e9-ae6d-54b6a9e7751b","Type":"ContainerStarted","Data":"02a817b990d2aa36de2604b39600d1e66685d4985041f60e01d2a3b70d7e05b8"} Mar 20 13:50:40 crc kubenswrapper[4973]: W0320 13:50:40.420113 4973 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8fe0389_306c_44f1_9a9b_9ae5907ec1ef.slice/crio-e0d25159b0230c9f7fdc6e8844c3badf8626e35444912693feeca5d7f52c9d4c WatchSource:0}: Error finding container e0d25159b0230c9f7fdc6e8844c3badf8626e35444912693feeca5d7f52c9d4c: Status 404 returned error can't find the container with id e0d25159b0230c9f7fdc6e8844c3badf8626e35444912693feeca5d7f52c9d4c Mar 20 13:50:40 crc kubenswrapper[4973]: I0320 13:50:40.423285 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mysqld-exporter-0"] Mar 20 13:50:40 crc kubenswrapper[4973]: I0320 13:50:40.749054 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:40 crc kubenswrapper[4973]: I0320 13:50:40.749326 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerName="ceilometer-central-agent" containerID="cri-o://05b321175f15055bcec98b2fcfcf7cb43535840c55f5ef31c8ab8eb384f57610" gracePeriod=30 Mar 20 13:50:40 crc kubenswrapper[4973]: I0320 13:50:40.749363 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerName="proxy-httpd" containerID="cri-o://7e02ca038db58c98e31b07e13b6541847c90d7fee3364610d5202b296444932b" gracePeriod=30 Mar 20 13:50:40 crc kubenswrapper[4973]: I0320 13:50:40.749456 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerName="sg-core" containerID="cri-o://de09e3af5327dc04eefedd91248b75e2ab0442a1dece36859741c8525914ac6a" gracePeriod=30 Mar 20 13:50:40 crc kubenswrapper[4973]: I0320 13:50:40.749443 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerName="ceilometer-notification-agent" containerID="cri-o://9b0a88452a86460d0b602a6604950a83aaaf1606ad4bff5b98327c605c82e71d" gracePeriod=30 Mar 20 13:50:41 crc kubenswrapper[4973]: I0320 13:50:41.252475 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"6a06b002-fe34-40e9-ae6d-54b6a9e7751b","Type":"ContainerStarted","Data":"be3d99433957696934e0fb71f13ab2bc43ba506b3301d07d95bce118c5dfe4ae"} Mar 20 13:50:41 crc kubenswrapper[4973]: I0320 13:50:41.253272 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 20 13:50:41 crc kubenswrapper[4973]: I0320 13:50:41.255378 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"c8fe0389-306c-44f1-9a9b-9ae5907ec1ef","Type":"ContainerStarted","Data":"95448d1788862d57768abae8be2dd2b24938ded37ff2f9df37e809da4aea33fc"} Mar 20 13:50:41 crc kubenswrapper[4973]: I0320 13:50:41.255417 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mysqld-exporter-0" event={"ID":"c8fe0389-306c-44f1-9a9b-9ae5907ec1ef","Type":"ContainerStarted","Data":"e0d25159b0230c9f7fdc6e8844c3badf8626e35444912693feeca5d7f52c9d4c"} Mar 20 13:50:41 crc kubenswrapper[4973]: I0320 13:50:41.264313 4973 generic.go:334] "Generic (PLEG): container finished" podID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerID="7e02ca038db58c98e31b07e13b6541847c90d7fee3364610d5202b296444932b" exitCode=0 Mar 20 13:50:41 crc kubenswrapper[4973]: I0320 13:50:41.264378 4973 generic.go:334] "Generic (PLEG): container finished" podID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerID="de09e3af5327dc04eefedd91248b75e2ab0442a1dece36859741c8525914ac6a" exitCode=2 Mar 20 13:50:41 crc kubenswrapper[4973]: I0320 13:50:41.264368 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c","Type":"ContainerDied","Data":"7e02ca038db58c98e31b07e13b6541847c90d7fee3364610d5202b296444932b"} Mar 20 13:50:41 crc kubenswrapper[4973]: I0320 13:50:41.264440 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c","Type":"ContainerDied","Data":"de09e3af5327dc04eefedd91248b75e2ab0442a1dece36859741c8525914ac6a"} Mar 20 13:50:41 crc kubenswrapper[4973]: I0320 13:50:41.277602 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.877740062 podStartE2EDuration="2.277574303s" podCreationTimestamp="2026-03-20 13:50:39 +0000 UTC" firstStartedPulling="2026-03-20 13:50:40.141433232 +0000 UTC m=+1760.885102976" lastFinishedPulling="2026-03-20 13:50:40.541267473 +0000 UTC m=+1761.284937217" observedRunningTime="2026-03-20 13:50:41.271005044 +0000 UTC m=+1762.014674788" watchObservedRunningTime="2026-03-20 13:50:41.277574303 +0000 UTC m=+1762.021244047" Mar 20 13:50:41 crc kubenswrapper[4973]: I0320 13:50:41.297961 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mysqld-exporter-0" podStartSLOduration=1.762143475 podStartE2EDuration="2.297934569s" podCreationTimestamp="2026-03-20 13:50:39 +0000 UTC" firstStartedPulling="2026-03-20 13:50:40.422756376 +0000 UTC m=+1761.166426120" lastFinishedPulling="2026-03-20 13:50:40.95854748 +0000 UTC m=+1761.702217214" observedRunningTime="2026-03-20 13:50:41.291830863 +0000 UTC m=+1762.035500617" watchObservedRunningTime="2026-03-20 13:50:41.297934569 +0000 UTC m=+1762.041604313" Mar 20 13:50:42 crc kubenswrapper[4973]: I0320 13:50:42.277315 4973 generic.go:334] "Generic (PLEG): container finished" podID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerID="05b321175f15055bcec98b2fcfcf7cb43535840c55f5ef31c8ab8eb384f57610" exitCode=0 Mar 20 13:50:42 crc kubenswrapper[4973]: I0320 13:50:42.277385 
4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c","Type":"ContainerDied","Data":"05b321175f15055bcec98b2fcfcf7cb43535840c55f5ef31c8ab8eb384f57610"} Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.294995 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.302732 4973 generic.go:334] "Generic (PLEG): container finished" podID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerID="9b0a88452a86460d0b602a6604950a83aaaf1606ad4bff5b98327c605c82e71d" exitCode=0 Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.302779 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c","Type":"ContainerDied","Data":"9b0a88452a86460d0b602a6604950a83aaaf1606ad4bff5b98327c605c82e71d"} Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.302806 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c","Type":"ContainerDied","Data":"4896f92b57a3f275130350661abdde491906e21cf39f727c5efbf69c1e945904"} Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.302823 4973 scope.go:117] "RemoveContainer" containerID="7e02ca038db58c98e31b07e13b6541847c90d7fee3364610d5202b296444932b" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.302978 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.355714 4973 scope.go:117] "RemoveContainer" containerID="de09e3af5327dc04eefedd91248b75e2ab0442a1dece36859741c8525914ac6a" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.420297 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-sg-core-conf-yaml\") pod \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.420416 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-run-httpd\") pod \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.420449 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-combined-ca-bundle\") pod \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.420529 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-config-data\") pod \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.420606 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-log-httpd\") pod \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " Mar 20 13:50:44 
crc kubenswrapper[4973]: I0320 13:50:44.420696 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-scripts\") pod \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.420819 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5gfxr\" (UniqueName: \"kubernetes.io/projected/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-kube-api-access-5gfxr\") pod \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\" (UID: \"de7005fa-d06d-4ffd-a96b-29a7e6e60b7c\") " Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.425480 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" (UID: "de7005fa-d06d-4ffd-a96b-29a7e6e60b7c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.425838 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" (UID: "de7005fa-d06d-4ffd-a96b-29a7e6e60b7c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.429769 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-kube-api-access-5gfxr" (OuterVolumeSpecName: "kube-api-access-5gfxr") pod "de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" (UID: "de7005fa-d06d-4ffd-a96b-29a7e6e60b7c"). InnerVolumeSpecName "kube-api-access-5gfxr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.441988 4973 scope.go:117] "RemoveContainer" containerID="9b0a88452a86460d0b602a6604950a83aaaf1606ad4bff5b98327c605c82e71d" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.442045 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-scripts" (OuterVolumeSpecName: "scripts") pod "de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" (UID: "de7005fa-d06d-4ffd-a96b-29a7e6e60b7c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.464556 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" (UID: "de7005fa-d06d-4ffd-a96b-29a7e6e60b7c"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.524722 4973 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.524764 4973 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.524773 4973 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.524784 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.524795 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5gfxr\" (UniqueName: \"kubernetes.io/projected/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-kube-api-access-5gfxr\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.578208 4973 scope.go:117] "RemoveContainer" containerID="05b321175f15055bcec98b2fcfcf7cb43535840c55f5ef31c8ab8eb384f57610" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.585095 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" (UID: "de7005fa-d06d-4ffd-a96b-29a7e6e60b7c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.622479 4973 scope.go:117] "RemoveContainer" containerID="7e02ca038db58c98e31b07e13b6541847c90d7fee3364610d5202b296444932b" Mar 20 13:50:44 crc kubenswrapper[4973]: E0320 13:50:44.622869 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e02ca038db58c98e31b07e13b6541847c90d7fee3364610d5202b296444932b\": container with ID starting with 7e02ca038db58c98e31b07e13b6541847c90d7fee3364610d5202b296444932b not found: ID does not exist" containerID="7e02ca038db58c98e31b07e13b6541847c90d7fee3364610d5202b296444932b" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.622905 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e02ca038db58c98e31b07e13b6541847c90d7fee3364610d5202b296444932b"} err="failed to get container status \"7e02ca038db58c98e31b07e13b6541847c90d7fee3364610d5202b296444932b\": rpc error: code = NotFound desc = could not find container \"7e02ca038db58c98e31b07e13b6541847c90d7fee3364610d5202b296444932b\": container with ID starting with 7e02ca038db58c98e31b07e13b6541847c90d7fee3364610d5202b296444932b not found: ID does not exist" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.622927 4973 scope.go:117] "RemoveContainer" containerID="de09e3af5327dc04eefedd91248b75e2ab0442a1dece36859741c8525914ac6a" Mar 20 13:50:44 crc kubenswrapper[4973]: E0320 13:50:44.623280 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de09e3af5327dc04eefedd91248b75e2ab0442a1dece36859741c8525914ac6a\": container with ID starting with de09e3af5327dc04eefedd91248b75e2ab0442a1dece36859741c8525914ac6a not found: ID does not exist" containerID="de09e3af5327dc04eefedd91248b75e2ab0442a1dece36859741c8525914ac6a" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.623304 
4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de09e3af5327dc04eefedd91248b75e2ab0442a1dece36859741c8525914ac6a"} err="failed to get container status \"de09e3af5327dc04eefedd91248b75e2ab0442a1dece36859741c8525914ac6a\": rpc error: code = NotFound desc = could not find container \"de09e3af5327dc04eefedd91248b75e2ab0442a1dece36859741c8525914ac6a\": container with ID starting with de09e3af5327dc04eefedd91248b75e2ab0442a1dece36859741c8525914ac6a not found: ID does not exist" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.623319 4973 scope.go:117] "RemoveContainer" containerID="9b0a88452a86460d0b602a6604950a83aaaf1606ad4bff5b98327c605c82e71d" Mar 20 13:50:44 crc kubenswrapper[4973]: E0320 13:50:44.623550 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b0a88452a86460d0b602a6604950a83aaaf1606ad4bff5b98327c605c82e71d\": container with ID starting with 9b0a88452a86460d0b602a6604950a83aaaf1606ad4bff5b98327c605c82e71d not found: ID does not exist" containerID="9b0a88452a86460d0b602a6604950a83aaaf1606ad4bff5b98327c605c82e71d" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.623573 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b0a88452a86460d0b602a6604950a83aaaf1606ad4bff5b98327c605c82e71d"} err="failed to get container status \"9b0a88452a86460d0b602a6604950a83aaaf1606ad4bff5b98327c605c82e71d\": rpc error: code = NotFound desc = could not find container \"9b0a88452a86460d0b602a6604950a83aaaf1606ad4bff5b98327c605c82e71d\": container with ID starting with 9b0a88452a86460d0b602a6604950a83aaaf1606ad4bff5b98327c605c82e71d not found: ID does not exist" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.623590 4973 scope.go:117] "RemoveContainer" containerID="05b321175f15055bcec98b2fcfcf7cb43535840c55f5ef31c8ab8eb384f57610" Mar 20 13:50:44 crc kubenswrapper[4973]: E0320 
13:50:44.623798 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05b321175f15055bcec98b2fcfcf7cb43535840c55f5ef31c8ab8eb384f57610\": container with ID starting with 05b321175f15055bcec98b2fcfcf7cb43535840c55f5ef31c8ab8eb384f57610 not found: ID does not exist" containerID="05b321175f15055bcec98b2fcfcf7cb43535840c55f5ef31c8ab8eb384f57610" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.623819 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05b321175f15055bcec98b2fcfcf7cb43535840c55f5ef31c8ab8eb384f57610"} err="failed to get container status \"05b321175f15055bcec98b2fcfcf7cb43535840c55f5ef31c8ab8eb384f57610\": rpc error: code = NotFound desc = could not find container \"05b321175f15055bcec98b2fcfcf7cb43535840c55f5ef31c8ab8eb384f57610\": container with ID starting with 05b321175f15055bcec98b2fcfcf7cb43535840c55f5ef31c8ab8eb384f57610 not found: ID does not exist" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.626688 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.657158 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-config-data" (OuterVolumeSpecName: "config-data") pod "de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" (UID: "de7005fa-d06d-4ffd-a96b-29a7e6e60b7c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.728921 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.952757 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.966071 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.986415 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:44 crc kubenswrapper[4973]: E0320 13:50:44.986989 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerName="proxy-httpd" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.987011 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerName="proxy-httpd" Mar 20 13:50:44 crc kubenswrapper[4973]: E0320 13:50:44.987036 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerName="ceilometer-notification-agent" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.987044 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerName="ceilometer-notification-agent" Mar 20 13:50:44 crc kubenswrapper[4973]: E0320 13:50:44.987068 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerName="sg-core" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.987077 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerName="sg-core" Mar 20 13:50:44 crc kubenswrapper[4973]: E0320 
13:50:44.987104 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerName="ceilometer-central-agent" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.987113 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerName="ceilometer-central-agent" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.987407 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerName="proxy-httpd" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.987444 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerName="ceilometer-notification-agent" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.987459 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerName="sg-core" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.987475 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" containerName="ceilometer-central-agent" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.990618 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.995797 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.996176 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:50:44 crc kubenswrapper[4973]: I0320 13:50:44.996445 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.019700 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.036955 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.037805 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34375e56-6366-4661-88d4-dcb414d81972-log-httpd\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.037851 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.037952 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.038017 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lvhr7\" (UniqueName: \"kubernetes.io/projected/34375e56-6366-4661-88d4-dcb414d81972-kube-api-access-lvhr7\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.038043 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-config-data\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.038066 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34375e56-6366-4661-88d4-dcb414d81972-run-httpd\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.038098 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-scripts\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.140427 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34375e56-6366-4661-88d4-dcb414d81972-log-httpd\") pod \"ceilometer-0\" (UID: 
\"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.140499 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.140616 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.140667 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lvhr7\" (UniqueName: \"kubernetes.io/projected/34375e56-6366-4661-88d4-dcb414d81972-kube-api-access-lvhr7\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.140699 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-config-data\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.140730 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34375e56-6366-4661-88d4-dcb414d81972-run-httpd\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.140765 4973 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-scripts\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.140828 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.141666 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34375e56-6366-4661-88d4-dcb414d81972-run-httpd\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.141929 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34375e56-6366-4661-88d4-dcb414d81972-log-httpd\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.145099 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.145502 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-config-data\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 
13:50:45.145519 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.145569 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.151010 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-scripts\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.159484 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lvhr7\" (UniqueName: \"kubernetes.io/projected/34375e56-6366-4661-88d4-dcb414d81972-kube-api-access-lvhr7\") pod \"ceilometer-0\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.312891 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:50:45 crc kubenswrapper[4973]: W0320 13:50:45.816028 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34375e56_6366_4661_88d4_dcb414d81972.slice/crio-486cd085b3ee6feb20d1bb0e506d8fa8cf406ebd460b320501a6c1ec24484df9 WatchSource:0}: Error finding container 486cd085b3ee6feb20d1bb0e506d8fa8cf406ebd460b320501a6c1ec24484df9: Status 404 returned error can't find the container with id 486cd085b3ee6feb20d1bb0e506d8fa8cf406ebd460b320501a6c1ec24484df9 Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.819872 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:45 crc kubenswrapper[4973]: I0320 13:50:45.964173 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de7005fa-d06d-4ffd-a96b-29a7e6e60b7c" path="/var/lib/kubelet/pods/de7005fa-d06d-4ffd-a96b-29a7e6e60b7c/volumes" Mar 20 13:50:46 crc kubenswrapper[4973]: I0320 13:50:46.333546 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34375e56-6366-4661-88d4-dcb414d81972","Type":"ContainerStarted","Data":"486cd085b3ee6feb20d1bb0e506d8fa8cf406ebd460b320501a6c1ec24484df9"} Mar 20 13:50:47 crc kubenswrapper[4973]: I0320 13:50:47.159617 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-znp6r"] Mar 20 13:50:47 crc kubenswrapper[4973]: I0320 13:50:47.169887 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-znp6r"] Mar 20 13:50:47 crc kubenswrapper[4973]: I0320 13:50:47.253855 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-bxbkc"] Mar 20 13:50:47 crc kubenswrapper[4973]: I0320 13:50:47.255680 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-bxbkc" Mar 20 13:50:47 crc kubenswrapper[4973]: I0320 13:50:47.276137 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-bxbkc"] Mar 20 13:50:47 crc kubenswrapper[4973]: I0320 13:50:47.292957 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx8xz\" (UniqueName: \"kubernetes.io/projected/ca1a0fe5-a203-41e8-814e-fec933be3407-kube-api-access-hx8xz\") pod \"heat-db-sync-bxbkc\" (UID: \"ca1a0fe5-a203-41e8-814e-fec933be3407\") " pod="openstack/heat-db-sync-bxbkc" Mar 20 13:50:47 crc kubenswrapper[4973]: I0320 13:50:47.293251 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1a0fe5-a203-41e8-814e-fec933be3407-combined-ca-bundle\") pod \"heat-db-sync-bxbkc\" (UID: \"ca1a0fe5-a203-41e8-814e-fec933be3407\") " pod="openstack/heat-db-sync-bxbkc" Mar 20 13:50:47 crc kubenswrapper[4973]: I0320 13:50:47.293625 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1a0fe5-a203-41e8-814e-fec933be3407-config-data\") pod \"heat-db-sync-bxbkc\" (UID: \"ca1a0fe5-a203-41e8-814e-fec933be3407\") " pod="openstack/heat-db-sync-bxbkc" Mar 20 13:50:47 crc kubenswrapper[4973]: I0320 13:50:47.350071 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34375e56-6366-4661-88d4-dcb414d81972","Type":"ContainerStarted","Data":"953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f"} Mar 20 13:50:47 crc kubenswrapper[4973]: I0320 13:50:47.395927 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1a0fe5-a203-41e8-814e-fec933be3407-combined-ca-bundle\") pod \"heat-db-sync-bxbkc\" (UID: 
\"ca1a0fe5-a203-41e8-814e-fec933be3407\") " pod="openstack/heat-db-sync-bxbkc" Mar 20 13:50:47 crc kubenswrapper[4973]: I0320 13:50:47.396171 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1a0fe5-a203-41e8-814e-fec933be3407-config-data\") pod \"heat-db-sync-bxbkc\" (UID: \"ca1a0fe5-a203-41e8-814e-fec933be3407\") " pod="openstack/heat-db-sync-bxbkc" Mar 20 13:50:47 crc kubenswrapper[4973]: I0320 13:50:47.396581 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx8xz\" (UniqueName: \"kubernetes.io/projected/ca1a0fe5-a203-41e8-814e-fec933be3407-kube-api-access-hx8xz\") pod \"heat-db-sync-bxbkc\" (UID: \"ca1a0fe5-a203-41e8-814e-fec933be3407\") " pod="openstack/heat-db-sync-bxbkc" Mar 20 13:50:47 crc kubenswrapper[4973]: I0320 13:50:47.401585 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1a0fe5-a203-41e8-814e-fec933be3407-combined-ca-bundle\") pod \"heat-db-sync-bxbkc\" (UID: \"ca1a0fe5-a203-41e8-814e-fec933be3407\") " pod="openstack/heat-db-sync-bxbkc" Mar 20 13:50:47 crc kubenswrapper[4973]: I0320 13:50:47.412551 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1a0fe5-a203-41e8-814e-fec933be3407-config-data\") pod \"heat-db-sync-bxbkc\" (UID: \"ca1a0fe5-a203-41e8-814e-fec933be3407\") " pod="openstack/heat-db-sync-bxbkc" Mar 20 13:50:47 crc kubenswrapper[4973]: I0320 13:50:47.414227 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx8xz\" (UniqueName: \"kubernetes.io/projected/ca1a0fe5-a203-41e8-814e-fec933be3407-kube-api-access-hx8xz\") pod \"heat-db-sync-bxbkc\" (UID: \"ca1a0fe5-a203-41e8-814e-fec933be3407\") " pod="openstack/heat-db-sync-bxbkc" Mar 20 13:50:47 crc kubenswrapper[4973]: I0320 13:50:47.577882 4973 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-bxbkc" Mar 20 13:50:47 crc kubenswrapper[4973]: I0320 13:50:47.971149 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d867119-66df-4aa7-a2dd-13d0d40ce2cc" path="/var/lib/kubelet/pods/5d867119-66df-4aa7-a2dd-13d0d40ce2cc/volumes" Mar 20 13:50:48 crc kubenswrapper[4973]: I0320 13:50:48.153672 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-bxbkc"] Mar 20 13:50:48 crc kubenswrapper[4973]: W0320 13:50:48.153889 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podca1a0fe5_a203_41e8_814e_fec933be3407.slice/crio-4e096a3005ed4092041862163ca39d3d459135f2313ed0fd3da7dc77d348a12e WatchSource:0}: Error finding container 4e096a3005ed4092041862163ca39d3d459135f2313ed0fd3da7dc77d348a12e: Status 404 returned error can't find the container with id 4e096a3005ed4092041862163ca39d3d459135f2313ed0fd3da7dc77d348a12e Mar 20 13:50:48 crc kubenswrapper[4973]: I0320 13:50:48.370660 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-bxbkc" event={"ID":"ca1a0fe5-a203-41e8-814e-fec933be3407","Type":"ContainerStarted","Data":"4e096a3005ed4092041862163ca39d3d459135f2313ed0fd3da7dc77d348a12e"} Mar 20 13:50:48 crc kubenswrapper[4973]: I0320 13:50:48.381105 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34375e56-6366-4661-88d4-dcb414d81972","Type":"ContainerStarted","Data":"138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd"} Mar 20 13:50:48 crc kubenswrapper[4973]: I0320 13:50:48.985332 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 20 13:50:49 crc kubenswrapper[4973]: I0320 13:50:49.404029 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"34375e56-6366-4661-88d4-dcb414d81972","Type":"ContainerStarted","Data":"bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de"} Mar 20 13:50:49 crc kubenswrapper[4973]: I0320 13:50:49.456441 4973 scope.go:117] "RemoveContainer" containerID="aca291581b66fd8cf40546b1ff467366c606359b6459beed3da383d55cc23e26" Mar 20 13:50:49 crc kubenswrapper[4973]: I0320 13:50:49.554606 4973 scope.go:117] "RemoveContainer" containerID="f78778e5adb4b7c6e6b3e7e0d2b46e5e0e4199d4a34428851d16da0c275857b1" Mar 20 13:50:49 crc kubenswrapper[4973]: I0320 13:50:49.648811 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 20 13:50:49 crc kubenswrapper[4973]: I0320 13:50:49.683853 4973 scope.go:117] "RemoveContainer" containerID="9757315f6b18fc3dd71beee72648b81288d616265cb9aba22d703be83307bf4b" Mar 20 13:50:49 crc kubenswrapper[4973]: I0320 13:50:49.741949 4973 scope.go:117] "RemoveContainer" containerID="310970b4d6f35411af5ccf038f1e1ee5f497226276d4aac33e30cd6fbc0d34af" Mar 20 13:50:49 crc kubenswrapper[4973]: I0320 13:50:49.834886 4973 scope.go:117] "RemoveContainer" containerID="efb96e669bfd1f34ec8068edc1c1938749d9e8206ecf2a33f71de536aec83da2" Mar 20 13:50:50 crc kubenswrapper[4973]: I0320 13:50:50.168841 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:50:52 crc kubenswrapper[4973]: I0320 13:50:52.454888 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34375e56-6366-4661-88d4-dcb414d81972","Type":"ContainerStarted","Data":"31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0"} Mar 20 13:50:52 crc kubenswrapper[4973]: I0320 13:50:52.457364 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:50:52 crc kubenswrapper[4973]: I0320 13:50:52.760929 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/ceilometer-0" podStartSLOduration=3.604115718 podStartE2EDuration="8.760863393s" podCreationTimestamp="2026-03-20 13:50:44 +0000 UTC" firstStartedPulling="2026-03-20 13:50:45.81855528 +0000 UTC m=+1766.562225034" lastFinishedPulling="2026-03-20 13:50:50.975302965 +0000 UTC m=+1771.718972709" observedRunningTime="2026-03-20 13:50:52.490856648 +0000 UTC m=+1773.234526402" watchObservedRunningTime="2026-03-20 13:50:52.760863393 +0000 UTC m=+1773.504533137" Mar 20 13:50:52 crc kubenswrapper[4973]: I0320 13:50:52.773424 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:52 crc kubenswrapper[4973]: I0320 13:50:52.950294 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:50:52 crc kubenswrapper[4973]: E0320 13:50:52.950660 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 13:50:54 crc kubenswrapper[4973]: I0320 13:50:54.483501 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34375e56-6366-4661-88d4-dcb414d81972" containerName="ceilometer-central-agent" containerID="cri-o://953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f" gracePeriod=30 Mar 20 13:50:54 crc kubenswrapper[4973]: I0320 13:50:54.483566 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34375e56-6366-4661-88d4-dcb414d81972" containerName="sg-core" containerID="cri-o://bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de" gracePeriod=30 Mar 20 
13:50:54 crc kubenswrapper[4973]: I0320 13:50:54.483567 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34375e56-6366-4661-88d4-dcb414d81972" containerName="ceilometer-notification-agent" containerID="cri-o://138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd" gracePeriod=30 Mar 20 13:50:54 crc kubenswrapper[4973]: I0320 13:50:54.483604 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="34375e56-6366-4661-88d4-dcb414d81972" containerName="proxy-httpd" containerID="cri-o://31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0" gracePeriod=30 Mar 20 13:50:54 crc kubenswrapper[4973]: I0320 13:50:54.791084 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-2" podUID="4ed60638-5022-406b-b568-7fa0d6bf4ba8" containerName="rabbitmq" containerID="cri-o://037f5251eaf4cb4f67de62e170130e7e06ee060a34d17561441e79a60d62fd3c" gracePeriod=604795 Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.236831 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="20780ec2-d338-45a4-9259-16a651e46e55" containerName="rabbitmq" containerID="cri-o://92e9958d24196db83cc045e1f87b9f21ff9358fb34e199690b6dd3b40a1daaaa" gracePeriod=604795 Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.401216 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="20780ec2-d338-45a4-9259-16a651e46e55" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.132:5671: connect: connection refused" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.490616 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.497322 4973 generic.go:334] "Generic (PLEG): container finished" podID="34375e56-6366-4661-88d4-dcb414d81972" containerID="31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0" exitCode=0 Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.497367 4973 generic.go:334] "Generic (PLEG): container finished" podID="34375e56-6366-4661-88d4-dcb414d81972" containerID="bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de" exitCode=2 Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.497377 4973 generic.go:334] "Generic (PLEG): container finished" podID="34375e56-6366-4661-88d4-dcb414d81972" containerID="138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd" exitCode=0 Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.497384 4973 generic.go:334] "Generic (PLEG): container finished" podID="34375e56-6366-4661-88d4-dcb414d81972" containerID="953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f" exitCode=0 Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.497405 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34375e56-6366-4661-88d4-dcb414d81972","Type":"ContainerDied","Data":"31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0"} Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.497431 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34375e56-6366-4661-88d4-dcb414d81972","Type":"ContainerDied","Data":"bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de"} Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.497441 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34375e56-6366-4661-88d4-dcb414d81972","Type":"ContainerDied","Data":"138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd"} Mar 20 13:50:55 crc 
kubenswrapper[4973]: I0320 13:50:55.497450 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34375e56-6366-4661-88d4-dcb414d81972","Type":"ContainerDied","Data":"953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f"} Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.497458 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"34375e56-6366-4661-88d4-dcb414d81972","Type":"ContainerDied","Data":"486cd085b3ee6feb20d1bb0e506d8fa8cf406ebd460b320501a6c1ec24484df9"} Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.497474 4973 scope.go:117] "RemoveContainer" containerID="31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.497527 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.542807 4973 scope.go:117] "RemoveContainer" containerID="bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.547966 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-ceilometer-tls-certs\") pod \"34375e56-6366-4661-88d4-dcb414d81972\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.548069 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-sg-core-conf-yaml\") pod \"34375e56-6366-4661-88d4-dcb414d81972\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.548123 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-scripts\") pod \"34375e56-6366-4661-88d4-dcb414d81972\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.548225 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-config-data\") pod \"34375e56-6366-4661-88d4-dcb414d81972\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.548272 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34375e56-6366-4661-88d4-dcb414d81972-log-httpd\") pod \"34375e56-6366-4661-88d4-dcb414d81972\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.548389 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-combined-ca-bundle\") pod \"34375e56-6366-4661-88d4-dcb414d81972\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.548415 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lvhr7\" (UniqueName: \"kubernetes.io/projected/34375e56-6366-4661-88d4-dcb414d81972-kube-api-access-lvhr7\") pod \"34375e56-6366-4661-88d4-dcb414d81972\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.548613 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34375e56-6366-4661-88d4-dcb414d81972-run-httpd\") pod \"34375e56-6366-4661-88d4-dcb414d81972\" (UID: \"34375e56-6366-4661-88d4-dcb414d81972\") " Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.549579 4973 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34375e56-6366-4661-88d4-dcb414d81972-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "34375e56-6366-4661-88d4-dcb414d81972" (UID: "34375e56-6366-4661-88d4-dcb414d81972"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.549949 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34375e56-6366-4661-88d4-dcb414d81972-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "34375e56-6366-4661-88d4-dcb414d81972" (UID: "34375e56-6366-4661-88d4-dcb414d81972"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.555897 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-scripts" (OuterVolumeSpecName: "scripts") pod "34375e56-6366-4661-88d4-dcb414d81972" (UID: "34375e56-6366-4661-88d4-dcb414d81972"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.560834 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34375e56-6366-4661-88d4-dcb414d81972-kube-api-access-lvhr7" (OuterVolumeSpecName: "kube-api-access-lvhr7") pod "34375e56-6366-4661-88d4-dcb414d81972" (UID: "34375e56-6366-4661-88d4-dcb414d81972"). InnerVolumeSpecName "kube-api-access-lvhr7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.571120 4973 scope.go:117] "RemoveContainer" containerID="138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.611194 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "34375e56-6366-4661-88d4-dcb414d81972" (UID: "34375e56-6366-4661-88d4-dcb414d81972"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.627410 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "34375e56-6366-4661-88d4-dcb414d81972" (UID: "34375e56-6366-4661-88d4-dcb414d81972"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.628636 4973 scope.go:117] "RemoveContainer" containerID="953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.650943 4973 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34375e56-6366-4661-88d4-dcb414d81972-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.650974 4973 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.650985 4973 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.650994 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.651005 4973 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/34375e56-6366-4661-88d4-dcb414d81972-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.651013 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lvhr7\" (UniqueName: \"kubernetes.io/projected/34375e56-6366-4661-88d4-dcb414d81972-kube-api-access-lvhr7\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.699581 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34375e56-6366-4661-88d4-dcb414d81972" (UID: "34375e56-6366-4661-88d4-dcb414d81972"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.711588 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-config-data" (OuterVolumeSpecName: "config-data") pod "34375e56-6366-4661-88d4-dcb414d81972" (UID: "34375e56-6366-4661-88d4-dcb414d81972"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.754389 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.754412 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34375e56-6366-4661-88d4-dcb414d81972-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.789747 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="4ed60638-5022-406b-b568-7fa0d6bf4ba8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: connect: connection refused" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.862784 4973 scope.go:117] "RemoveContainer" containerID="31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0" Mar 20 13:50:55 crc kubenswrapper[4973]: E0320 13:50:55.863459 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0\": container with ID starting with 31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0 not found: ID does not exist" containerID="31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.863495 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0"} err="failed to get container status \"31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0\": rpc error: code = NotFound desc = could not find container \"31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0\": container with ID starting with 31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0 not found: ID does not exist" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.863525 4973 scope.go:117] "RemoveContainer" containerID="bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de" Mar 20 13:50:55 crc kubenswrapper[4973]: E0320 13:50:55.863856 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de\": container with ID starting with bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de not found: ID does not exist" containerID="bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.863919 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de"} err="failed to get container status \"bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de\": rpc error: code = NotFound desc = could not find container \"bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de\": container with ID 
starting with bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de not found: ID does not exist" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.863949 4973 scope.go:117] "RemoveContainer" containerID="138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd" Mar 20 13:50:55 crc kubenswrapper[4973]: E0320 13:50:55.864370 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd\": container with ID starting with 138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd not found: ID does not exist" containerID="138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.864419 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd"} err="failed to get container status \"138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd\": rpc error: code = NotFound desc = could not find container \"138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd\": container with ID starting with 138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd not found: ID does not exist" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.864476 4973 scope.go:117] "RemoveContainer" containerID="953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f" Mar 20 13:50:55 crc kubenswrapper[4973]: E0320 13:50:55.866309 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f\": container with ID starting with 953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f not found: ID does not exist" containerID="953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f" Mar 20 
13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.866331 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f"} err="failed to get container status \"953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f\": rpc error: code = NotFound desc = could not find container \"953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f\": container with ID starting with 953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f not found: ID does not exist" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.866373 4973 scope.go:117] "RemoveContainer" containerID="31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.866651 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0"} err="failed to get container status \"31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0\": rpc error: code = NotFound desc = could not find container \"31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0\": container with ID starting with 31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0 not found: ID does not exist" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.866684 4973 scope.go:117] "RemoveContainer" containerID="bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.867025 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de"} err="failed to get container status \"bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de\": rpc error: code = NotFound desc = could not find container 
\"bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de\": container with ID starting with bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de not found: ID does not exist" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.867047 4973 scope.go:117] "RemoveContainer" containerID="138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.867407 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd"} err="failed to get container status \"138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd\": rpc error: code = NotFound desc = could not find container \"138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd\": container with ID starting with 138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd not found: ID does not exist" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.867442 4973 scope.go:117] "RemoveContainer" containerID="953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.867784 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f"} err="failed to get container status \"953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f\": rpc error: code = NotFound desc = could not find container \"953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f\": container with ID starting with 953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f not found: ID does not exist" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.867810 4973 scope.go:117] "RemoveContainer" containerID="31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.868083 4973 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0"} err="failed to get container status \"31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0\": rpc error: code = NotFound desc = could not find container \"31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0\": container with ID starting with 31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0 not found: ID does not exist" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.868104 4973 scope.go:117] "RemoveContainer" containerID="bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.868375 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de"} err="failed to get container status \"bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de\": rpc error: code = NotFound desc = could not find container \"bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de\": container with ID starting with bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de not found: ID does not exist" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.868408 4973 scope.go:117] "RemoveContainer" containerID="138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.879035 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd"} err="failed to get container status \"138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd\": rpc error: code = NotFound desc = could not find container \"138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd\": container with ID starting with 
138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd not found: ID does not exist" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.879126 4973 scope.go:117] "RemoveContainer" containerID="953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.879998 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f"} err="failed to get container status \"953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f\": rpc error: code = NotFound desc = could not find container \"953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f\": container with ID starting with 953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f not found: ID does not exist" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.880038 4973 scope.go:117] "RemoveContainer" containerID="31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.880371 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0"} err="failed to get container status \"31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0\": rpc error: code = NotFound desc = could not find container \"31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0\": container with ID starting with 31b6d93578d97b674366b28231f9c59bc5ce57a06eeaf689dc9ddbda3c08a0f0 not found: ID does not exist" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.880392 4973 scope.go:117] "RemoveContainer" containerID="bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.880663 4973 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de"} err="failed to get container status \"bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de\": rpc error: code = NotFound desc = could not find container \"bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de\": container with ID starting with bea7ddad6ba747a292c3636c462d3e2bfcde8010770a329a13e250b7fd1801de not found: ID does not exist" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.880691 4973 scope.go:117] "RemoveContainer" containerID="138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.886880 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd"} err="failed to get container status \"138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd\": rpc error: code = NotFound desc = could not find container \"138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd\": container with ID starting with 138dcee1d16ed6e2b497df04e5f7e56f82a2632e961c1bf285561fe63a248abd not found: ID does not exist" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.886942 4973 scope.go:117] "RemoveContainer" containerID="953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.887533 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f"} err="failed to get container status \"953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f\": rpc error: code = NotFound desc = could not find container \"953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f\": container with ID starting with 953b60d950de0afc501100c405c5db9fc2388e021a5863130b5e14462223195f not found: ID does not 
exist" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.901079 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.942611 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.979506 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34375e56-6366-4661-88d4-dcb414d81972" path="/var/lib/kubelet/pods/34375e56-6366-4661-88d4-dcb414d81972/volumes" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.982383 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:55 crc kubenswrapper[4973]: E0320 13:50:55.983692 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34375e56-6366-4661-88d4-dcb414d81972" containerName="proxy-httpd" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.983716 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="34375e56-6366-4661-88d4-dcb414d81972" containerName="proxy-httpd" Mar 20 13:50:55 crc kubenswrapper[4973]: E0320 13:50:55.983744 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34375e56-6366-4661-88d4-dcb414d81972" containerName="ceilometer-notification-agent" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.983750 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="34375e56-6366-4661-88d4-dcb414d81972" containerName="ceilometer-notification-agent" Mar 20 13:50:55 crc kubenswrapper[4973]: E0320 13:50:55.983766 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34375e56-6366-4661-88d4-dcb414d81972" containerName="sg-core" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.983771 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="34375e56-6366-4661-88d4-dcb414d81972" containerName="sg-core" Mar 20 13:50:55 crc kubenswrapper[4973]: E0320 13:50:55.983799 4973 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="34375e56-6366-4661-88d4-dcb414d81972" containerName="ceilometer-central-agent" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.983805 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="34375e56-6366-4661-88d4-dcb414d81972" containerName="ceilometer-central-agent" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.984048 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="34375e56-6366-4661-88d4-dcb414d81972" containerName="ceilometer-central-agent" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.984071 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="34375e56-6366-4661-88d4-dcb414d81972" containerName="sg-core" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.984079 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="34375e56-6366-4661-88d4-dcb414d81972" containerName="proxy-httpd" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.984091 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="34375e56-6366-4661-88d4-dcb414d81972" containerName="ceilometer-notification-agent" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.990507 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.996896 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.998261 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.998439 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 20 13:50:55 crc kubenswrapper[4973]: I0320 13:50:55.999159 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.070162 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cde80e1-f72d-4080-a86b-5968a8904333-scripts\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.070554 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjlnk\" (UniqueName: \"kubernetes.io/projected/1cde80e1-f72d-4080-a86b-5968a8904333-kube-api-access-xjlnk\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.070638 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cde80e1-f72d-4080-a86b-5968a8904333-run-httpd\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.070836 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1cde80e1-f72d-4080-a86b-5968a8904333-config-data\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.070931 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cde80e1-f72d-4080-a86b-5968a8904333-log-httpd\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.071006 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cde80e1-f72d-4080-a86b-5968a8904333-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.071125 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cde80e1-f72d-4080-a86b-5968a8904333-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.071205 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cde80e1-f72d-4080-a86b-5968a8904333-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.173459 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cde80e1-f72d-4080-a86b-5968a8904333-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.173737 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cde80e1-f72d-4080-a86b-5968a8904333-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.173875 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cde80e1-f72d-4080-a86b-5968a8904333-scripts\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.174025 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xjlnk\" (UniqueName: \"kubernetes.io/projected/1cde80e1-f72d-4080-a86b-5968a8904333-kube-api-access-xjlnk\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.174125 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cde80e1-f72d-4080-a86b-5968a8904333-run-httpd\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.174314 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cde80e1-f72d-4080-a86b-5968a8904333-config-data\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.174441 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/1cde80e1-f72d-4080-a86b-5968a8904333-log-httpd\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.174525 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cde80e1-f72d-4080-a86b-5968a8904333-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.174842 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cde80e1-f72d-4080-a86b-5968a8904333-run-httpd\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.174946 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1cde80e1-f72d-4080-a86b-5968a8904333-log-httpd\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.179024 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cde80e1-f72d-4080-a86b-5968a8904333-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.179452 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1cde80e1-f72d-4080-a86b-5968a8904333-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 
13:50:56.181310 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1cde80e1-f72d-4080-a86b-5968a8904333-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.183700 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1cde80e1-f72d-4080-a86b-5968a8904333-config-data\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.190708 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1cde80e1-f72d-4080-a86b-5968a8904333-scripts\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.191697 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjlnk\" (UniqueName: \"kubernetes.io/projected/1cde80e1-f72d-4080-a86b-5968a8904333-kube-api-access-xjlnk\") pod \"ceilometer-0\" (UID: \"1cde80e1-f72d-4080-a86b-5968a8904333\") " pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.312037 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 20 13:50:56 crc kubenswrapper[4973]: I0320 13:50:56.780196 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 20 13:50:57 crc kubenswrapper[4973]: I0320 13:50:57.529440 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cde80e1-f72d-4080-a86b-5968a8904333","Type":"ContainerStarted","Data":"74fb5a5997efcd8d54e038d25192ec956878290af70c38fa1870530585b01d8f"} Mar 20 13:51:01 crc kubenswrapper[4973]: I0320 13:51:01.618621 4973 generic.go:334] "Generic (PLEG): container finished" podID="20780ec2-d338-45a4-9259-16a651e46e55" containerID="92e9958d24196db83cc045e1f87b9f21ff9358fb34e199690b6dd3b40a1daaaa" exitCode=0 Mar 20 13:51:01 crc kubenswrapper[4973]: I0320 13:51:01.618738 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"20780ec2-d338-45a4-9259-16a651e46e55","Type":"ContainerDied","Data":"92e9958d24196db83cc045e1f87b9f21ff9358fb34e199690b6dd3b40a1daaaa"} Mar 20 13:51:01 crc kubenswrapper[4973]: I0320 13:51:01.624132 4973 generic.go:334] "Generic (PLEG): container finished" podID="4ed60638-5022-406b-b568-7fa0d6bf4ba8" containerID="037f5251eaf4cb4f67de62e170130e7e06ee060a34d17561441e79a60d62fd3c" exitCode=0 Mar 20 13:51:01 crc kubenswrapper[4973]: I0320 13:51:01.624185 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"4ed60638-5022-406b-b568-7fa0d6bf4ba8","Type":"ContainerDied","Data":"037f5251eaf4cb4f67de62e170130e7e06ee060a34d17561441e79a60d62fd3c"} Mar 20 13:51:05 crc kubenswrapper[4973]: I0320 13:51:05.401534 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="20780ec2-d338-45a4-9259-16a651e46e55" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.132:5671: connect: connection refused" Mar 20 13:51:05 crc kubenswrapper[4973]: I0320 
13:51:05.789321 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="4ed60638-5022-406b-b568-7fa0d6bf4ba8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: connect: connection refused"
Mar 20 13:51:05 crc kubenswrapper[4973]: I0320 13:51:05.951245 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e"
Mar 20 13:51:05 crc kubenswrapper[4973]: E0320 13:51:05.951673 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.212588 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68df85789f-z974z"]
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.215046 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.218589 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.234336 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-z974z"]
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.306281 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-dns-svc\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.306334 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-config\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.306604 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s6q5\" (UniqueName: \"kubernetes.io/projected/98db46c5-a461-4c56-a0ce-666906912d6a-kube-api-access-7s6q5\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.306764 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-openstack-edpm-ipam\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.306822 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-dns-swift-storage-0\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.307040 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-ovsdbserver-nb\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.307124 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-ovsdbserver-sb\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.412606 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s6q5\" (UniqueName: \"kubernetes.io/projected/98db46c5-a461-4c56-a0ce-666906912d6a-kube-api-access-7s6q5\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.412782 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-openstack-edpm-ipam\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.412840 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-dns-swift-storage-0\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.413171 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-ovsdbserver-nb\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.413267 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-ovsdbserver-sb\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.413478 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-dns-svc\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.413511 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-config\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.414174 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-dns-swift-storage-0\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.414803 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-dns-svc\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.414835 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-ovsdbserver-nb\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.415004 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-openstack-edpm-ipam\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.415988 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-config\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.416481 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-ovsdbserver-sb\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.444437 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s6q5\" (UniqueName: \"kubernetes.io/projected/98db46c5-a461-4c56-a0ce-666906912d6a-kube-api-access-7s6q5\") pod \"dnsmasq-dns-68df85789f-z974z\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:06 crc kubenswrapper[4973]: I0320 13:51:06.575251 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-z974z"
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.156854 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2"
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.357463 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-tls\") pod \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.357687 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f28mv\" (UniqueName: \"kubernetes.io/projected/4ed60638-5022-406b-b568-7fa0d6bf4ba8-kube-api-access-f28mv\") pod \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.357847 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-erlang-cookie\") pod \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.357915 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-confd\") pod \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.358008 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-plugins\") pod \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.358118 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ed60638-5022-406b-b568-7fa0d6bf4ba8-server-conf\") pod \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.358174 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ed60638-5022-406b-b568-7fa0d6bf4ba8-config-data\") pod \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.358213 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ed60638-5022-406b-b568-7fa0d6bf4ba8-erlang-cookie-secret\") pod \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.358251 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ed60638-5022-406b-b568-7fa0d6bf4ba8-plugins-conf\") pod \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.358282 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ed60638-5022-406b-b568-7fa0d6bf4ba8-pod-info\") pod \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.361126 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\") pod \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\" (UID: \"4ed60638-5022-406b-b568-7fa0d6bf4ba8\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.361506 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4ed60638-5022-406b-b568-7fa0d6bf4ba8" (UID: "4ed60638-5022-406b-b568-7fa0d6bf4ba8"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.362822 4973 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.363729 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4ed60638-5022-406b-b568-7fa0d6bf4ba8" (UID: "4ed60638-5022-406b-b568-7fa0d6bf4ba8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.366580 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ed60638-5022-406b-b568-7fa0d6bf4ba8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4ed60638-5022-406b-b568-7fa0d6bf4ba8" (UID: "4ed60638-5022-406b-b568-7fa0d6bf4ba8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.370081 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ed60638-5022-406b-b568-7fa0d6bf4ba8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4ed60638-5022-406b-b568-7fa0d6bf4ba8" (UID: "4ed60638-5022-406b-b568-7fa0d6bf4ba8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.378656 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4ed60638-5022-406b-b568-7fa0d6bf4ba8-pod-info" (OuterVolumeSpecName: "pod-info") pod "4ed60638-5022-406b-b568-7fa0d6bf4ba8" (UID: "4ed60638-5022-406b-b568-7fa0d6bf4ba8"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.378768 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "4ed60638-5022-406b-b568-7fa0d6bf4ba8" (UID: "4ed60638-5022-406b-b568-7fa0d6bf4ba8"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.388106 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ed60638-5022-406b-b568-7fa0d6bf4ba8-kube-api-access-f28mv" (OuterVolumeSpecName: "kube-api-access-f28mv") pod "4ed60638-5022-406b-b568-7fa0d6bf4ba8" (UID: "4ed60638-5022-406b-b568-7fa0d6bf4ba8"). InnerVolumeSpecName "kube-api-access-f28mv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.444487 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52" (OuterVolumeSpecName: "persistence") pod "4ed60638-5022-406b-b568-7fa0d6bf4ba8" (UID: "4ed60638-5022-406b-b568-7fa0d6bf4ba8"). InnerVolumeSpecName "pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.459130 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ed60638-5022-406b-b568-7fa0d6bf4ba8-config-data" (OuterVolumeSpecName: "config-data") pod "4ed60638-5022-406b-b568-7fa0d6bf4ba8" (UID: "4ed60638-5022-406b-b568-7fa0d6bf4ba8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.468852 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4ed60638-5022-406b-b568-7fa0d6bf4ba8-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.468888 4973 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4ed60638-5022-406b-b568-7fa0d6bf4ba8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.468901 4973 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4ed60638-5022-406b-b568-7fa0d6bf4ba8-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.468910 4973 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4ed60638-5022-406b-b568-7fa0d6bf4ba8-pod-info\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.468939 4973 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\") on node \"crc\" "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.468951 4973 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.468963 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f28mv\" (UniqueName: \"kubernetes.io/projected/4ed60638-5022-406b-b568-7fa0d6bf4ba8-kube-api-access-f28mv\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.468976 4973 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.515989 4973 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.516282 4973 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52") on node "crc"
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.519698 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ed60638-5022-406b-b568-7fa0d6bf4ba8-server-conf" (OuterVolumeSpecName: "server-conf") pod "4ed60638-5022-406b-b568-7fa0d6bf4ba8" (UID: "4ed60638-5022-406b-b568-7fa0d6bf4ba8"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.554561 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4ed60638-5022-406b-b568-7fa0d6bf4ba8" (UID: "4ed60638-5022-406b-b568-7fa0d6bf4ba8"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.570282 4973 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4ed60638-5022-406b-b568-7fa0d6bf4ba8-server-conf\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.570318 4973 reconciler_common.go:293] "Volume detached for volume \"pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.570330 4973 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4ed60638-5022-406b-b568-7fa0d6bf4ba8-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.669048 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:51:16 crc kubenswrapper[4973]: E0320 13:51:16.698943 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested"
Mar 20 13:51:16 crc kubenswrapper[4973]: E0320 13:51:16.699009 4973 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested"
Mar 20 13:51:16 crc kubenswrapper[4973]: E0320 13:51:16.699149 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:heat-db-sync,Image:quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested,Command:[/bin/bash],Args:[-c /usr/bin/heat-manage --config-dir /etc/heat/heat.conf.d db_sync],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/heat/heat.conf.d/00-default.conf,SubPath:00-default.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/heat/heat.conf.d/01-custom.conf,SubPath:01-custom.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hx8xz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42418,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*42418,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-db-sync-bxbkc_openstack(ca1a0fe5-a203-41e8-814e-fec933be3407): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 13:51:16 crc kubenswrapper[4973]: E0320 13:51:16.700853 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/heat-db-sync-bxbkc" podUID="ca1a0fe5-a203-41e8-814e-fec933be3407"
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.775652 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/20780ec2-d338-45a4-9259-16a651e46e55-pod-info\") pod \"20780ec2-d338-45a4-9259-16a651e46e55\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.776037 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/20780ec2-d338-45a4-9259-16a651e46e55-server-conf\") pod \"20780ec2-d338-45a4-9259-16a651e46e55\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.776146 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-plugins\") pod \"20780ec2-d338-45a4-9259-16a651e46e55\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.776223 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-erlang-cookie\") pod \"20780ec2-d338-45a4-9259-16a651e46e55\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.776286 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/20780ec2-d338-45a4-9259-16a651e46e55-plugins-conf\") pod \"20780ec2-d338-45a4-9259-16a651e46e55\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.776444 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/20780ec2-d338-45a4-9259-16a651e46e55-erlang-cookie-secret\") pod \"20780ec2-d338-45a4-9259-16a651e46e55\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.776526 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20780ec2-d338-45a4-9259-16a651e46e55-config-data\") pod \"20780ec2-d338-45a4-9259-16a651e46e55\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.776551 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-confd\") pod \"20780ec2-d338-45a4-9259-16a651e46e55\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.776576 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-tls\") pod \"20780ec2-d338-45a4-9259-16a651e46e55\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.778137 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\") pod \"20780ec2-d338-45a4-9259-16a651e46e55\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.778258 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xs8k\" (UniqueName: \"kubernetes.io/projected/20780ec2-d338-45a4-9259-16a651e46e55-kube-api-access-6xs8k\") pod \"20780ec2-d338-45a4-9259-16a651e46e55\" (UID: \"20780ec2-d338-45a4-9259-16a651e46e55\") "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.780169 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "20780ec2-d338-45a4-9259-16a651e46e55" (UID: "20780ec2-d338-45a4-9259-16a651e46e55"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.780842 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "20780ec2-d338-45a4-9259-16a651e46e55" (UID: "20780ec2-d338-45a4-9259-16a651e46e55"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.786864 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20780ec2-d338-45a4-9259-16a651e46e55-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "20780ec2-d338-45a4-9259-16a651e46e55" (UID: "20780ec2-d338-45a4-9259-16a651e46e55"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.801619 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/20780ec2-d338-45a4-9259-16a651e46e55-pod-info" (OuterVolumeSpecName: "pod-info") pod "20780ec2-d338-45a4-9259-16a651e46e55" (UID: "20780ec2-d338-45a4-9259-16a651e46e55"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.810874 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "20780ec2-d338-45a4-9259-16a651e46e55" (UID: "20780ec2-d338-45a4-9259-16a651e46e55"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.816761 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20780ec2-d338-45a4-9259-16a651e46e55-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "20780ec2-d338-45a4-9259-16a651e46e55" (UID: "20780ec2-d338-45a4-9259-16a651e46e55"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.817029 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20780ec2-d338-45a4-9259-16a651e46e55-kube-api-access-6xs8k" (OuterVolumeSpecName: "kube-api-access-6xs8k") pod "20780ec2-d338-45a4-9259-16a651e46e55" (UID: "20780ec2-d338-45a4-9259-16a651e46e55"). InnerVolumeSpecName "kube-api-access-6xs8k". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.824802 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2"
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.827119 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"4ed60638-5022-406b-b568-7fa0d6bf4ba8","Type":"ContainerDied","Data":"45a0ee277eb1a8e5d445d9edeb1429a0a44c51aaa2702d64b321a8fa5e60ea0d"}
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.827185 4973 scope.go:117] "RemoveContainer" containerID="037f5251eaf4cb4f67de62e170130e7e06ee060a34d17561441e79a60d62fd3c"
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.838421 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a" (OuterVolumeSpecName: "persistence") pod "20780ec2-d338-45a4-9259-16a651e46e55" (UID: "20780ec2-d338-45a4-9259-16a651e46e55"). InnerVolumeSpecName "pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.843074 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0"
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.843379 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"20780ec2-d338-45a4-9259-16a651e46e55","Type":"ContainerDied","Data":"a171d65528ccf508aec93ce498b6de689df033ad309b08561c3b732cb72f2011"}
Mar 20 13:51:16 crc kubenswrapper[4973]: E0320 13:51:16.846599 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"heat-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-heat-engine:current-tested\\\"\"" pod="openstack/heat-db-sync-bxbkc" podUID="ca1a0fe5-a203-41e8-814e-fec933be3407"
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.865073 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20780ec2-d338-45a4-9259-16a651e46e55-config-data" (OuterVolumeSpecName: "config-data") pod "20780ec2-d338-45a4-9259-16a651e46e55" (UID: "20780ec2-d338-45a4-9259-16a651e46e55"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.889963 4973 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.890000 4973 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.890016 4973 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/20780ec2-d338-45a4-9259-16a651e46e55-plugins-conf\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.890049 4973 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/20780ec2-d338-45a4-9259-16a651e46e55-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.890060 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/20780ec2-d338-45a4-9259-16a651e46e55-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.890071 4973 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.890136 4973 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\") on node \"crc\" "
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.890153 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xs8k\" (UniqueName: \"kubernetes.io/projected/20780ec2-d338-45a4-9259-16a651e46e55-kube-api-access-6xs8k\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.890169 4973 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/20780ec2-d338-45a4-9259-16a651e46e55-pod-info\") on node \"crc\" DevicePath \"\""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.943085 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20780ec2-d338-45a4-9259-16a651e46e55-server-conf" (OuterVolumeSpecName: "server-conf") pod "20780ec2-d338-45a4-9259-16a651e46e55" (UID: "20780ec2-d338-45a4-9259-16a651e46e55"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.948289 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-2"]
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.973437 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-2"]
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.983188 4973 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice...
Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.983401 4973 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a") on node "crc" Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.992225 4973 reconciler_common.go:293] "Volume detached for volume \"pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.992261 4973 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/20780ec2-d338-45a4-9259-16a651e46e55-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.995078 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-2"] Mar 20 13:51:16 crc kubenswrapper[4973]: E0320 13:51:16.995681 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed60638-5022-406b-b568-7fa0d6bf4ba8" containerName="setup-container" Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.995703 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed60638-5022-406b-b568-7fa0d6bf4ba8" containerName="setup-container" Mar 20 13:51:16 crc kubenswrapper[4973]: E0320 13:51:16.995750 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20780ec2-d338-45a4-9259-16a651e46e55" containerName="setup-container" Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.995758 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="20780ec2-d338-45a4-9259-16a651e46e55" containerName="setup-container" Mar 20 13:51:16 crc kubenswrapper[4973]: E0320 13:51:16.995772 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ed60638-5022-406b-b568-7fa0d6bf4ba8" 
containerName="rabbitmq" Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.995779 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ed60638-5022-406b-b568-7fa0d6bf4ba8" containerName="rabbitmq" Mar 20 13:51:16 crc kubenswrapper[4973]: E0320 13:51:16.995807 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20780ec2-d338-45a4-9259-16a651e46e55" containerName="rabbitmq" Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.995814 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="20780ec2-d338-45a4-9259-16a651e46e55" containerName="rabbitmq" Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.996506 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ed60638-5022-406b-b568-7fa0d6bf4ba8" containerName="rabbitmq" Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.996554 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="20780ec2-d338-45a4-9259-16a651e46e55" containerName="rabbitmq" Mar 20 13:51:16 crc kubenswrapper[4973]: I0320 13:51:16.998400 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.022064 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.046574 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "20780ec2-d338-45a4-9259-16a651e46e55" (UID: "20780ec2-d338-45a4-9259-16a651e46e55"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.094796 4973 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/20780ec2-d338-45a4-9259-16a651e46e55-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.200109 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f8b4580d-53af-4c18-9f2c-b883b8621113-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.200481 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h72np\" (UniqueName: \"kubernetes.io/projected/f8b4580d-53af-4c18-9f2c-b883b8621113-kube-api-access-h72np\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.200658 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.200778 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f8b4580d-53af-4c18-9f2c-b883b8621113-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 
13:51:17.200835 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8b4580d-53af-4c18-9f2c-b883b8621113-config-data\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.200860 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f8b4580d-53af-4c18-9f2c-b883b8621113-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.200993 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f8b4580d-53af-4c18-9f2c-b883b8621113-pod-info\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.201370 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f8b4580d-53af-4c18-9f2c-b883b8621113-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.201623 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f8b4580d-53af-4c18-9f2c-b883b8621113-server-conf\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.201895 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f8b4580d-53af-4c18-9f2c-b883b8621113-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.202047 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f8b4580d-53af-4c18-9f2c-b883b8621113-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.239492 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.252060 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.304374 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f8b4580d-53af-4c18-9f2c-b883b8621113-server-conf\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.304456 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f8b4580d-53af-4c18-9f2c-b883b8621113-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.304517 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f8b4580d-53af-4c18-9f2c-b883b8621113-erlang-cookie-secret\") pod 
\"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.304547 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f8b4580d-53af-4c18-9f2c-b883b8621113-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.304621 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h72np\" (UniqueName: \"kubernetes.io/projected/f8b4580d-53af-4c18-9f2c-b883b8621113-kube-api-access-h72np\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.304655 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.304718 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f8b4580d-53af-4c18-9f2c-b883b8621113-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.304754 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8b4580d-53af-4c18-9f2c-b883b8621113-config-data\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " 
pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.304773 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f8b4580d-53af-4c18-9f2c-b883b8621113-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.304843 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f8b4580d-53af-4c18-9f2c-b883b8621113-pod-info\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.304925 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f8b4580d-53af-4c18-9f2c-b883b8621113-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.305450 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.305910 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/f8b4580d-53af-4c18-9f2c-b883b8621113-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.306754 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/f8b4580d-53af-4c18-9f2c-b883b8621113-config-data\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " 
pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.306805 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/f8b4580d-53af-4c18-9f2c-b883b8621113-rabbitmq-plugins\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.307171 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/f8b4580d-53af-4c18-9f2c-b883b8621113-plugins-conf\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.307696 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/f8b4580d-53af-4c18-9f2c-b883b8621113-server-conf\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.311290 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.311325 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1164d215983f7dc2868a1be48f97f5ce9134f1d9c7b479725732de51a01c727b/globalmount\"" pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.316549 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/f8b4580d-53af-4c18-9f2c-b883b8621113-rabbitmq-tls\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.335673 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/f8b4580d-53af-4c18-9f2c-b883b8621113-pod-info\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.344132 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/f8b4580d-53af-4c18-9f2c-b883b8621113-erlang-cookie-secret\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.344618 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/f8b4580d-53af-4c18-9f2c-b883b8621113-rabbitmq-confd\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " 
pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.348715 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h72np\" (UniqueName: \"kubernetes.io/projected/f8b4580d-53af-4c18-9f2c-b883b8621113-kube-api-access-h72np\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.350020 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.357836 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.366449 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.366591 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-trmvg" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.366982 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.367224 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.367441 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.367660 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.368973 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 20 13:51:17 crc 
kubenswrapper[4973]: I0320 13:51:17.426143 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-95f86f18-af9b-4797-b8b2-e3f3d8804c52\") pod \"rabbitmq-server-2\" (UID: \"f8b4580d-53af-4c18-9f2c-b883b8621113\") " pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.481912 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-2" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.509034 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.509078 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89a22e25-955b-4786-baf2-7138a668f512-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.509156 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ggvd\" (UniqueName: \"kubernetes.io/projected/89a22e25-955b-4786-baf2-7138a668f512-kube-api-access-7ggvd\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.509282 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" 
(UniqueName: \"kubernetes.io/empty-dir/89a22e25-955b-4786-baf2-7138a668f512-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.509389 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/89a22e25-955b-4786-baf2-7138a668f512-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.509411 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89a22e25-955b-4786-baf2-7138a668f512-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.509435 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/89a22e25-955b-4786-baf2-7138a668f512-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.509461 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/89a22e25-955b-4786-baf2-7138a668f512-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.509545 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/89a22e25-955b-4786-baf2-7138a668f512-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.509593 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89a22e25-955b-4786-baf2-7138a668f512-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.509676 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89a22e25-955b-4786-baf2-7138a668f512-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.612827 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/89a22e25-955b-4786-baf2-7138a668f512-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.612879 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89a22e25-955b-4786-baf2-7138a668f512-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.612916 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/89a22e25-955b-4786-baf2-7138a668f512-server-conf\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.613932 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/89a22e25-955b-4786-baf2-7138a668f512-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.614078 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/89a22e25-955b-4786-baf2-7138a668f512-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.614106 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/89a22e25-955b-4786-baf2-7138a668f512-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.614243 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89a22e25-955b-4786-baf2-7138a668f512-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.614970 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/89a22e25-955b-4786-baf2-7138a668f512-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 
13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.615035 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/89a22e25-955b-4786-baf2-7138a668f512-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.615448 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/89a22e25-955b-4786-baf2-7138a668f512-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.615655 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.615761 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89a22e25-955b-4786-baf2-7138a668f512-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.615812 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ggvd\" (UniqueName: \"kubernetes.io/projected/89a22e25-955b-4786-baf2-7138a668f512-kube-api-access-7ggvd\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.616078 4973 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/89a22e25-955b-4786-baf2-7138a668f512-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.616752 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/89a22e25-955b-4786-baf2-7138a668f512-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.618082 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/89a22e25-955b-4786-baf2-7138a668f512-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.618563 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/89a22e25-955b-4786-baf2-7138a668f512-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.619122 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/89a22e25-955b-4786-baf2-7138a668f512-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.619466 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/89a22e25-955b-4786-baf2-7138a668f512-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.619811 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/89a22e25-955b-4786-baf2-7138a668f512-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.622273 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.622371 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9287085a89b925405c99e289731435a8079d99592059ce42321d27146daffc28/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.635944 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ggvd\" (UniqueName: \"kubernetes.io/projected/89a22e25-955b-4786-baf2-7138a668f512-kube-api-access-7ggvd\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: E0320 13:51:17.675628 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Mar 20 13:51:17 crc kubenswrapper[4973]: E0320 13:51:17.675884 4973 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested" Mar 20 13:51:17 crc kubenswrapper[4973]: E0320 13:51:17.676125 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n559h686h566h86h664h698h559hffh5b6h5c4hdh569h5f4h674hf6h577h668hf9h556hd8h5d4h58chf4h59dh599h77h5c8h5c4h5dfh697h54bhc5q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xjlnk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandl
er:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 /var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(1cde80e1-f72d-4080-a86b-5968a8904333): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.679329 4973 scope.go:117] "RemoveContainer" containerID="1197fb893b7f76b1e4555b8d0ff5bfaca6b9fd60a6146cd1c20b9f45d87f3162" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.682268 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-b04e78e5-659b-4e93-8c0c-4bb9c3f6007a\") pod \"rabbitmq-cell1-server-0\" (UID: \"89a22e25-955b-4786-baf2-7138a668f512\") " pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.716088 4973 scope.go:117] "RemoveContainer" containerID="92e9958d24196db83cc045e1f87b9f21ff9358fb34e199690b6dd3b40a1daaaa" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.745093 4973 scope.go:117] "RemoveContainer" 
containerID="4fe78d802a26e68c3351b6b1daff5a6fc354aeb5b49e215fb4795ef080c317df" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.767948 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.985726 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20780ec2-d338-45a4-9259-16a651e46e55" path="/var/lib/kubelet/pods/20780ec2-d338-45a4-9259-16a651e46e55/volumes" Mar 20 13:51:17 crc kubenswrapper[4973]: I0320 13:51:17.987075 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ed60638-5022-406b-b568-7fa0d6bf4ba8" path="/var/lib/kubelet/pods/4ed60638-5022-406b-b568-7fa0d6bf4ba8/volumes" Mar 20 13:51:18 crc kubenswrapper[4973]: I0320 13:51:18.217231 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-2"] Mar 20 13:51:18 crc kubenswrapper[4973]: I0320 13:51:18.358171 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-z974z"] Mar 20 13:51:18 crc kubenswrapper[4973]: W0320 13:51:18.359876 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod98db46c5_a461_4c56_a0ce_666906912d6a.slice/crio-adc0ffd78ec6135c65fdbbf678493c6681beb60ff0d253326729610ae6dd24c3 WatchSource:0}: Error finding container adc0ffd78ec6135c65fdbbf678493c6681beb60ff0d253326729610ae6dd24c3: Status 404 returned error can't find the container with id adc0ffd78ec6135c65fdbbf678493c6681beb60ff0d253326729610ae6dd24c3 Mar 20 13:51:18 crc kubenswrapper[4973]: W0320 13:51:18.505916 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89a22e25_955b_4786_baf2_7138a668f512.slice/crio-2f12836cdf098e25cb6c04b8fc852b6efbe49410c0a2ab1265d1123fea21a752 WatchSource:0}: Error finding container 
2f12836cdf098e25cb6c04b8fc852b6efbe49410c0a2ab1265d1123fea21a752: Status 404 returned error can't find the container with id 2f12836cdf098e25cb6c04b8fc852b6efbe49410c0a2ab1265d1123fea21a752 Mar 20 13:51:18 crc kubenswrapper[4973]: I0320 13:51:18.508170 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 20 13:51:18 crc kubenswrapper[4973]: I0320 13:51:18.870380 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"f8b4580d-53af-4c18-9f2c-b883b8621113","Type":"ContainerStarted","Data":"9110663df2b40c6e7c617b671b016f3adc4eea0b8793689a8ab4acc5f7ffc411"} Mar 20 13:51:18 crc kubenswrapper[4973]: I0320 13:51:18.872624 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cde80e1-f72d-4080-a86b-5968a8904333","Type":"ContainerStarted","Data":"800654125f78fdf5a9132ee5dda7e8cd40f7a094f028f4f95050718d0a1f9a46"} Mar 20 13:51:18 crc kubenswrapper[4973]: I0320 13:51:18.874017 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"89a22e25-955b-4786-baf2-7138a668f512","Type":"ContainerStarted","Data":"2f12836cdf098e25cb6c04b8fc852b6efbe49410c0a2ab1265d1123fea21a752"} Mar 20 13:51:18 crc kubenswrapper[4973]: I0320 13:51:18.876016 4973 generic.go:334] "Generic (PLEG): container finished" podID="98db46c5-a461-4c56-a0ce-666906912d6a" containerID="b1134f08b65731b5aa9fc3da72c28c4c63aa244f25e847cf489f553e46b7b06e" exitCode=0 Mar 20 13:51:18 crc kubenswrapper[4973]: I0320 13:51:18.876055 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-z974z" event={"ID":"98db46c5-a461-4c56-a0ce-666906912d6a","Type":"ContainerDied","Data":"b1134f08b65731b5aa9fc3da72c28c4c63aa244f25e847cf489f553e46b7b06e"} Mar 20 13:51:18 crc kubenswrapper[4973]: I0320 13:51:18.876076 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-z974z" 
event={"ID":"98db46c5-a461-4c56-a0ce-666906912d6a","Type":"ContainerStarted","Data":"adc0ffd78ec6135c65fdbbf678493c6681beb60ff0d253326729610ae6dd24c3"} Mar 20 13:51:19 crc kubenswrapper[4973]: I0320 13:51:19.889358 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-z974z" event={"ID":"98db46c5-a461-4c56-a0ce-666906912d6a","Type":"ContainerStarted","Data":"4f8d6102d4db48ee1a767bc3a8d199e6ddeb6beea12b8f5239061004141985ca"} Mar 20 13:51:19 crc kubenswrapper[4973]: I0320 13:51:19.889652 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68df85789f-z974z" Mar 20 13:51:19 crc kubenswrapper[4973]: I0320 13:51:19.891815 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cde80e1-f72d-4080-a86b-5968a8904333","Type":"ContainerStarted","Data":"1120829b40f5f887499f84e1d473a8ce1bd8992f9a5c0d8fcd2eec4bd926cbf0"} Mar 20 13:51:19 crc kubenswrapper[4973]: I0320 13:51:19.916638 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68df85789f-z974z" podStartSLOduration=13.916613051 podStartE2EDuration="13.916613051s" podCreationTimestamp="2026-03-20 13:51:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:51:19.906860825 +0000 UTC m=+1800.650530589" watchObservedRunningTime="2026-03-20 13:51:19.916613051 +0000 UTC m=+1800.660282795" Mar 20 13:51:19 crc kubenswrapper[4973]: I0320 13:51:19.963178 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:51:19 crc kubenswrapper[4973]: E0320 13:51:19.963863 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 13:51:20 crc kubenswrapper[4973]: I0320 13:51:20.403134 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="20780ec2-d338-45a4-9259-16a651e46e55" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.132:5671: i/o timeout" Mar 20 13:51:20 crc kubenswrapper[4973]: I0320 13:51:20.789486 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="4ed60638-5022-406b-b568-7fa0d6bf4ba8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.135:5671: i/o timeout" Mar 20 13:51:20 crc kubenswrapper[4973]: I0320 13:51:20.907059 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"89a22e25-955b-4786-baf2-7138a668f512","Type":"ContainerStarted","Data":"6b8230c0351f652601a4eca4fb74594d0c76891c01f5777543668b86a1d543e0"} Mar 20 13:51:20 crc kubenswrapper[4973]: I0320 13:51:20.909611 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"f8b4580d-53af-4c18-9f2c-b883b8621113","Type":"ContainerStarted","Data":"2cbee18dc61aa85750195a2e7564ae5ffeaa5102a8fbe567336b64dd93aea919"} Mar 20 13:51:21 crc kubenswrapper[4973]: E0320 13:51:21.446999 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="1cde80e1-f72d-4080-a86b-5968a8904333" Mar 20 13:51:21 crc kubenswrapper[4973]: I0320 13:51:21.923799 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1cde80e1-f72d-4080-a86b-5968a8904333","Type":"ContainerStarted","Data":"c99db2067c2b1f267bc68b9ea0b8bcc294943d07a41184f9b07b2fd004db8186"} Mar 20 13:51:21 crc kubenswrapper[4973]: E0320 13:51:21.926480 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="1cde80e1-f72d-4080-a86b-5968a8904333" Mar 20 13:51:22 crc kubenswrapper[4973]: I0320 13:51:22.933128 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 20 13:51:22 crc kubenswrapper[4973]: E0320 13:51:22.935121 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="1cde80e1-f72d-4080-a86b-5968a8904333" Mar 20 13:51:23 crc kubenswrapper[4973]: E0320 13:51:23.946542 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.rdoproject.org/podified-master-centos10/openstack-ceilometer-central:current-tested\\\"\"" pod="openstack/ceilometer-0" podUID="1cde80e1-f72d-4080-a86b-5968a8904333" Mar 20 13:51:26 crc kubenswrapper[4973]: I0320 13:51:26.577498 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68df85789f-z974z" Mar 20 13:51:26 crc kubenswrapper[4973]: I0320 13:51:26.668709 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-nt6ht"] Mar 20 13:51:26 crc kubenswrapper[4973]: I0320 13:51:26.669054 4973 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" podUID="56685fe5-5182-46f0-84f8-c8a40d42a3d2" containerName="dnsmasq-dns" containerID="cri-o://c2866e2a87b3d4355519c2a6554318792c9a3df4fb6b3a0e2a24fb7c651776b1" gracePeriod=10 Mar 20 13:51:26 crc kubenswrapper[4973]: I0320 13:51:26.901897 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bb85b8995-9fz8b"] Mar 20 13:51:26 crc kubenswrapper[4973]: I0320 13:51:26.905123 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:26 crc kubenswrapper[4973]: I0320 13:51:26.921916 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bb85b8995-9fz8b"] Mar 20 13:51:26 crc kubenswrapper[4973]: I0320 13:51:26.981772 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bkn66\" (UniqueName: \"kubernetes.io/projected/9253dc20-632e-44ce-8d38-e452186eddbd-kube-api-access-bkn66\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:26 crc kubenswrapper[4973]: I0320 13:51:26.981851 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9253dc20-632e-44ce-8d38-e452186eddbd-config\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:26 crc kubenswrapper[4973]: I0320 13:51:26.982032 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9253dc20-632e-44ce-8d38-e452186eddbd-openstack-edpm-ipam\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:26 
crc kubenswrapper[4973]: I0320 13:51:26.982319 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9253dc20-632e-44ce-8d38-e452186eddbd-ovsdbserver-sb\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:26 crc kubenswrapper[4973]: I0320 13:51:26.982534 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9253dc20-632e-44ce-8d38-e452186eddbd-dns-swift-storage-0\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:26 crc kubenswrapper[4973]: I0320 13:51:26.982598 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9253dc20-632e-44ce-8d38-e452186eddbd-dns-svc\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:26 crc kubenswrapper[4973]: I0320 13:51:26.986168 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9253dc20-632e-44ce-8d38-e452186eddbd-ovsdbserver-nb\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:26 crc kubenswrapper[4973]: I0320 13:51:26.987594 4973 generic.go:334] "Generic (PLEG): container finished" podID="56685fe5-5182-46f0-84f8-c8a40d42a3d2" containerID="c2866e2a87b3d4355519c2a6554318792c9a3df4fb6b3a0e2a24fb7c651776b1" exitCode=0 Mar 20 13:51:26 crc kubenswrapper[4973]: I0320 13:51:26.987637 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" event={"ID":"56685fe5-5182-46f0-84f8-c8a40d42a3d2","Type":"ContainerDied","Data":"c2866e2a87b3d4355519c2a6554318792c9a3df4fb6b3a0e2a24fb7c651776b1"} Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.088407 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9253dc20-632e-44ce-8d38-e452186eddbd-config\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.088473 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9253dc20-632e-44ce-8d38-e452186eddbd-openstack-edpm-ipam\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.088535 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9253dc20-632e-44ce-8d38-e452186eddbd-ovsdbserver-sb\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.088570 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9253dc20-632e-44ce-8d38-e452186eddbd-dns-swift-storage-0\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.088592 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9253dc20-632e-44ce-8d38-e452186eddbd-dns-svc\") pod 
\"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.088615 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9253dc20-632e-44ce-8d38-e452186eddbd-ovsdbserver-nb\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.088713 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bkn66\" (UniqueName: \"kubernetes.io/projected/9253dc20-632e-44ce-8d38-e452186eddbd-kube-api-access-bkn66\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.089868 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9253dc20-632e-44ce-8d38-e452186eddbd-config\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.090396 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/9253dc20-632e-44ce-8d38-e452186eddbd-openstack-edpm-ipam\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.091000 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9253dc20-632e-44ce-8d38-e452186eddbd-ovsdbserver-sb\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") 
" pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.091618 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9253dc20-632e-44ce-8d38-e452186eddbd-dns-swift-storage-0\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.092255 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9253dc20-632e-44ce-8d38-e452186eddbd-dns-svc\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.093107 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9253dc20-632e-44ce-8d38-e452186eddbd-ovsdbserver-nb\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.126987 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bkn66\" (UniqueName: \"kubernetes.io/projected/9253dc20-632e-44ce-8d38-e452186eddbd-kube-api-access-bkn66\") pod \"dnsmasq-dns-bb85b8995-9fz8b\" (UID: \"9253dc20-632e-44ce-8d38-e452186eddbd\") " pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.242281 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.460439 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.505925 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t88q\" (UniqueName: \"kubernetes.io/projected/56685fe5-5182-46f0-84f8-c8a40d42a3d2-kube-api-access-9t88q\") pod \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.506040 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-dns-svc\") pod \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.506145 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-dns-swift-storage-0\") pod \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.506175 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-config\") pod \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.506263 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-ovsdbserver-sb\") pod \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.506380 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-ovsdbserver-nb\") pod \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\" (UID: \"56685fe5-5182-46f0-84f8-c8a40d42a3d2\") " Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.535031 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56685fe5-5182-46f0-84f8-c8a40d42a3d2-kube-api-access-9t88q" (OuterVolumeSpecName: "kube-api-access-9t88q") pod "56685fe5-5182-46f0-84f8-c8a40d42a3d2" (UID: "56685fe5-5182-46f0-84f8-c8a40d42a3d2"). InnerVolumeSpecName "kube-api-access-9t88q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.603534 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-config" (OuterVolumeSpecName: "config") pod "56685fe5-5182-46f0-84f8-c8a40d42a3d2" (UID: "56685fe5-5182-46f0-84f8-c8a40d42a3d2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.603643 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "56685fe5-5182-46f0-84f8-c8a40d42a3d2" (UID: "56685fe5-5182-46f0-84f8-c8a40d42a3d2"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.609516 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t88q\" (UniqueName: \"kubernetes.io/projected/56685fe5-5182-46f0-84f8-c8a40d42a3d2-kube-api-access-9t88q\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.609545 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.609554 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.610809 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "56685fe5-5182-46f0-84f8-c8a40d42a3d2" (UID: "56685fe5-5182-46f0-84f8-c8a40d42a3d2"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.623450 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "56685fe5-5182-46f0-84f8-c8a40d42a3d2" (UID: "56685fe5-5182-46f0-84f8-c8a40d42a3d2"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.652616 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "56685fe5-5182-46f0-84f8-c8a40d42a3d2" (UID: "56685fe5-5182-46f0-84f8-c8a40d42a3d2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.711200 4973 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.711233 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.711244 4973 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/56685fe5-5182-46f0-84f8-c8a40d42a3d2-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:27 crc kubenswrapper[4973]: I0320 13:51:27.911264 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bb85b8995-9fz8b"] Mar 20 13:51:28 crc kubenswrapper[4973]: I0320 13:51:28.147694 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" event={"ID":"9253dc20-632e-44ce-8d38-e452186eddbd","Type":"ContainerStarted","Data":"1851bc01046e543e957adeb8caabf4d4adeacfe950639a9c91ce160858c63768"} Mar 20 13:51:28 crc kubenswrapper[4973]: I0320 13:51:28.160216 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" 
event={"ID":"56685fe5-5182-46f0-84f8-c8a40d42a3d2","Type":"ContainerDied","Data":"c3df0d88aa31b4738d70ef5fd86b4e819ca4b66a528f195e75d1d27720c7862d"} Mar 20 13:51:28 crc kubenswrapper[4973]: I0320 13:51:28.160296 4973 scope.go:117] "RemoveContainer" containerID="c2866e2a87b3d4355519c2a6554318792c9a3df4fb6b3a0e2a24fb7c651776b1" Mar 20 13:51:28 crc kubenswrapper[4973]: I0320 13:51:28.161766 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-79b5d74c8c-nt6ht" Mar 20 13:51:28 crc kubenswrapper[4973]: I0320 13:51:28.203566 4973 scope.go:117] "RemoveContainer" containerID="043ff4cea4f53ab126f52fec6d086b0adc436b76c84625aa64e5e1faa839a350" Mar 20 13:51:28 crc kubenswrapper[4973]: I0320 13:51:28.206095 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-nt6ht"] Mar 20 13:51:28 crc kubenswrapper[4973]: I0320 13:51:28.230564 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-79b5d74c8c-nt6ht"] Mar 20 13:51:29 crc kubenswrapper[4973]: I0320 13:51:29.191088 4973 generic.go:334] "Generic (PLEG): container finished" podID="9253dc20-632e-44ce-8d38-e452186eddbd" containerID="5c252a726fb24e023e7dc8fe3f1dbd08a5f163ba5845e83e6ecfbce76f8003fe" exitCode=0 Mar 20 13:51:29 crc kubenswrapper[4973]: I0320 13:51:29.191169 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" event={"ID":"9253dc20-632e-44ce-8d38-e452186eddbd","Type":"ContainerDied","Data":"5c252a726fb24e023e7dc8fe3f1dbd08a5f163ba5845e83e6ecfbce76f8003fe"} Mar 20 13:51:29 crc kubenswrapper[4973]: I0320 13:51:29.962263 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56685fe5-5182-46f0-84f8-c8a40d42a3d2" path="/var/lib/kubelet/pods/56685fe5-5182-46f0-84f8-c8a40d42a3d2/volumes" Mar 20 13:51:30 crc kubenswrapper[4973]: I0320 13:51:30.204182 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" event={"ID":"9253dc20-632e-44ce-8d38-e452186eddbd","Type":"ContainerStarted","Data":"d75d081fdcfa6397f65a5bfec44baabe4b9b6ade9c596ccd3104e41a0c375c5f"} Mar 20 13:51:30 crc kubenswrapper[4973]: I0320 13:51:30.204481 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:30 crc kubenswrapper[4973]: I0320 13:51:30.227324 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" podStartSLOduration=4.227293824 podStartE2EDuration="4.227293824s" podCreationTimestamp="2026-03-20 13:51:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:51:30.225528036 +0000 UTC m=+1810.969197810" watchObservedRunningTime="2026-03-20 13:51:30.227293824 +0000 UTC m=+1810.970963568" Mar 20 13:51:33 crc kubenswrapper[4973]: I0320 13:51:33.240827 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-bxbkc" event={"ID":"ca1a0fe5-a203-41e8-814e-fec933be3407","Type":"ContainerStarted","Data":"35d105e6f6d44075fb976737535843d99d453b8916eb02a9fdc45b90c0818a3a"} Mar 20 13:51:33 crc kubenswrapper[4973]: I0320 13:51:33.270058 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-bxbkc" podStartSLOduration=2.285239514 podStartE2EDuration="46.270041339s" podCreationTimestamp="2026-03-20 13:50:47 +0000 UTC" firstStartedPulling="2026-03-20 13:50:48.15666595 +0000 UTC m=+1768.900335694" lastFinishedPulling="2026-03-20 13:51:32.141467775 +0000 UTC m=+1812.885137519" observedRunningTime="2026-03-20 13:51:33.263633614 +0000 UTC m=+1814.007303378" watchObservedRunningTime="2026-03-20 13:51:33.270041339 +0000 UTC m=+1814.013711083" Mar 20 13:51:33 crc kubenswrapper[4973]: I0320 13:51:33.950588 4973 scope.go:117] "RemoveContainer" 
containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:51:33 crc kubenswrapper[4973]: E0320 13:51:33.951245 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 13:51:35 crc kubenswrapper[4973]: I0320 13:51:35.262791 4973 generic.go:334] "Generic (PLEG): container finished" podID="ca1a0fe5-a203-41e8-814e-fec933be3407" containerID="35d105e6f6d44075fb976737535843d99d453b8916eb02a9fdc45b90c0818a3a" exitCode=0 Mar 20 13:51:35 crc kubenswrapper[4973]: I0320 13:51:35.262873 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-bxbkc" event={"ID":"ca1a0fe5-a203-41e8-814e-fec933be3407","Type":"ContainerDied","Data":"35d105e6f6d44075fb976737535843d99d453b8916eb02a9fdc45b90c0818a3a"} Mar 20 13:51:36 crc kubenswrapper[4973]: I0320 13:51:36.844959 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-bxbkc" Mar 20 13:51:36 crc kubenswrapper[4973]: I0320 13:51:36.914764 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1a0fe5-a203-41e8-814e-fec933be3407-config-data\") pod \"ca1a0fe5-a203-41e8-814e-fec933be3407\" (UID: \"ca1a0fe5-a203-41e8-814e-fec933be3407\") " Mar 20 13:51:36 crc kubenswrapper[4973]: I0320 13:51:36.915084 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1a0fe5-a203-41e8-814e-fec933be3407-combined-ca-bundle\") pod \"ca1a0fe5-a203-41e8-814e-fec933be3407\" (UID: \"ca1a0fe5-a203-41e8-814e-fec933be3407\") " Mar 20 13:51:36 crc kubenswrapper[4973]: I0320 13:51:36.915253 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hx8xz\" (UniqueName: \"kubernetes.io/projected/ca1a0fe5-a203-41e8-814e-fec933be3407-kube-api-access-hx8xz\") pod \"ca1a0fe5-a203-41e8-814e-fec933be3407\" (UID: \"ca1a0fe5-a203-41e8-814e-fec933be3407\") " Mar 20 13:51:36 crc kubenswrapper[4973]: I0320 13:51:36.929253 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ca1a0fe5-a203-41e8-814e-fec933be3407-kube-api-access-hx8xz" (OuterVolumeSpecName: "kube-api-access-hx8xz") pod "ca1a0fe5-a203-41e8-814e-fec933be3407" (UID: "ca1a0fe5-a203-41e8-814e-fec933be3407"). InnerVolumeSpecName "kube-api-access-hx8xz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:37 crc kubenswrapper[4973]: I0320 13:51:37.015006 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca1a0fe5-a203-41e8-814e-fec933be3407-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ca1a0fe5-a203-41e8-814e-fec933be3407" (UID: "ca1a0fe5-a203-41e8-814e-fec933be3407"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:37 crc kubenswrapper[4973]: I0320 13:51:37.017735 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ca1a0fe5-a203-41e8-814e-fec933be3407-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:37 crc kubenswrapper[4973]: I0320 13:51:37.017878 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hx8xz\" (UniqueName: \"kubernetes.io/projected/ca1a0fe5-a203-41e8-814e-fec933be3407-kube-api-access-hx8xz\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:37 crc kubenswrapper[4973]: I0320 13:51:37.111519 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ca1a0fe5-a203-41e8-814e-fec933be3407-config-data" (OuterVolumeSpecName: "config-data") pod "ca1a0fe5-a203-41e8-814e-fec933be3407" (UID: "ca1a0fe5-a203-41e8-814e-fec933be3407"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:37 crc kubenswrapper[4973]: I0320 13:51:37.120566 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ca1a0fe5-a203-41e8-814e-fec933be3407-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:37 crc kubenswrapper[4973]: I0320 13:51:37.244543 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bb85b8995-9fz8b" Mar 20 13:51:37 crc kubenswrapper[4973]: I0320 13:51:37.312287 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-bxbkc" event={"ID":"ca1a0fe5-a203-41e8-814e-fec933be3407","Type":"ContainerDied","Data":"4e096a3005ed4092041862163ca39d3d459135f2313ed0fd3da7dc77d348a12e"} Mar 20 13:51:37 crc kubenswrapper[4973]: I0320 13:51:37.312351 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e096a3005ed4092041862163ca39d3d459135f2313ed0fd3da7dc77d348a12e" Mar 
20 13:51:37 crc kubenswrapper[4973]: I0320 13:51:37.312430 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-bxbkc" Mar 20 13:51:37 crc kubenswrapper[4973]: I0320 13:51:37.367270 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-z974z"] Mar 20 13:51:37 crc kubenswrapper[4973]: I0320 13:51:37.367518 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68df85789f-z974z" podUID="98db46c5-a461-4c56-a0ce-666906912d6a" containerName="dnsmasq-dns" containerID="cri-o://4f8d6102d4db48ee1a767bc3a8d199e6ddeb6beea12b8f5239061004141985ca" gracePeriod=10 Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.013709 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.032037 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-z974z" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.151148 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-ovsdbserver-nb\") pod \"98db46c5-a461-4c56-a0ce-666906912d6a\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.151251 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s6q5\" (UniqueName: \"kubernetes.io/projected/98db46c5-a461-4c56-a0ce-666906912d6a-kube-api-access-7s6q5\") pod \"98db46c5-a461-4c56-a0ce-666906912d6a\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.151274 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-dns-svc\") pod \"98db46c5-a461-4c56-a0ce-666906912d6a\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.151293 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-dns-swift-storage-0\") pod \"98db46c5-a461-4c56-a0ce-666906912d6a\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.151364 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-config\") pod \"98db46c5-a461-4c56-a0ce-666906912d6a\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.151509 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-ovsdbserver-sb\") pod \"98db46c5-a461-4c56-a0ce-666906912d6a\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.151540 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-openstack-edpm-ipam\") pod \"98db46c5-a461-4c56-a0ce-666906912d6a\" (UID: \"98db46c5-a461-4c56-a0ce-666906912d6a\") " Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.179912 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98db46c5-a461-4c56-a0ce-666906912d6a-kube-api-access-7s6q5" (OuterVolumeSpecName: "kube-api-access-7s6q5") pod "98db46c5-a461-4c56-a0ce-666906912d6a" (UID: "98db46c5-a461-4c56-a0ce-666906912d6a"). InnerVolumeSpecName "kube-api-access-7s6q5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.230460 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "98db46c5-a461-4c56-a0ce-666906912d6a" (UID: "98db46c5-a461-4c56-a0ce-666906912d6a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.230583 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "98db46c5-a461-4c56-a0ce-666906912d6a" (UID: "98db46c5-a461-4c56-a0ce-666906912d6a"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.254081 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s6q5\" (UniqueName: \"kubernetes.io/projected/98db46c5-a461-4c56-a0ce-666906912d6a-kube-api-access-7s6q5\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.254117 4973 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.254126 4973 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.266637 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "98db46c5-a461-4c56-a0ce-666906912d6a" (UID: "98db46c5-a461-4c56-a0ce-666906912d6a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.274501 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "98db46c5-a461-4c56-a0ce-666906912d6a" (UID: "98db46c5-a461-4c56-a0ce-666906912d6a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.284508 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "98db46c5-a461-4c56-a0ce-666906912d6a" (UID: "98db46c5-a461-4c56-a0ce-666906912d6a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.312952 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-config" (OuterVolumeSpecName: "config") pod "98db46c5-a461-4c56-a0ce-666906912d6a" (UID: "98db46c5-a461-4c56-a0ce-666906912d6a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.324830 4973 generic.go:334] "Generic (PLEG): container finished" podID="98db46c5-a461-4c56-a0ce-666906912d6a" containerID="4f8d6102d4db48ee1a767bc3a8d199e6ddeb6beea12b8f5239061004141985ca" exitCode=0 Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.324872 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-z974z" event={"ID":"98db46c5-a461-4c56-a0ce-666906912d6a","Type":"ContainerDied","Data":"4f8d6102d4db48ee1a767bc3a8d199e6ddeb6beea12b8f5239061004141985ca"} Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.324901 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68df85789f-z974z" event={"ID":"98db46c5-a461-4c56-a0ce-666906912d6a","Type":"ContainerDied","Data":"adc0ffd78ec6135c65fdbbf678493c6681beb60ff0d253326729610ae6dd24c3"} Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.324917 4973 scope.go:117] "RemoveContainer" containerID="4f8d6102d4db48ee1a767bc3a8d199e6ddeb6beea12b8f5239061004141985ca" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.325043 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68df85789f-z974z" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.356650 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.356679 4973 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.356689 4973 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.356698 4973 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98db46c5-a461-4c56-a0ce-666906912d6a-config\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.380506 4973 scope.go:117] "RemoveContainer" containerID="b1134f08b65731b5aa9fc3da72c28c4c63aa244f25e847cf489f553e46b7b06e" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.417164 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-z974z"] Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.418648 4973 scope.go:117] "RemoveContainer" containerID="4f8d6102d4db48ee1a767bc3a8d199e6ddeb6beea12b8f5239061004141985ca" Mar 20 13:51:38 crc kubenswrapper[4973]: E0320 13:51:38.419092 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f8d6102d4db48ee1a767bc3a8d199e6ddeb6beea12b8f5239061004141985ca\": container with ID starting with 4f8d6102d4db48ee1a767bc3a8d199e6ddeb6beea12b8f5239061004141985ca not 
found: ID does not exist" containerID="4f8d6102d4db48ee1a767bc3a8d199e6ddeb6beea12b8f5239061004141985ca" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.419122 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f8d6102d4db48ee1a767bc3a8d199e6ddeb6beea12b8f5239061004141985ca"} err="failed to get container status \"4f8d6102d4db48ee1a767bc3a8d199e6ddeb6beea12b8f5239061004141985ca\": rpc error: code = NotFound desc = could not find container \"4f8d6102d4db48ee1a767bc3a8d199e6ddeb6beea12b8f5239061004141985ca\": container with ID starting with 4f8d6102d4db48ee1a767bc3a8d199e6ddeb6beea12b8f5239061004141985ca not found: ID does not exist" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.419147 4973 scope.go:117] "RemoveContainer" containerID="b1134f08b65731b5aa9fc3da72c28c4c63aa244f25e847cf489f553e46b7b06e" Mar 20 13:51:38 crc kubenswrapper[4973]: E0320 13:51:38.419496 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1134f08b65731b5aa9fc3da72c28c4c63aa244f25e847cf489f553e46b7b06e\": container with ID starting with b1134f08b65731b5aa9fc3da72c28c4c63aa244f25e847cf489f553e46b7b06e not found: ID does not exist" containerID="b1134f08b65731b5aa9fc3da72c28c4c63aa244f25e847cf489f553e46b7b06e" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.419528 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1134f08b65731b5aa9fc3da72c28c4c63aa244f25e847cf489f553e46b7b06e"} err="failed to get container status \"b1134f08b65731b5aa9fc3da72c28c4c63aa244f25e847cf489f553e46b7b06e\": rpc error: code = NotFound desc = could not find container \"b1134f08b65731b5aa9fc3da72c28c4c63aa244f25e847cf489f553e46b7b06e\": container with ID starting with b1134f08b65731b5aa9fc3da72c28c4c63aa244f25e847cf489f553e46b7b06e not found: ID does not exist" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.433170 
4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68df85789f-z974z"] Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.827388 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-6b4775c997-mc4n9"] Mar 20 13:51:38 crc kubenswrapper[4973]: E0320 13:51:38.828121 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56685fe5-5182-46f0-84f8-c8a40d42a3d2" containerName="dnsmasq-dns" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.828148 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="56685fe5-5182-46f0-84f8-c8a40d42a3d2" containerName="dnsmasq-dns" Mar 20 13:51:38 crc kubenswrapper[4973]: E0320 13:51:38.828165 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56685fe5-5182-46f0-84f8-c8a40d42a3d2" containerName="init" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.828174 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="56685fe5-5182-46f0-84f8-c8a40d42a3d2" containerName="init" Mar 20 13:51:38 crc kubenswrapper[4973]: E0320 13:51:38.828208 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ca1a0fe5-a203-41e8-814e-fec933be3407" containerName="heat-db-sync" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.828219 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="ca1a0fe5-a203-41e8-814e-fec933be3407" containerName="heat-db-sync" Mar 20 13:51:38 crc kubenswrapper[4973]: E0320 13:51:38.828237 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98db46c5-a461-4c56-a0ce-666906912d6a" containerName="init" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.828245 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="98db46c5-a461-4c56-a0ce-666906912d6a" containerName="init" Mar 20 13:51:38 crc kubenswrapper[4973]: E0320 13:51:38.828271 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98db46c5-a461-4c56-a0ce-666906912d6a" containerName="dnsmasq-dns" Mar 20 
13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.828279 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="98db46c5-a461-4c56-a0ce-666906912d6a" containerName="dnsmasq-dns" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.828614 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="98db46c5-a461-4c56-a0ce-666906912d6a" containerName="dnsmasq-dns" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.828645 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="ca1a0fe5-a203-41e8-814e-fec933be3407" containerName="heat-db-sync" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.828664 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="56685fe5-5182-46f0-84f8-c8a40d42a3d2" containerName="dnsmasq-dns" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.829726 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6b4775c997-mc4n9" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.864450 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6b4775c997-mc4n9"] Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.973072 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-f55bdff67-69cjm"] Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.975626 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.978600 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea073f8d-1032-4d22-b195-61be13e3e832-config-data-custom\") pod \"heat-engine-6b4775c997-mc4n9\" (UID: \"ea073f8d-1032-4d22-b195-61be13e3e832\") " pod="openstack/heat-engine-6b4775c997-mc4n9" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.978663 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5mqt\" (UniqueName: \"kubernetes.io/projected/ea073f8d-1032-4d22-b195-61be13e3e832-kube-api-access-c5mqt\") pod \"heat-engine-6b4775c997-mc4n9\" (UID: \"ea073f8d-1032-4d22-b195-61be13e3e832\") " pod="openstack/heat-engine-6b4775c997-mc4n9" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.978698 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea073f8d-1032-4d22-b195-61be13e3e832-config-data\") pod \"heat-engine-6b4775c997-mc4n9\" (UID: \"ea073f8d-1032-4d22-b195-61be13e3e832\") " pod="openstack/heat-engine-6b4775c997-mc4n9" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.978941 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea073f8d-1032-4d22-b195-61be13e3e832-combined-ca-bundle\") pod \"heat-engine-6b4775c997-mc4n9\" (UID: \"ea073f8d-1032-4d22-b195-61be13e3e832\") " pod="openstack/heat-engine-6b4775c997-mc4n9" Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.980893 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-f55bdff67-69cjm"] Mar 20 13:51:38 crc kubenswrapper[4973]: I0320 13:51:38.993828 4973 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/heat-cfnapi-7d8cc574d6-dm54c"] Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.013940 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.082498 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b35b5167-8059-4cb9-a93b-c8dc96bb23f7-public-tls-certs\") pod \"heat-api-f55bdff67-69cjm\" (UID: \"b35b5167-8059-4cb9-a93b-c8dc96bb23f7\") " pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.082628 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b35b5167-8059-4cb9-a93b-c8dc96bb23f7-config-data-custom\") pod \"heat-api-f55bdff67-69cjm\" (UID: \"b35b5167-8059-4cb9-a93b-c8dc96bb23f7\") " pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.082944 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b35b5167-8059-4cb9-a93b-c8dc96bb23f7-internal-tls-certs\") pod \"heat-api-f55bdff67-69cjm\" (UID: \"b35b5167-8059-4cb9-a93b-c8dc96bb23f7\") " pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.082998 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b35b5167-8059-4cb9-a93b-c8dc96bb23f7-combined-ca-bundle\") pod \"heat-api-f55bdff67-69cjm\" (UID: \"b35b5167-8059-4cb9-a93b-c8dc96bb23f7\") " pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.083105 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea073f8d-1032-4d22-b195-61be13e3e832-config-data-custom\") pod \"heat-engine-6b4775c997-mc4n9\" (UID: \"ea073f8d-1032-4d22-b195-61be13e3e832\") " pod="openstack/heat-engine-6b4775c997-mc4n9" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.083160 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5mqt\" (UniqueName: \"kubernetes.io/projected/ea073f8d-1032-4d22-b195-61be13e3e832-kube-api-access-c5mqt\") pod \"heat-engine-6b4775c997-mc4n9\" (UID: \"ea073f8d-1032-4d22-b195-61be13e3e832\") " pod="openstack/heat-engine-6b4775c997-mc4n9" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.083195 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea073f8d-1032-4d22-b195-61be13e3e832-config-data\") pod \"heat-engine-6b4775c997-mc4n9\" (UID: \"ea073f8d-1032-4d22-b195-61be13e3e832\") " pod="openstack/heat-engine-6b4775c997-mc4n9" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.083246 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b35b5167-8059-4cb9-a93b-c8dc96bb23f7-config-data\") pod \"heat-api-f55bdff67-69cjm\" (UID: \"b35b5167-8059-4cb9-a93b-c8dc96bb23f7\") " pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.083329 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwl2n\" (UniqueName: \"kubernetes.io/projected/b35b5167-8059-4cb9-a93b-c8dc96bb23f7-kube-api-access-vwl2n\") pod \"heat-api-f55bdff67-69cjm\" (UID: \"b35b5167-8059-4cb9-a93b-c8dc96bb23f7\") " pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.083728 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea073f8d-1032-4d22-b195-61be13e3e832-combined-ca-bundle\") pod \"heat-engine-6b4775c997-mc4n9\" (UID: \"ea073f8d-1032-4d22-b195-61be13e3e832\") " pod="openstack/heat-engine-6b4775c997-mc4n9" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.091356 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea073f8d-1032-4d22-b195-61be13e3e832-combined-ca-bundle\") pod \"heat-engine-6b4775c997-mc4n9\" (UID: \"ea073f8d-1032-4d22-b195-61be13e3e832\") " pod="openstack/heat-engine-6b4775c997-mc4n9" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.094494 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ea073f8d-1032-4d22-b195-61be13e3e832-config-data\") pod \"heat-engine-6b4775c997-mc4n9\" (UID: \"ea073f8d-1032-4d22-b195-61be13e3e832\") " pod="openstack/heat-engine-6b4775c997-mc4n9" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.096031 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7d8cc574d6-dm54c"] Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.102195 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/ea073f8d-1032-4d22-b195-61be13e3e832-config-data-custom\") pod \"heat-engine-6b4775c997-mc4n9\" (UID: \"ea073f8d-1032-4d22-b195-61be13e3e832\") " pod="openstack/heat-engine-6b4775c997-mc4n9" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.114574 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5mqt\" (UniqueName: \"kubernetes.io/projected/ea073f8d-1032-4d22-b195-61be13e3e832-kube-api-access-c5mqt\") pod \"heat-engine-6b4775c997-mc4n9\" (UID: \"ea073f8d-1032-4d22-b195-61be13e3e832\") " pod="openstack/heat-engine-6b4775c997-mc4n9" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 
13:51:39.154120 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-6b4775c997-mc4n9" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.194148 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b35b5167-8059-4cb9-a93b-c8dc96bb23f7-public-tls-certs\") pod \"heat-api-f55bdff67-69cjm\" (UID: \"b35b5167-8059-4cb9-a93b-c8dc96bb23f7\") " pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.194238 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/adfece77-bf45-4bb7-8f7f-c57be5a4edfc-public-tls-certs\") pod \"heat-cfnapi-7d8cc574d6-dm54c\" (UID: \"adfece77-bf45-4bb7-8f7f-c57be5a4edfc\") " pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.194302 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b35b5167-8059-4cb9-a93b-c8dc96bb23f7-config-data-custom\") pod \"heat-api-f55bdff67-69cjm\" (UID: \"b35b5167-8059-4cb9-a93b-c8dc96bb23f7\") " pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.194444 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/adfece77-bf45-4bb7-8f7f-c57be5a4edfc-internal-tls-certs\") pod \"heat-cfnapi-7d8cc574d6-dm54c\" (UID: \"adfece77-bf45-4bb7-8f7f-c57be5a4edfc\") " pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.194474 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b35b5167-8059-4cb9-a93b-c8dc96bb23f7-internal-tls-certs\") pod 
\"heat-api-f55bdff67-69cjm\" (UID: \"b35b5167-8059-4cb9-a93b-c8dc96bb23f7\") " pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.194510 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b35b5167-8059-4cb9-a93b-c8dc96bb23f7-combined-ca-bundle\") pod \"heat-api-f55bdff67-69cjm\" (UID: \"b35b5167-8059-4cb9-a93b-c8dc96bb23f7\") " pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.194611 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b35b5167-8059-4cb9-a93b-c8dc96bb23f7-config-data\") pod \"heat-api-f55bdff67-69cjm\" (UID: \"b35b5167-8059-4cb9-a93b-c8dc96bb23f7\") " pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.194642 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adfece77-bf45-4bb7-8f7f-c57be5a4edfc-config-data-custom\") pod \"heat-cfnapi-7d8cc574d6-dm54c\" (UID: \"adfece77-bf45-4bb7-8f7f-c57be5a4edfc\") " pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.194671 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwl2n\" (UniqueName: \"kubernetes.io/projected/b35b5167-8059-4cb9-a93b-c8dc96bb23f7-kube-api-access-vwl2n\") pod \"heat-api-f55bdff67-69cjm\" (UID: \"b35b5167-8059-4cb9-a93b-c8dc96bb23f7\") " pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.194839 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scbs6\" (UniqueName: \"kubernetes.io/projected/adfece77-bf45-4bb7-8f7f-c57be5a4edfc-kube-api-access-scbs6\") pod 
\"heat-cfnapi-7d8cc574d6-dm54c\" (UID: \"adfece77-bf45-4bb7-8f7f-c57be5a4edfc\") " pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.195017 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adfece77-bf45-4bb7-8f7f-c57be5a4edfc-config-data\") pod \"heat-cfnapi-7d8cc574d6-dm54c\" (UID: \"adfece77-bf45-4bb7-8f7f-c57be5a4edfc\") " pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.195042 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfece77-bf45-4bb7-8f7f-c57be5a4edfc-combined-ca-bundle\") pod \"heat-cfnapi-7d8cc574d6-dm54c\" (UID: \"adfece77-bf45-4bb7-8f7f-c57be5a4edfc\") " pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.207116 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b35b5167-8059-4cb9-a93b-c8dc96bb23f7-config-data\") pod \"heat-api-f55bdff67-69cjm\" (UID: \"b35b5167-8059-4cb9-a93b-c8dc96bb23f7\") " pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.218314 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b35b5167-8059-4cb9-a93b-c8dc96bb23f7-public-tls-certs\") pod \"heat-api-f55bdff67-69cjm\" (UID: \"b35b5167-8059-4cb9-a93b-c8dc96bb23f7\") " pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.219691 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b35b5167-8059-4cb9-a93b-c8dc96bb23f7-config-data-custom\") pod \"heat-api-f55bdff67-69cjm\" (UID: 
\"b35b5167-8059-4cb9-a93b-c8dc96bb23f7\") " pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.220092 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b35b5167-8059-4cb9-a93b-c8dc96bb23f7-internal-tls-certs\") pod \"heat-api-f55bdff67-69cjm\" (UID: \"b35b5167-8059-4cb9-a93b-c8dc96bb23f7\") " pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.225818 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwl2n\" (UniqueName: \"kubernetes.io/projected/b35b5167-8059-4cb9-a93b-c8dc96bb23f7-kube-api-access-vwl2n\") pod \"heat-api-f55bdff67-69cjm\" (UID: \"b35b5167-8059-4cb9-a93b-c8dc96bb23f7\") " pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.226441 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b35b5167-8059-4cb9-a93b-c8dc96bb23f7-combined-ca-bundle\") pod \"heat-api-f55bdff67-69cjm\" (UID: \"b35b5167-8059-4cb9-a93b-c8dc96bb23f7\") " pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.299130 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/adfece77-bf45-4bb7-8f7f-c57be5a4edfc-internal-tls-certs\") pod \"heat-cfnapi-7d8cc574d6-dm54c\" (UID: \"adfece77-bf45-4bb7-8f7f-c57be5a4edfc\") " pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.299469 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adfece77-bf45-4bb7-8f7f-c57be5a4edfc-config-data-custom\") pod \"heat-cfnapi-7d8cc574d6-dm54c\" (UID: \"adfece77-bf45-4bb7-8f7f-c57be5a4edfc\") " 
pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.299556 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scbs6\" (UniqueName: \"kubernetes.io/projected/adfece77-bf45-4bb7-8f7f-c57be5a4edfc-kube-api-access-scbs6\") pod \"heat-cfnapi-7d8cc574d6-dm54c\" (UID: \"adfece77-bf45-4bb7-8f7f-c57be5a4edfc\") " pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.299622 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfece77-bf45-4bb7-8f7f-c57be5a4edfc-combined-ca-bundle\") pod \"heat-cfnapi-7d8cc574d6-dm54c\" (UID: \"adfece77-bf45-4bb7-8f7f-c57be5a4edfc\") " pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.299639 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adfece77-bf45-4bb7-8f7f-c57be5a4edfc-config-data\") pod \"heat-cfnapi-7d8cc574d6-dm54c\" (UID: \"adfece77-bf45-4bb7-8f7f-c57be5a4edfc\") " pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.299667 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/adfece77-bf45-4bb7-8f7f-c57be5a4edfc-public-tls-certs\") pod \"heat-cfnapi-7d8cc574d6-dm54c\" (UID: \"adfece77-bf45-4bb7-8f7f-c57be5a4edfc\") " pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.317231 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/adfece77-bf45-4bb7-8f7f-c57be5a4edfc-internal-tls-certs\") pod \"heat-cfnapi-7d8cc574d6-dm54c\" (UID: \"adfece77-bf45-4bb7-8f7f-c57be5a4edfc\") " pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" 
Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.320071 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.323916 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/adfece77-bf45-4bb7-8f7f-c57be5a4edfc-public-tls-certs\") pod \"heat-cfnapi-7d8cc574d6-dm54c\" (UID: \"adfece77-bf45-4bb7-8f7f-c57be5a4edfc\") " pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.327190 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adfece77-bf45-4bb7-8f7f-c57be5a4edfc-combined-ca-bundle\") pod \"heat-cfnapi-7d8cc574d6-dm54c\" (UID: \"adfece77-bf45-4bb7-8f7f-c57be5a4edfc\") " pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.328382 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/adfece77-bf45-4bb7-8f7f-c57be5a4edfc-config-data-custom\") pod \"heat-cfnapi-7d8cc574d6-dm54c\" (UID: \"adfece77-bf45-4bb7-8f7f-c57be5a4edfc\") " pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.333700 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/adfece77-bf45-4bb7-8f7f-c57be5a4edfc-config-data\") pod \"heat-cfnapi-7d8cc574d6-dm54c\" (UID: \"adfece77-bf45-4bb7-8f7f-c57be5a4edfc\") " pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.337461 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scbs6\" (UniqueName: \"kubernetes.io/projected/adfece77-bf45-4bb7-8f7f-c57be5a4edfc-kube-api-access-scbs6\") pod 
\"heat-cfnapi-7d8cc574d6-dm54c\" (UID: \"adfece77-bf45-4bb7-8f7f-c57be5a4edfc\") " pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.352011 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.431287 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cde80e1-f72d-4080-a86b-5968a8904333","Type":"ContainerStarted","Data":"81a2a757344e9954604ab80014ae8c2dab5114605924ae7840572782b3e1f504"} Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.475087 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.069407773 podStartE2EDuration="44.475068075s" podCreationTimestamp="2026-03-20 13:50:55 +0000 UTC" firstStartedPulling="2026-03-20 13:50:56.793449713 +0000 UTC m=+1777.537119457" lastFinishedPulling="2026-03-20 13:51:38.199110015 +0000 UTC m=+1818.942779759" observedRunningTime="2026-03-20 13:51:39.473700248 +0000 UTC m=+1820.217369992" watchObservedRunningTime="2026-03-20 13:51:39.475068075 +0000 UTC m=+1820.218737819" Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.852442 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-6b4775c997-mc4n9"] Mar 20 13:51:39 crc kubenswrapper[4973]: I0320 13:51:39.980921 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98db46c5-a461-4c56-a0ce-666906912d6a" path="/var/lib/kubelet/pods/98db46c5-a461-4c56-a0ce-666906912d6a/volumes" Mar 20 13:51:40 crc kubenswrapper[4973]: W0320 13:51:40.176087 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb35b5167_8059_4cb9_a93b_c8dc96bb23f7.slice/crio-182b549379582dd6cb8f92803ebccdd7c84b833b565eb3037b17b10474aeb3e4 WatchSource:0}: Error finding container 
182b549379582dd6cb8f92803ebccdd7c84b833b565eb3037b17b10474aeb3e4: Status 404 returned error can't find the container with id 182b549379582dd6cb8f92803ebccdd7c84b833b565eb3037b17b10474aeb3e4 Mar 20 13:51:40 crc kubenswrapper[4973]: I0320 13:51:40.176737 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-f55bdff67-69cjm"] Mar 20 13:51:40 crc kubenswrapper[4973]: I0320 13:51:40.331802 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-7d8cc574d6-dm54c"] Mar 20 13:51:40 crc kubenswrapper[4973]: I0320 13:51:40.448717 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" event={"ID":"adfece77-bf45-4bb7-8f7f-c57be5a4edfc","Type":"ContainerStarted","Data":"2923a6c0c4366eac09190159e4cb9281554e90a5f69ec385d2cfd7feed1187d5"} Mar 20 13:51:40 crc kubenswrapper[4973]: I0320 13:51:40.465989 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6b4775c997-mc4n9" event={"ID":"ea073f8d-1032-4d22-b195-61be13e3e832","Type":"ContainerStarted","Data":"22f47e6b4e2e40b88f6b2359d45339aff0210e89a59d01791102e417956430f9"} Mar 20 13:51:40 crc kubenswrapper[4973]: I0320 13:51:40.466032 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-6b4775c997-mc4n9" event={"ID":"ea073f8d-1032-4d22-b195-61be13e3e832","Type":"ContainerStarted","Data":"724a145b3cbd60153a930592005423773571a39e0b82057b4580ada89d44ef7a"} Mar 20 13:51:40 crc kubenswrapper[4973]: I0320 13:51:40.466325 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-6b4775c997-mc4n9" Mar 20 13:51:40 crc kubenswrapper[4973]: I0320 13:51:40.469229 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-f55bdff67-69cjm" event={"ID":"b35b5167-8059-4cb9-a93b-c8dc96bb23f7","Type":"ContainerStarted","Data":"182b549379582dd6cb8f92803ebccdd7c84b833b565eb3037b17b10474aeb3e4"} Mar 20 13:51:40 crc kubenswrapper[4973]: I0320 
13:51:40.499584 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-6b4775c997-mc4n9" podStartSLOduration=2.499565006 podStartE2EDuration="2.499565006s" podCreationTimestamp="2026-03-20 13:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:51:40.484491275 +0000 UTC m=+1821.228161039" watchObservedRunningTime="2026-03-20 13:51:40.499565006 +0000 UTC m=+1821.243234750" Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.115250 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58"] Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.117130 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.120881 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.121193 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d4wjc" Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.122430 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.129076 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.138510 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58"] Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.263094 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7673cb4f-1440-480a-8d50-2640987b8a0f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58\" (UID: \"7673cb4f-1440-480a-8d50-2640987b8a0f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.263518 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7673cb4f-1440-480a-8d50-2640987b8a0f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58\" (UID: \"7673cb4f-1440-480a-8d50-2640987b8a0f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.263643 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcqnr\" (UniqueName: \"kubernetes.io/projected/7673cb4f-1440-480a-8d50-2640987b8a0f-kube-api-access-wcqnr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58\" (UID: \"7673cb4f-1440-480a-8d50-2640987b8a0f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.263661 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7673cb4f-1440-480a-8d50-2640987b8a0f-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58\" (UID: \"7673cb4f-1440-480a-8d50-2640987b8a0f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.365398 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7673cb4f-1440-480a-8d50-2640987b8a0f-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58\" (UID: \"7673cb4f-1440-480a-8d50-2640987b8a0f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.365507 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7673cb4f-1440-480a-8d50-2640987b8a0f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58\" (UID: \"7673cb4f-1440-480a-8d50-2640987b8a0f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.365626 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcqnr\" (UniqueName: \"kubernetes.io/projected/7673cb4f-1440-480a-8d50-2640987b8a0f-kube-api-access-wcqnr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58\" (UID: \"7673cb4f-1440-480a-8d50-2640987b8a0f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.365647 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7673cb4f-1440-480a-8d50-2640987b8a0f-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58\" (UID: \"7673cb4f-1440-480a-8d50-2640987b8a0f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.377997 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7673cb4f-1440-480a-8d50-2640987b8a0f-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58\" (UID: \"7673cb4f-1440-480a-8d50-2640987b8a0f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.380384 4973 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7673cb4f-1440-480a-8d50-2640987b8a0f-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58\" (UID: \"7673cb4f-1440-480a-8d50-2640987b8a0f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.391119 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7673cb4f-1440-480a-8d50-2640987b8a0f-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58\" (UID: \"7673cb4f-1440-480a-8d50-2640987b8a0f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.392574 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcqnr\" (UniqueName: \"kubernetes.io/projected/7673cb4f-1440-480a-8d50-2640987b8a0f-kube-api-access-wcqnr\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58\" (UID: \"7673cb4f-1440-480a-8d50-2640987b8a0f\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" Mar 20 13:51:42 crc kubenswrapper[4973]: I0320 13:51:42.468985 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" Mar 20 13:51:43 crc kubenswrapper[4973]: I0320 13:51:43.521191 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-f55bdff67-69cjm" event={"ID":"b35b5167-8059-4cb9-a93b-c8dc96bb23f7","Type":"ContainerStarted","Data":"42582250744f9d1f52e8d9291f1a72f05230c0541c06080d4424290b1ef13538"} Mar 20 13:51:43 crc kubenswrapper[4973]: I0320 13:51:43.522819 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:43 crc kubenswrapper[4973]: I0320 13:51:43.527001 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" event={"ID":"adfece77-bf45-4bb7-8f7f-c57be5a4edfc","Type":"ContainerStarted","Data":"fd77fce47100b4aa39fe0b4b1e2c02adaca48b504316eb2a798aeed470758614"} Mar 20 13:51:43 crc kubenswrapper[4973]: I0320 13:51:43.527712 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:43 crc kubenswrapper[4973]: I0320 13:51:43.561769 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-f55bdff67-69cjm" podStartSLOduration=2.8845171609999998 podStartE2EDuration="5.561749553s" podCreationTimestamp="2026-03-20 13:51:38 +0000 UTC" firstStartedPulling="2026-03-20 13:51:40.1939677 +0000 UTC m=+1820.937637444" lastFinishedPulling="2026-03-20 13:51:42.871200092 +0000 UTC m=+1823.614869836" observedRunningTime="2026-03-20 13:51:43.548410969 +0000 UTC m=+1824.292080733" watchObservedRunningTime="2026-03-20 13:51:43.561749553 +0000 UTC m=+1824.305419297" Mar 20 13:51:44 crc kubenswrapper[4973]: I0320 13:51:44.001484 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" podStartSLOduration=3.464182375 podStartE2EDuration="6.001461904s" podCreationTimestamp="2026-03-20 13:51:38 +0000 UTC" 
firstStartedPulling="2026-03-20 13:51:40.332649748 +0000 UTC m=+1821.076319492" lastFinishedPulling="2026-03-20 13:51:42.869929277 +0000 UTC m=+1823.613599021" observedRunningTime="2026-03-20 13:51:43.573509985 +0000 UTC m=+1824.317179739" watchObservedRunningTime="2026-03-20 13:51:44.001461904 +0000 UTC m=+1824.745131648" Mar 20 13:51:44 crc kubenswrapper[4973]: I0320 13:51:44.012397 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58"] Mar 20 13:51:44 crc kubenswrapper[4973]: W0320 13:51:44.020440 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7673cb4f_1440_480a_8d50_2640987b8a0f.slice/crio-5b619b5d5639bc86143747b410cad5e1bfd490677e197cfa3bc36dfdd7eaadae WatchSource:0}: Error finding container 5b619b5d5639bc86143747b410cad5e1bfd490677e197cfa3bc36dfdd7eaadae: Status 404 returned error can't find the container with id 5b619b5d5639bc86143747b410cad5e1bfd490677e197cfa3bc36dfdd7eaadae Mar 20 13:51:44 crc kubenswrapper[4973]: I0320 13:51:44.553922 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" event={"ID":"7673cb4f-1440-480a-8d50-2640987b8a0f","Type":"ContainerStarted","Data":"5b619b5d5639bc86143747b410cad5e1bfd490677e197cfa3bc36dfdd7eaadae"} Mar 20 13:51:46 crc kubenswrapper[4973]: I0320 13:51:46.951154 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:51:46 crc kubenswrapper[4973]: E0320 13:51:46.951691 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 13:51:50 crc kubenswrapper[4973]: I0320 13:51:50.356201 4973 scope.go:117] "RemoveContainer" containerID="50aaf9928844e9feee6b2c0d1b4bd40f41ffcc0751174b6b279ca3eadddbbde6" Mar 20 13:51:50 crc kubenswrapper[4973]: I0320 13:51:50.902774 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-f55bdff67-69cjm" Mar 20 13:51:50 crc kubenswrapper[4973]: I0320 13:51:50.981859 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6558cfd5cd-bptc4"] Mar 20 13:51:50 crc kubenswrapper[4973]: I0320 13:51:50.982114 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-api-6558cfd5cd-bptc4" podUID="3fd39b11-95a1-493b-afd5-6469bd8ee321" containerName="heat-api" containerID="cri-o://5fd3a24fa902cdc7db2d0123e6cc1a28435afafe8ff66817e12cfb41d15950f8" gracePeriod=60 Mar 20 13:51:51 crc kubenswrapper[4973]: I0320 13:51:51.474257 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-7d8cc574d6-dm54c" Mar 20 13:51:51 crc kubenswrapper[4973]: I0320 13:51:51.546319 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-b78ff9797-ms9rv"] Mar 20 13:51:51 crc kubenswrapper[4973]: I0320 13:51:51.546792 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-cfnapi-b78ff9797-ms9rv" podUID="fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c" containerName="heat-cfnapi" containerID="cri-o://d3995aa4396af1f1e32565ada9c9f146370fb46167efc8a4bdbda0a96f007ede" gracePeriod=60 Mar 20 13:51:52 crc kubenswrapper[4973]: I0320 13:51:52.658171 4973 generic.go:334] "Generic (PLEG): container finished" podID="89a22e25-955b-4786-baf2-7138a668f512" containerID="6b8230c0351f652601a4eca4fb74594d0c76891c01f5777543668b86a1d543e0" exitCode=0 Mar 20 13:51:52 crc kubenswrapper[4973]: I0320 
13:51:52.658709 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"89a22e25-955b-4786-baf2-7138a668f512","Type":"ContainerDied","Data":"6b8230c0351f652601a4eca4fb74594d0c76891c01f5777543668b86a1d543e0"} Mar 20 13:51:52 crc kubenswrapper[4973]: I0320 13:51:52.663268 4973 generic.go:334] "Generic (PLEG): container finished" podID="f8b4580d-53af-4c18-9f2c-b883b8621113" containerID="2cbee18dc61aa85750195a2e7564ae5ffeaa5102a8fbe567336b64dd93aea919" exitCode=0 Mar 20 13:51:52 crc kubenswrapper[4973]: I0320 13:51:52.663313 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"f8b4580d-53af-4c18-9f2c-b883b8621113","Type":"ContainerDied","Data":"2cbee18dc61aa85750195a2e7564ae5ffeaa5102a8fbe567336b64dd93aea919"} Mar 20 13:51:54 crc kubenswrapper[4973]: I0320 13:51:54.203577 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-api-6558cfd5cd-bptc4" podUID="3fd39b11-95a1-493b-afd5-6469bd8ee321" containerName="heat-api" probeResult="failure" output="Get \"https://10.217.0.236:8004/healthcheck\": read tcp 10.217.0.2:46044->10.217.0.236:8004: read: connection reset by peer" Mar 20 13:51:54 crc kubenswrapper[4973]: I0320 13:51:54.693024 4973 generic.go:334] "Generic (PLEG): container finished" podID="3fd39b11-95a1-493b-afd5-6469bd8ee321" containerID="5fd3a24fa902cdc7db2d0123e6cc1a28435afafe8ff66817e12cfb41d15950f8" exitCode=0 Mar 20 13:51:54 crc kubenswrapper[4973]: I0320 13:51:54.693086 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6558cfd5cd-bptc4" event={"ID":"3fd39b11-95a1-493b-afd5-6469bd8ee321","Type":"ContainerDied","Data":"5fd3a24fa902cdc7db2d0123e6cc1a28435afafe8ff66817e12cfb41d15950f8"} Mar 20 13:51:54 crc kubenswrapper[4973]: I0320 13:51:54.741422 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/heat-cfnapi-b78ff9797-ms9rv" podUID="fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c" 
containerName="heat-cfnapi" probeResult="failure" output="Get \"https://10.217.0.235:8000/healthcheck\": read tcp 10.217.0.2:37020->10.217.0.235:8000: read: connection reset by peer" Mar 20 13:51:55 crc kubenswrapper[4973]: I0320 13:51:55.707873 4973 generic.go:334] "Generic (PLEG): container finished" podID="fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c" containerID="d3995aa4396af1f1e32565ada9c9f146370fb46167efc8a4bdbda0a96f007ede" exitCode=0 Mar 20 13:51:55 crc kubenswrapper[4973]: I0320 13:51:55.707936 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-b78ff9797-ms9rv" event={"ID":"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c","Type":"ContainerDied","Data":"d3995aa4396af1f1e32565ada9c9f146370fb46167efc8a4bdbda0a96f007ede"} Mar 20 13:51:56 crc kubenswrapper[4973]: I0320 13:51:56.259978 4973 scope.go:117] "RemoveContainer" containerID="1978ad4d84dbc6a8919d8ebe3feeb799305389dff7ee374866a8cb8995edfb71" Mar 20 13:51:56 crc kubenswrapper[4973]: I0320 13:51:56.506526 4973 scope.go:117] "RemoveContainer" containerID="17c6198576f37401cc560eb9a955f8ab065ebd950c3c257644e6a555cefe1935" Mar 20 13:51:56 crc kubenswrapper[4973]: I0320 13:51:56.681317 4973 scope.go:117] "RemoveContainer" containerID="934abc44721f6dc5fce46e9def2e043b28599651fd64efb74642f7c4bf719914" Mar 20 13:51:56 crc kubenswrapper[4973]: I0320 13:51:56.744867 4973 scope.go:117] "RemoveContainer" containerID="239a16c7abc7e3d0d3b8d364c939b685e93e4e5d7fc693b726db6e98d85f1b44" Mar 20 13:51:56 crc kubenswrapper[4973]: I0320 13:51:56.823710 4973 scope.go:117] "RemoveContainer" containerID="d7d30c2f5c6f9bbc0871452af8c9840f478efc3187eedcef4d9320834860fb19" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.034544 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.167084 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.223617 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-config-data-custom\") pod \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.223717 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-internal-tls-certs\") pod \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.223755 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-combined-ca-bundle\") pod \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.223896 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9s6k\" (UniqueName: \"kubernetes.io/projected/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-kube-api-access-j9s6k\") pod \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.223929 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-config-data\") pod \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.223983 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-public-tls-certs\") pod \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\" (UID: \"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c\") " Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.236004 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c" (UID: "fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.248785 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-kube-api-access-j9s6k" (OuterVolumeSpecName: "kube-api-access-j9s6k") pod "fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c" (UID: "fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c"). InnerVolumeSpecName "kube-api-access-j9s6k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.299929 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c" (UID: "fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.302894 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-config-data" (OuterVolumeSpecName: "config-data") pod "fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c" (UID: "fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.317249 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c" (UID: "fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.326287 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-config-data\") pod \"3fd39b11-95a1-493b-afd5-6469bd8ee321\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.326629 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-public-tls-certs\") pod \"3fd39b11-95a1-493b-afd5-6469bd8ee321\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.326726 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-config-data-custom\") pod \"3fd39b11-95a1-493b-afd5-6469bd8ee321\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.326882 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkslb\" (UniqueName: \"kubernetes.io/projected/3fd39b11-95a1-493b-afd5-6469bd8ee321-kube-api-access-qkslb\") pod \"3fd39b11-95a1-493b-afd5-6469bd8ee321\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.326942 4973 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-internal-tls-certs\") pod \"3fd39b11-95a1-493b-afd5-6469bd8ee321\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.327000 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-combined-ca-bundle\") pod \"3fd39b11-95a1-493b-afd5-6469bd8ee321\" (UID: \"3fd39b11-95a1-493b-afd5-6469bd8ee321\") " Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.327656 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.327672 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9s6k\" (UniqueName: \"kubernetes.io/projected/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-kube-api-access-j9s6k\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.327683 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.327692 4973 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.327700 4973 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-config-data-custom\") on node \"crc\" DevicePath 
\"\"" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.331266 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd39b11-95a1-493b-afd5-6469bd8ee321-kube-api-access-qkslb" (OuterVolumeSpecName: "kube-api-access-qkslb") pod "3fd39b11-95a1-493b-afd5-6469bd8ee321" (UID: "3fd39b11-95a1-493b-afd5-6469bd8ee321"). InnerVolumeSpecName "kube-api-access-qkslb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.334546 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3fd39b11-95a1-493b-afd5-6469bd8ee321" (UID: "3fd39b11-95a1-493b-afd5-6469bd8ee321"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.338915 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c" (UID: "fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.380059 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3fd39b11-95a1-493b-afd5-6469bd8ee321" (UID: "3fd39b11-95a1-493b-afd5-6469bd8ee321"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.406058 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-config-data" (OuterVolumeSpecName: "config-data") pod "3fd39b11-95a1-493b-afd5-6469bd8ee321" (UID: "3fd39b11-95a1-493b-afd5-6469bd8ee321"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.411439 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "3fd39b11-95a1-493b-afd5-6469bd8ee321" (UID: "3fd39b11-95a1-493b-afd5-6469bd8ee321"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.430188 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.430568 4973 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.430659 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkslb\" (UniqueName: \"kubernetes.io/projected/3fd39b11-95a1-493b-afd5-6469bd8ee321-kube-api-access-qkslb\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.430742 4973 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-internal-tls-certs\") on node \"crc\" 
DevicePath \"\"" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.430810 4973 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.430874 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.436636 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "3fd39b11-95a1-493b-afd5-6469bd8ee321" (UID: "3fd39b11-95a1-493b-afd5-6469bd8ee321"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.533079 4973 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3fd39b11-95a1-493b-afd5-6469bd8ee321-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.748400 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-b78ff9797-ms9rv" event={"ID":"fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c","Type":"ContainerDied","Data":"cebe68b56c8bca1a398cdad49ccb297f8e772fa7a220528d21272387b928173d"} Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.748434 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-b78ff9797-ms9rv" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.749439 4973 scope.go:117] "RemoveContainer" containerID="d3995aa4396af1f1e32565ada9c9f146370fb46167efc8a4bdbda0a96f007ede" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.759420 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"89a22e25-955b-4786-baf2-7138a668f512","Type":"ContainerStarted","Data":"ea4b417f88b18de28c614f0391d79172e74401a41982b136d6016f05e8aeabc0"} Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.760813 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.766660 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-6558cfd5cd-bptc4" event={"ID":"3fd39b11-95a1-493b-afd5-6469bd8ee321","Type":"ContainerDied","Data":"8acf94ebd28196f244271c82a490d19fbf92f2d30b9bf5fd1b0186b18421d224"} Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.767639 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-6558cfd5cd-bptc4" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.770730 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-2" event={"ID":"f8b4580d-53af-4c18-9f2c-b883b8621113","Type":"ContainerStarted","Data":"dd3d0750417096b1cda054eb23c85a3317a77da4dae38ba5105edc508b1679ea"} Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.771384 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-2" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.773597 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" event={"ID":"7673cb4f-1440-480a-8d50-2640987b8a0f","Type":"ContainerStarted","Data":"24cd30981df0087f3c87ffb550e02326d6e1d32bef0430d0047a4eb94f132454"} Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.786595 4973 scope.go:117] "RemoveContainer" containerID="5fd3a24fa902cdc7db2d0123e6cc1a28435afafe8ff66817e12cfb41d15950f8" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.790976 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.790956302 podStartE2EDuration="40.790956302s" podCreationTimestamp="2026-03-20 13:51:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:51:57.784056803 +0000 UTC m=+1838.527726567" watchObservedRunningTime="2026-03-20 13:51:57.790956302 +0000 UTC m=+1838.534626046" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.843098 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-cfnapi-b78ff9797-ms9rv"] Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.865614 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-cfnapi-b78ff9797-ms9rv"] Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 
13:51:57.882882 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-2" podStartSLOduration=41.882855372 podStartE2EDuration="41.882855372s" podCreationTimestamp="2026-03-20 13:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:51:57.831330795 +0000 UTC m=+1838.575000539" watchObservedRunningTime="2026-03-20 13:51:57.882855372 +0000 UTC m=+1838.626525136" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.886738 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" podStartSLOduration=4.402757586 podStartE2EDuration="16.886713998s" podCreationTimestamp="2026-03-20 13:51:41 +0000 UTC" firstStartedPulling="2026-03-20 13:51:44.022549459 +0000 UTC m=+1824.766219203" lastFinishedPulling="2026-03-20 13:51:56.506505881 +0000 UTC m=+1837.250175615" observedRunningTime="2026-03-20 13:51:57.860472011 +0000 UTC m=+1838.604141745" watchObservedRunningTime="2026-03-20 13:51:57.886713998 +0000 UTC m=+1838.630383742" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.915124 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-api-6558cfd5cd-bptc4"] Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.926836 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-api-6558cfd5cd-bptc4"] Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.951275 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:51:57 crc kubenswrapper[4973]: E0320 13:51:57.951624 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.963263 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fd39b11-95a1-493b-afd5-6469bd8ee321" path="/var/lib/kubelet/pods/3fd39b11-95a1-493b-afd5-6469bd8ee321/volumes" Mar 20 13:51:57 crc kubenswrapper[4973]: I0320 13:51:57.964555 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c" path="/var/lib/kubelet/pods/fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c/volumes" Mar 20 13:51:59 crc kubenswrapper[4973]: I0320 13:51:59.193640 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-6b4775c997-mc4n9" Mar 20 13:51:59 crc kubenswrapper[4973]: I0320 13:51:59.253680 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-76f767b74-8jwpc"] Mar 20 13:51:59 crc kubenswrapper[4973]: I0320 13:51:59.254058 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/heat-engine-76f767b74-8jwpc" podUID="2d53b7a0-4041-437e-8ec2-91013bed7135" containerName="heat-engine" containerID="cri-o://a9f9968fdfaa6970d67f09742b94586bb90e47569f37a3792ef583b10bca3ec9" gracePeriod=60 Mar 20 13:52:00 crc kubenswrapper[4973]: I0320 13:52:00.153405 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566912-gjjsx"] Mar 20 13:52:00 crc kubenswrapper[4973]: E0320 13:52:00.154266 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c" containerName="heat-cfnapi" Mar 20 13:52:00 crc kubenswrapper[4973]: I0320 13:52:00.154281 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c" containerName="heat-cfnapi" Mar 20 13:52:00 
crc kubenswrapper[4973]: E0320 13:52:00.154327 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd39b11-95a1-493b-afd5-6469bd8ee321" containerName="heat-api" Mar 20 13:52:00 crc kubenswrapper[4973]: I0320 13:52:00.154455 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd39b11-95a1-493b-afd5-6469bd8ee321" containerName="heat-api" Mar 20 13:52:00 crc kubenswrapper[4973]: I0320 13:52:00.154698 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa857eb0-a118-4f57-8a53-5cd4f5ddfa3c" containerName="heat-cfnapi" Mar 20 13:52:00 crc kubenswrapper[4973]: I0320 13:52:00.154724 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fd39b11-95a1-493b-afd5-6469bd8ee321" containerName="heat-api" Mar 20 13:52:00 crc kubenswrapper[4973]: I0320 13:52:00.155636 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-gjjsx" Mar 20 13:52:00 crc kubenswrapper[4973]: I0320 13:52:00.165946 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-gjjsx"] Mar 20 13:52:00 crc kubenswrapper[4973]: I0320 13:52:00.166636 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:52:00 crc kubenswrapper[4973]: I0320 13:52:00.166824 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 13:52:00 crc kubenswrapper[4973]: I0320 13:52:00.167033 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:52:00 crc kubenswrapper[4973]: I0320 13:52:00.199226 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t86l6\" (UniqueName: \"kubernetes.io/projected/f64f4919-8d92-441b-be1d-84a900cbe013-kube-api-access-t86l6\") pod \"auto-csr-approver-29566912-gjjsx\" (UID: 
\"f64f4919-8d92-441b-be1d-84a900cbe013\") " pod="openshift-infra/auto-csr-approver-29566912-gjjsx" Mar 20 13:52:00 crc kubenswrapper[4973]: I0320 13:52:00.302036 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t86l6\" (UniqueName: \"kubernetes.io/projected/f64f4919-8d92-441b-be1d-84a900cbe013-kube-api-access-t86l6\") pod \"auto-csr-approver-29566912-gjjsx\" (UID: \"f64f4919-8d92-441b-be1d-84a900cbe013\") " pod="openshift-infra/auto-csr-approver-29566912-gjjsx" Mar 20 13:52:00 crc kubenswrapper[4973]: I0320 13:52:00.347314 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t86l6\" (UniqueName: \"kubernetes.io/projected/f64f4919-8d92-441b-be1d-84a900cbe013-kube-api-access-t86l6\") pod \"auto-csr-approver-29566912-gjjsx\" (UID: \"f64f4919-8d92-441b-be1d-84a900cbe013\") " pod="openshift-infra/auto-csr-approver-29566912-gjjsx" Mar 20 13:52:00 crc kubenswrapper[4973]: I0320 13:52:00.496377 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-gjjsx" Mar 20 13:52:01 crc kubenswrapper[4973]: I0320 13:52:01.115677 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-gjjsx"] Mar 20 13:52:01 crc kubenswrapper[4973]: I0320 13:52:01.830688 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-gjjsx" event={"ID":"f64f4919-8d92-441b-be1d-84a900cbe013","Type":"ContainerStarted","Data":"1a5f05651f8f087c715cbb9818de1b7dc9869c3ab3efdbab292d7eb6ba5a8cb6"} Mar 20 13:52:02 crc kubenswrapper[4973]: I0320 13:52:02.842162 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-gjjsx" event={"ID":"f64f4919-8d92-441b-be1d-84a900cbe013","Type":"ContainerStarted","Data":"4eb493f0158eb6368e3711b22e4fdbccb247f467bce476b683654be02cff9320"} Mar 20 13:52:02 crc kubenswrapper[4973]: I0320 13:52:02.857803 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566912-gjjsx" podStartSLOduration=1.662272659 podStartE2EDuration="2.857784471s" podCreationTimestamp="2026-03-20 13:52:00 +0000 UTC" firstStartedPulling="2026-03-20 13:52:01.118294631 +0000 UTC m=+1841.861964375" lastFinishedPulling="2026-03-20 13:52:02.313806433 +0000 UTC m=+1843.057476187" observedRunningTime="2026-03-20 13:52:02.854794809 +0000 UTC m=+1843.598464563" watchObservedRunningTime="2026-03-20 13:52:02.857784471 +0000 UTC m=+1843.601454215" Mar 20 13:52:04 crc kubenswrapper[4973]: I0320 13:52:04.887119 4973 generic.go:334] "Generic (PLEG): container finished" podID="f64f4919-8d92-441b-be1d-84a900cbe013" containerID="4eb493f0158eb6368e3711b22e4fdbccb247f467bce476b683654be02cff9320" exitCode=0 Mar 20 13:52:04 crc kubenswrapper[4973]: I0320 13:52:04.887195 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-gjjsx" 
event={"ID":"f64f4919-8d92-441b-be1d-84a900cbe013","Type":"ContainerDied","Data":"4eb493f0158eb6368e3711b22e4fdbccb247f467bce476b683654be02cff9320"} Mar 20 13:52:06 crc kubenswrapper[4973]: I0320 13:52:06.475657 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-gjjsx" Mar 20 13:52:06 crc kubenswrapper[4973]: E0320 13:52:06.516057 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a9f9968fdfaa6970d67f09742b94586bb90e47569f37a3792ef583b10bca3ec9" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 20 13:52:06 crc kubenswrapper[4973]: E0320 13:52:06.520711 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a9f9968fdfaa6970d67f09742b94586bb90e47569f37a3792ef583b10bca3ec9" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 20 13:52:06 crc kubenswrapper[4973]: E0320 13:52:06.532734 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="a9f9968fdfaa6970d67f09742b94586bb90e47569f37a3792ef583b10bca3ec9" cmd=["/usr/bin/pgrep","-r","DRST","heat-engine"] Mar 20 13:52:06 crc kubenswrapper[4973]: E0320 13:52:06.532807 4973 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/heat-engine-76f767b74-8jwpc" podUID="2d53b7a0-4041-437e-8ec2-91013bed7135" containerName="heat-engine" Mar 20 13:52:06 crc kubenswrapper[4973]: I0320 13:52:06.581094 4973 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-t86l6\" (UniqueName: \"kubernetes.io/projected/f64f4919-8d92-441b-be1d-84a900cbe013-kube-api-access-t86l6\") pod \"f64f4919-8d92-441b-be1d-84a900cbe013\" (UID: \"f64f4919-8d92-441b-be1d-84a900cbe013\") " Mar 20 13:52:06 crc kubenswrapper[4973]: I0320 13:52:06.591679 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f64f4919-8d92-441b-be1d-84a900cbe013-kube-api-access-t86l6" (OuterVolumeSpecName: "kube-api-access-t86l6") pod "f64f4919-8d92-441b-be1d-84a900cbe013" (UID: "f64f4919-8d92-441b-be1d-84a900cbe013"). InnerVolumeSpecName "kube-api-access-t86l6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:06 crc kubenswrapper[4973]: I0320 13:52:06.685057 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t86l6\" (UniqueName: \"kubernetes.io/projected/f64f4919-8d92-441b-be1d-84a900cbe013-kube-api-access-t86l6\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:06 crc kubenswrapper[4973]: I0320 13:52:06.921825 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566912-gjjsx" event={"ID":"f64f4919-8d92-441b-be1d-84a900cbe013","Type":"ContainerDied","Data":"1a5f05651f8f087c715cbb9818de1b7dc9869c3ab3efdbab292d7eb6ba5a8cb6"} Mar 20 13:52:06 crc kubenswrapper[4973]: I0320 13:52:06.921868 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a5f05651f8f087c715cbb9818de1b7dc9869c3ab3efdbab292d7eb6ba5a8cb6" Mar 20 13:52:06 crc kubenswrapper[4973]: I0320 13:52:06.921934 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566912-gjjsx" Mar 20 13:52:06 crc kubenswrapper[4973]: I0320 13:52:06.986878 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-smfqr"] Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.003008 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566906-smfqr"] Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.484198 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-2" podUID="f8b4580d-53af-4c18-9f2c-b883b8621113" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.1.29:5671: connect: connection refused" Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.558391 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-8sqvk"] Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.570742 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-8sqvk"] Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.689849 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-jzhs8"] Mar 20 13:52:07 crc kubenswrapper[4973]: E0320 13:52:07.690394 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f64f4919-8d92-441b-be1d-84a900cbe013" containerName="oc" Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.690414 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f64f4919-8d92-441b-be1d-84a900cbe013" containerName="oc" Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.690681 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="f64f4919-8d92-441b-be1d-84a900cbe013" containerName="oc" Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.692145 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-jzhs8" Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.695697 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.713133 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-config-data\") pod \"aodh-db-sync-jzhs8\" (UID: \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\") " pod="openstack/aodh-db-sync-jzhs8" Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.713568 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-scripts\") pod \"aodh-db-sync-jzhs8\" (UID: \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\") " pod="openstack/aodh-db-sync-jzhs8" Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.713608 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrnmd\" (UniqueName: \"kubernetes.io/projected/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-kube-api-access-nrnmd\") pod \"aodh-db-sync-jzhs8\" (UID: \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\") " pod="openstack/aodh-db-sync-jzhs8" Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.713732 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-combined-ca-bundle\") pod \"aodh-db-sync-jzhs8\" (UID: \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\") " pod="openstack/aodh-db-sync-jzhs8" Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.719128 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-jzhs8"] Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.773569 4973 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.816374 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-config-data\") pod \"aodh-db-sync-jzhs8\" (UID: \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\") " pod="openstack/aodh-db-sync-jzhs8" Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.816488 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-scripts\") pod \"aodh-db-sync-jzhs8\" (UID: \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\") " pod="openstack/aodh-db-sync-jzhs8" Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.816532 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrnmd\" (UniqueName: \"kubernetes.io/projected/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-kube-api-access-nrnmd\") pod \"aodh-db-sync-jzhs8\" (UID: \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\") " pod="openstack/aodh-db-sync-jzhs8" Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.816594 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-combined-ca-bundle\") pod \"aodh-db-sync-jzhs8\" (UID: \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\") " pod="openstack/aodh-db-sync-jzhs8" Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.833215 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-config-data\") pod \"aodh-db-sync-jzhs8\" (UID: \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\") " pod="openstack/aodh-db-sync-jzhs8" Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.833609 4973 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-scripts\") pod \"aodh-db-sync-jzhs8\" (UID: \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\") " pod="openstack/aodh-db-sync-jzhs8" Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.853123 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-combined-ca-bundle\") pod \"aodh-db-sync-jzhs8\" (UID: \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\") " pod="openstack/aodh-db-sync-jzhs8" Mar 20 13:52:07 crc kubenswrapper[4973]: I0320 13:52:07.853919 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrnmd\" (UniqueName: \"kubernetes.io/projected/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-kube-api-access-nrnmd\") pod \"aodh-db-sync-jzhs8\" (UID: \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\") " pod="openstack/aodh-db-sync-jzhs8" Mar 20 13:52:08 crc kubenswrapper[4973]: I0320 13:52:08.017715 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-jzhs8" Mar 20 13:52:08 crc kubenswrapper[4973]: I0320 13:52:08.021275 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1712bc13-36c4-4c56-b652-e0a0bd194179" path="/var/lib/kubelet/pods/1712bc13-36c4-4c56-b652-e0a0bd194179/volumes" Mar 20 13:52:08 crc kubenswrapper[4973]: I0320 13:52:08.041625 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7607725a-3dff-46d1-ac38-f1a7a393ff80" path="/var/lib/kubelet/pods/7607725a-3dff-46d1-ac38-f1a7a393ff80/volumes" Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:08.622196 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-jzhs8"] Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:08.762575 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-76f767b74-8jwpc" Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:08.847803 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dj7hr\" (UniqueName: \"kubernetes.io/projected/2d53b7a0-4041-437e-8ec2-91013bed7135-kube-api-access-dj7hr\") pod \"2d53b7a0-4041-437e-8ec2-91013bed7135\" (UID: \"2d53b7a0-4041-437e-8ec2-91013bed7135\") " Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:08.847977 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d53b7a0-4041-437e-8ec2-91013bed7135-combined-ca-bundle\") pod \"2d53b7a0-4041-437e-8ec2-91013bed7135\" (UID: \"2d53b7a0-4041-437e-8ec2-91013bed7135\") " Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:08.848127 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d53b7a0-4041-437e-8ec2-91013bed7135-config-data-custom\") pod \"2d53b7a0-4041-437e-8ec2-91013bed7135\" (UID: \"2d53b7a0-4041-437e-8ec2-91013bed7135\") " Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:08.848200 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d53b7a0-4041-437e-8ec2-91013bed7135-config-data\") pod \"2d53b7a0-4041-437e-8ec2-91013bed7135\" (UID: \"2d53b7a0-4041-437e-8ec2-91013bed7135\") " Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:08.858062 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d53b7a0-4041-437e-8ec2-91013bed7135-kube-api-access-dj7hr" (OuterVolumeSpecName: "kube-api-access-dj7hr") pod "2d53b7a0-4041-437e-8ec2-91013bed7135" (UID: "2d53b7a0-4041-437e-8ec2-91013bed7135"). InnerVolumeSpecName "kube-api-access-dj7hr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:08.859690 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d53b7a0-4041-437e-8ec2-91013bed7135-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2d53b7a0-4041-437e-8ec2-91013bed7135" (UID: "2d53b7a0-4041-437e-8ec2-91013bed7135"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:08.899724 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d53b7a0-4041-437e-8ec2-91013bed7135-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2d53b7a0-4041-437e-8ec2-91013bed7135" (UID: "2d53b7a0-4041-437e-8ec2-91013bed7135"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:08.941350 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d53b7a0-4041-437e-8ec2-91013bed7135-config-data" (OuterVolumeSpecName: "config-data") pod "2d53b7a0-4041-437e-8ec2-91013bed7135" (UID: "2d53b7a0-4041-437e-8ec2-91013bed7135"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:08.964916 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dj7hr\" (UniqueName: \"kubernetes.io/projected/2d53b7a0-4041-437e-8ec2-91013bed7135-kube-api-access-dj7hr\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:08.965428 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2d53b7a0-4041-437e-8ec2-91013bed7135-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:08.965464 4973 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2d53b7a0-4041-437e-8ec2-91013bed7135-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:08.965476 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2d53b7a0-4041-437e-8ec2-91013bed7135-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:09.023471 4973 generic.go:334] "Generic (PLEG): container finished" podID="7673cb4f-1440-480a-8d50-2640987b8a0f" containerID="24cd30981df0087f3c87ffb550e02326d6e1d32bef0430d0047a4eb94f132454" exitCode=0 Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:09.023558 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" event={"ID":"7673cb4f-1440-480a-8d50-2640987b8a0f","Type":"ContainerDied","Data":"24cd30981df0087f3c87ffb550e02326d6e1d32bef0430d0047a4eb94f132454"} Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:09.028757 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-jzhs8" 
event={"ID":"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9","Type":"ContainerStarted","Data":"fe682b9432364561637db26ce0a93810962b9158ea72124198b7ce1d01d010bc"} Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:09.031812 4973 generic.go:334] "Generic (PLEG): container finished" podID="2d53b7a0-4041-437e-8ec2-91013bed7135" containerID="a9f9968fdfaa6970d67f09742b94586bb90e47569f37a3792ef583b10bca3ec9" exitCode=0 Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:09.031846 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-76f767b74-8jwpc" event={"ID":"2d53b7a0-4041-437e-8ec2-91013bed7135","Type":"ContainerDied","Data":"a9f9968fdfaa6970d67f09742b94586bb90e47569f37a3792ef583b10bca3ec9"} Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:09.031868 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-76f767b74-8jwpc" event={"ID":"2d53b7a0-4041-437e-8ec2-91013bed7135","Type":"ContainerDied","Data":"604bce4c7a9ef7fe4e0aaf7871ab146874e65a0659e6558fdabc7e719800e2bb"} Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:09.031886 4973 scope.go:117] "RemoveContainer" containerID="a9f9968fdfaa6970d67f09742b94586bb90e47569f37a3792ef583b10bca3ec9" Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:09.032041 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-engine-76f767b74-8jwpc" Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:09.142210 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-engine-76f767b74-8jwpc"] Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:09.143582 4973 scope.go:117] "RemoveContainer" containerID="a9f9968fdfaa6970d67f09742b94586bb90e47569f37a3792ef583b10bca3ec9" Mar 20 13:52:09 crc kubenswrapper[4973]: E0320 13:52:09.147613 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9f9968fdfaa6970d67f09742b94586bb90e47569f37a3792ef583b10bca3ec9\": container with ID starting with a9f9968fdfaa6970d67f09742b94586bb90e47569f37a3792ef583b10bca3ec9 not found: ID does not exist" containerID="a9f9968fdfaa6970d67f09742b94586bb90e47569f37a3792ef583b10bca3ec9" Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:09.147652 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9f9968fdfaa6970d67f09742b94586bb90e47569f37a3792ef583b10bca3ec9"} err="failed to get container status \"a9f9968fdfaa6970d67f09742b94586bb90e47569f37a3792ef583b10bca3ec9\": rpc error: code = NotFound desc = could not find container \"a9f9968fdfaa6970d67f09742b94586bb90e47569f37a3792ef583b10bca3ec9\": container with ID starting with a9f9968fdfaa6970d67f09742b94586bb90e47569f37a3792ef583b10bca3ec9 not found: ID does not exist" Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:09.185657 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-engine-76f767b74-8jwpc"] Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:09.964323 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:52:09 crc kubenswrapper[4973]: E0320 13:52:09.964594 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 13:52:09 crc kubenswrapper[4973]: I0320 13:52:09.984494 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d53b7a0-4041-437e-8ec2-91013bed7135" path="/var/lib/kubelet/pods/2d53b7a0-4041-437e-8ec2-91013bed7135/volumes" Mar 20 13:52:10 crc kubenswrapper[4973]: I0320 13:52:10.878211 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.042274 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7673cb4f-1440-480a-8d50-2640987b8a0f-ssh-key-openstack-edpm-ipam\") pod \"7673cb4f-1440-480a-8d50-2640987b8a0f\" (UID: \"7673cb4f-1440-480a-8d50-2640987b8a0f\") " Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.042363 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcqnr\" (UniqueName: \"kubernetes.io/projected/7673cb4f-1440-480a-8d50-2640987b8a0f-kube-api-access-wcqnr\") pod \"7673cb4f-1440-480a-8d50-2640987b8a0f\" (UID: \"7673cb4f-1440-480a-8d50-2640987b8a0f\") " Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.042415 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7673cb4f-1440-480a-8d50-2640987b8a0f-repo-setup-combined-ca-bundle\") pod \"7673cb4f-1440-480a-8d50-2640987b8a0f\" (UID: \"7673cb4f-1440-480a-8d50-2640987b8a0f\") " Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.042727 4973 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7673cb4f-1440-480a-8d50-2640987b8a0f-inventory\") pod \"7673cb4f-1440-480a-8d50-2640987b8a0f\" (UID: \"7673cb4f-1440-480a-8d50-2640987b8a0f\") " Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.051674 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7673cb4f-1440-480a-8d50-2640987b8a0f-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "7673cb4f-1440-480a-8d50-2640987b8a0f" (UID: "7673cb4f-1440-480a-8d50-2640987b8a0f"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.058576 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7673cb4f-1440-480a-8d50-2640987b8a0f-kube-api-access-wcqnr" (OuterVolumeSpecName: "kube-api-access-wcqnr") pod "7673cb4f-1440-480a-8d50-2640987b8a0f" (UID: "7673cb4f-1440-480a-8d50-2640987b8a0f"). InnerVolumeSpecName "kube-api-access-wcqnr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.110504 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7673cb4f-1440-480a-8d50-2640987b8a0f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7673cb4f-1440-480a-8d50-2640987b8a0f" (UID: "7673cb4f-1440-480a-8d50-2640987b8a0f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.115629 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7673cb4f-1440-480a-8d50-2640987b8a0f-inventory" (OuterVolumeSpecName: "inventory") pod "7673cb4f-1440-480a-8d50-2640987b8a0f" (UID: "7673cb4f-1440-480a-8d50-2640987b8a0f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.147671 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7673cb4f-1440-480a-8d50-2640987b8a0f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.147726 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcqnr\" (UniqueName: \"kubernetes.io/projected/7673cb4f-1440-480a-8d50-2640987b8a0f-kube-api-access-wcqnr\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.147740 4973 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7673cb4f-1440-480a-8d50-2640987b8a0f-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.147752 4973 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7673cb4f-1440-480a-8d50-2640987b8a0f-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.158134 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" event={"ID":"7673cb4f-1440-480a-8d50-2640987b8a0f","Type":"ContainerDied","Data":"5b619b5d5639bc86143747b410cad5e1bfd490677e197cfa3bc36dfdd7eaadae"} Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.158204 
4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5b619b5d5639bc86143747b410cad5e1bfd490677e197cfa3bc36dfdd7eaadae" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.158332 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.190831 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs"] Mar 20 13:52:11 crc kubenswrapper[4973]: E0320 13:52:11.191520 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d53b7a0-4041-437e-8ec2-91013bed7135" containerName="heat-engine" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.191536 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d53b7a0-4041-437e-8ec2-91013bed7135" containerName="heat-engine" Mar 20 13:52:11 crc kubenswrapper[4973]: E0320 13:52:11.191564 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7673cb4f-1440-480a-8d50-2640987b8a0f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.191572 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="7673cb4f-1440-480a-8d50-2640987b8a0f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.192457 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="7673cb4f-1440-480a-8d50-2640987b8a0f" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.192489 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d53b7a0-4041-437e-8ec2-91013bed7135" containerName="heat-engine" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.193323 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.202572 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.202777 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d4wjc" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.202969 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.203107 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.223251 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs"] Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.353906 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1eea56a-055e-400a-8300-cde71ecad667-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-82xbs\" (UID: \"c1eea56a-055e-400a-8300-cde71ecad667\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.354004 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1eea56a-055e-400a-8300-cde71ecad667-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-82xbs\" (UID: \"c1eea56a-055e-400a-8300-cde71ecad667\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.354097 4973 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f87l5\" (UniqueName: \"kubernetes.io/projected/c1eea56a-055e-400a-8300-cde71ecad667-kube-api-access-f87l5\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-82xbs\" (UID: \"c1eea56a-055e-400a-8300-cde71ecad667\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.464776 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1eea56a-055e-400a-8300-cde71ecad667-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-82xbs\" (UID: \"c1eea56a-055e-400a-8300-cde71ecad667\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.465318 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1eea56a-055e-400a-8300-cde71ecad667-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-82xbs\" (UID: \"c1eea56a-055e-400a-8300-cde71ecad667\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.465490 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f87l5\" (UniqueName: \"kubernetes.io/projected/c1eea56a-055e-400a-8300-cde71ecad667-kube-api-access-f87l5\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-82xbs\" (UID: \"c1eea56a-055e-400a-8300-cde71ecad667\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.475597 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1eea56a-055e-400a-8300-cde71ecad667-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-82xbs\" (UID: 
\"c1eea56a-055e-400a-8300-cde71ecad667\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.483960 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1eea56a-055e-400a-8300-cde71ecad667-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-82xbs\" (UID: \"c1eea56a-055e-400a-8300-cde71ecad667\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.511026 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f87l5\" (UniqueName: \"kubernetes.io/projected/c1eea56a-055e-400a-8300-cde71ecad667-kube-api-access-f87l5\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-82xbs\" (UID: \"c1eea56a-055e-400a-8300-cde71ecad667\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs" Mar 20 13:52:11 crc kubenswrapper[4973]: I0320 13:52:11.562148 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs" Mar 20 13:52:12 crc kubenswrapper[4973]: I0320 13:52:12.159962 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs"] Mar 20 13:52:14 crc kubenswrapper[4973]: W0320 13:52:14.527680 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1eea56a_055e_400a_8300_cde71ecad667.slice/crio-dda79f580ef93bcd6b8145ec44b51b3f5a50757f4358873d2c9b1ecc8e841624 WatchSource:0}: Error finding container dda79f580ef93bcd6b8145ec44b51b3f5a50757f4358873d2c9b1ecc8e841624: Status 404 returned error can't find the container with id dda79f580ef93bcd6b8145ec44b51b3f5a50757f4358873d2c9b1ecc8e841624 Mar 20 13:52:15 crc kubenswrapper[4973]: I0320 13:52:15.222733 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs" event={"ID":"c1eea56a-055e-400a-8300-cde71ecad667","Type":"ContainerStarted","Data":"dda79f580ef93bcd6b8145ec44b51b3f5a50757f4358873d2c9b1ecc8e841624"} Mar 20 13:52:15 crc kubenswrapper[4973]: I0320 13:52:15.227276 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-jzhs8" event={"ID":"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9","Type":"ContainerStarted","Data":"af88b9782db03543db3db15178949f15507b71f0fc9b6eb4b591811a54e9f89b"} Mar 20 13:52:15 crc kubenswrapper[4973]: I0320 13:52:15.249659 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs" podStartSLOduration=3.805257858 podStartE2EDuration="4.249638985s" podCreationTimestamp="2026-03-20 13:52:11 +0000 UTC" firstStartedPulling="2026-03-20 13:52:14.533034444 +0000 UTC m=+1855.276704198" lastFinishedPulling="2026-03-20 13:52:14.977415581 +0000 UTC m=+1855.721085325" observedRunningTime="2026-03-20 13:52:15.236291661 +0000 UTC 
m=+1855.979961405" watchObservedRunningTime="2026-03-20 13:52:15.249638985 +0000 UTC m=+1855.993308729" Mar 20 13:52:15 crc kubenswrapper[4973]: I0320 13:52:15.275514 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-jzhs8" podStartSLOduration=2.347920053 podStartE2EDuration="8.275493752s" podCreationTimestamp="2026-03-20 13:52:07 +0000 UTC" firstStartedPulling="2026-03-20 13:52:08.657410174 +0000 UTC m=+1849.401079918" lastFinishedPulling="2026-03-20 13:52:14.584983873 +0000 UTC m=+1855.328653617" observedRunningTime="2026-03-20 13:52:15.261101639 +0000 UTC m=+1856.004771403" watchObservedRunningTime="2026-03-20 13:52:15.275493752 +0000 UTC m=+1856.019163496" Mar 20 13:52:16 crc kubenswrapper[4973]: I0320 13:52:16.239869 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs" event={"ID":"c1eea56a-055e-400a-8300-cde71ecad667","Type":"ContainerStarted","Data":"eea502f405e4fd7eab45a21e84a9d3983abf58aa6ebaba1beab4fea2b7784fe0"} Mar 20 13:52:17 crc kubenswrapper[4973]: I0320 13:52:17.484578 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-2" Mar 20 13:52:17 crc kubenswrapper[4973]: I0320 13:52:17.530984 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 20 13:52:19 crc kubenswrapper[4973]: I0320 13:52:19.275887 4973 generic.go:334] "Generic (PLEG): container finished" podID="c1eea56a-055e-400a-8300-cde71ecad667" containerID="eea502f405e4fd7eab45a21e84a9d3983abf58aa6ebaba1beab4fea2b7784fe0" exitCode=0 Mar 20 13:52:19 crc kubenswrapper[4973]: I0320 13:52:19.275944 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs" event={"ID":"c1eea56a-055e-400a-8300-cde71ecad667","Type":"ContainerDied","Data":"eea502f405e4fd7eab45a21e84a9d3983abf58aa6ebaba1beab4fea2b7784fe0"} Mar 20 13:52:19 crc 
kubenswrapper[4973]: I0320 13:52:19.279332 4973 generic.go:334] "Generic (PLEG): container finished" podID="f0b8e77b-1b7f-47ab-817f-54a700c8d2d9" containerID="af88b9782db03543db3db15178949f15507b71f0fc9b6eb4b591811a54e9f89b" exitCode=0 Mar 20 13:52:19 crc kubenswrapper[4973]: I0320 13:52:19.279388 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-jzhs8" event={"ID":"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9","Type":"ContainerDied","Data":"af88b9782db03543db3db15178949f15507b71f0fc9b6eb4b591811a54e9f89b"} Mar 20 13:52:20 crc kubenswrapper[4973]: I0320 13:52:20.940557 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-jzhs8" Mar 20 13:52:20 crc kubenswrapper[4973]: I0320 13:52:20.947196 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs" Mar 20 13:52:20 crc kubenswrapper[4973]: I0320 13:52:20.954052 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:52:20 crc kubenswrapper[4973]: E0320 13:52:20.954813 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.039097 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1eea56a-055e-400a-8300-cde71ecad667-inventory\") pod \"c1eea56a-055e-400a-8300-cde71ecad667\" (UID: \"c1eea56a-055e-400a-8300-cde71ecad667\") " Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 
13:52:21.039145 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-scripts\") pod \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\" (UID: \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\") " Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.039167 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f87l5\" (UniqueName: \"kubernetes.io/projected/c1eea56a-055e-400a-8300-cde71ecad667-kube-api-access-f87l5\") pod \"c1eea56a-055e-400a-8300-cde71ecad667\" (UID: \"c1eea56a-055e-400a-8300-cde71ecad667\") " Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.039321 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-config-data\") pod \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\" (UID: \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\") " Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.039438 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-combined-ca-bundle\") pod \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\" (UID: \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\") " Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.039510 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrnmd\" (UniqueName: \"kubernetes.io/projected/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-kube-api-access-nrnmd\") pod \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\" (UID: \"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9\") " Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.039603 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/c1eea56a-055e-400a-8300-cde71ecad667-ssh-key-openstack-edpm-ipam\") pod \"c1eea56a-055e-400a-8300-cde71ecad667\" (UID: \"c1eea56a-055e-400a-8300-cde71ecad667\") " Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.050731 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-kube-api-access-nrnmd" (OuterVolumeSpecName: "kube-api-access-nrnmd") pod "f0b8e77b-1b7f-47ab-817f-54a700c8d2d9" (UID: "f0b8e77b-1b7f-47ab-817f-54a700c8d2d9"). InnerVolumeSpecName "kube-api-access-nrnmd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.050878 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-scripts" (OuterVolumeSpecName: "scripts") pod "f0b8e77b-1b7f-47ab-817f-54a700c8d2d9" (UID: "f0b8e77b-1b7f-47ab-817f-54a700c8d2d9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.057682 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1eea56a-055e-400a-8300-cde71ecad667-kube-api-access-f87l5" (OuterVolumeSpecName: "kube-api-access-f87l5") pod "c1eea56a-055e-400a-8300-cde71ecad667" (UID: "c1eea56a-055e-400a-8300-cde71ecad667"). InnerVolumeSpecName "kube-api-access-f87l5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.081953 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-config-data" (OuterVolumeSpecName: "config-data") pod "f0b8e77b-1b7f-47ab-817f-54a700c8d2d9" (UID: "f0b8e77b-1b7f-47ab-817f-54a700c8d2d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.083350 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f0b8e77b-1b7f-47ab-817f-54a700c8d2d9" (UID: "f0b8e77b-1b7f-47ab-817f-54a700c8d2d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.087666 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1eea56a-055e-400a-8300-cde71ecad667-inventory" (OuterVolumeSpecName: "inventory") pod "c1eea56a-055e-400a-8300-cde71ecad667" (UID: "c1eea56a-055e-400a-8300-cde71ecad667"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.087680 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1eea56a-055e-400a-8300-cde71ecad667-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c1eea56a-055e-400a-8300-cde71ecad667" (UID: "c1eea56a-055e-400a-8300-cde71ecad667"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.142756 4973 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c1eea56a-055e-400a-8300-cde71ecad667-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.142949 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.143044 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f87l5\" (UniqueName: \"kubernetes.io/projected/c1eea56a-055e-400a-8300-cde71ecad667-kube-api-access-f87l5\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.143111 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.143175 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.143236 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrnmd\" (UniqueName: \"kubernetes.io/projected/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9-kube-api-access-nrnmd\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.143301 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c1eea56a-055e-400a-8300-cde71ecad667-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 
13:52:21.302990 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-jzhs8" event={"ID":"f0b8e77b-1b7f-47ab-817f-54a700c8d2d9","Type":"ContainerDied","Data":"fe682b9432364561637db26ce0a93810962b9158ea72124198b7ce1d01d010bc"} Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.303018 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-jzhs8" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.303038 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fe682b9432364561637db26ce0a93810962b9158ea72124198b7ce1d01d010bc" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.305309 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs" event={"ID":"c1eea56a-055e-400a-8300-cde71ecad667","Type":"ContainerDied","Data":"dda79f580ef93bcd6b8145ec44b51b3f5a50757f4358873d2c9b1ecc8e841624"} Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.305369 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dda79f580ef93bcd6b8145ec44b51b3f5a50757f4358873d2c9b1ecc8e841624" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.305425 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-82xbs" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.402509 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl"] Mar 20 13:52:21 crc kubenswrapper[4973]: E0320 13:52:21.403149 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1eea56a-055e-400a-8300-cde71ecad667" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.403174 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1eea56a-055e-400a-8300-cde71ecad667" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 13:52:21 crc kubenswrapper[4973]: E0320 13:52:21.403215 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0b8e77b-1b7f-47ab-817f-54a700c8d2d9" containerName="aodh-db-sync" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.403224 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0b8e77b-1b7f-47ab-817f-54a700c8d2d9" containerName="aodh-db-sync" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.403531 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1eea56a-055e-400a-8300-cde71ecad667" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.403567 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="f0b8e77b-1b7f-47ab-817f-54a700c8d2d9" containerName="aodh-db-sync" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.404596 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.424075 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d4wjc" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.424137 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.424265 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.424449 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.436791 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl"] Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.552267 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dfa03f4-98c0-4122-8fbb-abeba13439f0-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl\" (UID: \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.552330 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmlvd\" (UniqueName: \"kubernetes.io/projected/8dfa03f4-98c0-4122-8fbb-abeba13439f0-kube-api-access-nmlvd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl\" (UID: \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.552514 4973 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8dfa03f4-98c0-4122-8fbb-abeba13439f0-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl\" (UID: \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.552580 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfa03f4-98c0-4122-8fbb-abeba13439f0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl\" (UID: \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.654297 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8dfa03f4-98c0-4122-8fbb-abeba13439f0-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl\" (UID: \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.654381 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfa03f4-98c0-4122-8fbb-abeba13439f0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl\" (UID: \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.654524 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/8dfa03f4-98c0-4122-8fbb-abeba13439f0-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl\" (UID: \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.654550 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmlvd\" (UniqueName: \"kubernetes.io/projected/8dfa03f4-98c0-4122-8fbb-abeba13439f0-kube-api-access-nmlvd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl\" (UID: \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.658790 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8dfa03f4-98c0-4122-8fbb-abeba13439f0-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl\" (UID: \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.659582 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfa03f4-98c0-4122-8fbb-abeba13439f0-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl\" (UID: \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.662703 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dfa03f4-98c0-4122-8fbb-abeba13439f0-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl\" (UID: \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.670087 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmlvd\" (UniqueName: \"kubernetes.io/projected/8dfa03f4-98c0-4122-8fbb-abeba13439f0-kube-api-access-nmlvd\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl\" (UID: \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" Mar 20 13:52:21 crc kubenswrapper[4973]: I0320 13:52:21.733973 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" Mar 20 13:52:22 crc kubenswrapper[4973]: I0320 13:52:22.088807 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-1" podUID="797b38f5-d9a7-4f82-bd12-e40e021ef28e" containerName="rabbitmq" containerID="cri-o://d6ddcd3aa7658bceaec6103da5780aeff16abb7dbb7606fe4e686cb5378fa9bb" gracePeriod=604796 Mar 20 13:52:22 crc kubenswrapper[4973]: I0320 13:52:22.319949 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl"] Mar 20 13:52:22 crc kubenswrapper[4973]: I0320 13:52:22.822601 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 20 13:52:22 crc kubenswrapper[4973]: I0320 13:52:22.822879 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerName="aodh-api" containerID="cri-o://0f6a751e8f966556681f7f43b659c9eb663f2754d4df283526610a95879c0d5a" gracePeriod=30 Mar 20 13:52:22 crc kubenswrapper[4973]: I0320 13:52:22.823473 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerName="aodh-listener" 
containerID="cri-o://fef3925f94ce619168301d5782a0cd7e4d4781173dd9337f0878b67b117cb10a" gracePeriod=30 Mar 20 13:52:22 crc kubenswrapper[4973]: I0320 13:52:22.823535 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerName="aodh-notifier" containerID="cri-o://b75848de0126f937d24f5fb5efcc89558ec1b45db617456de33c25662975c984" gracePeriod=30 Mar 20 13:52:22 crc kubenswrapper[4973]: I0320 13:52:22.823711 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/aodh-0" podUID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerName="aodh-evaluator" containerID="cri-o://c8eac8c778b02c5ab14e1bfc1ce1c5ecce5659d774e5631b2eb49413a548c16d" gracePeriod=30 Mar 20 13:52:23 crc kubenswrapper[4973]: I0320 13:52:23.329490 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" event={"ID":"8dfa03f4-98c0-4122-8fbb-abeba13439f0","Type":"ContainerStarted","Data":"867206f0de03d87710aaf1e40085adb675e039dc7e9c84bcca844e6fbde582d4"} Mar 20 13:52:23 crc kubenswrapper[4973]: I0320 13:52:23.329971 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" event={"ID":"8dfa03f4-98c0-4122-8fbb-abeba13439f0","Type":"ContainerStarted","Data":"b56bb6f84793e142dc07ce532e1318b489f92063b3cfbc40abd82171f457803a"} Mar 20 13:52:23 crc kubenswrapper[4973]: I0320 13:52:23.331697 4973 generic.go:334] "Generic (PLEG): container finished" podID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerID="0f6a751e8f966556681f7f43b659c9eb663f2754d4df283526610a95879c0d5a" exitCode=0 Mar 20 13:52:23 crc kubenswrapper[4973]: I0320 13:52:23.331747 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" 
event={"ID":"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6","Type":"ContainerDied","Data":"0f6a751e8f966556681f7f43b659c9eb663f2754d4df283526610a95879c0d5a"} Mar 20 13:52:23 crc kubenswrapper[4973]: I0320 13:52:23.361846 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" podStartSLOduration=1.793293303 podStartE2EDuration="2.361827581s" podCreationTimestamp="2026-03-20 13:52:21 +0000 UTC" firstStartedPulling="2026-03-20 13:52:22.322934876 +0000 UTC m=+1863.066604620" lastFinishedPulling="2026-03-20 13:52:22.891469154 +0000 UTC m=+1863.635138898" observedRunningTime="2026-03-20 13:52:23.355282433 +0000 UTC m=+1864.098952197" watchObservedRunningTime="2026-03-20 13:52:23.361827581 +0000 UTC m=+1864.105497325" Mar 20 13:52:24 crc kubenswrapper[4973]: I0320 13:52:24.345160 4973 generic.go:334] "Generic (PLEG): container finished" podID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerID="c8eac8c778b02c5ab14e1bfc1ce1c5ecce5659d774e5631b2eb49413a548c16d" exitCode=0 Mar 20 13:52:24 crc kubenswrapper[4973]: I0320 13:52:24.345313 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6","Type":"ContainerDied","Data":"c8eac8c778b02c5ab14e1bfc1ce1c5ecce5659d774e5631b2eb49413a548c16d"} Mar 20 13:52:25 crc kubenswrapper[4973]: I0320 13:52:25.361522 4973 generic.go:334] "Generic (PLEG): container finished" podID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerID="b75848de0126f937d24f5fb5efcc89558ec1b45db617456de33c25662975c984" exitCode=0 Mar 20 13:52:25 crc kubenswrapper[4973]: I0320 13:52:25.361587 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6","Type":"ContainerDied","Data":"b75848de0126f937d24f5fb5efcc89558ec1b45db617456de33c25662975c984"} Mar 20 13:52:25 crc kubenswrapper[4973]: I0320 13:52:25.475464 4973 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openstack/rabbitmq-server-1" podUID="797b38f5-d9a7-4f82-bd12-e40e021ef28e" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.134:5671: connect: connection refused" Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.894779 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.964699 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/797b38f5-d9a7-4f82-bd12-e40e021ef28e-plugins-conf\") pod \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.965136 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-erlang-cookie\") pod \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.965186 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-tls\") pod \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.965217 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/797b38f5-d9a7-4f82-bd12-e40e021ef28e-config-data\") pod \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.965463 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-confd\") pod \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.965506 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/797b38f5-d9a7-4f82-bd12-e40e021ef28e-erlang-cookie-secret\") pod \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.965551 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/797b38f5-d9a7-4f82-bd12-e40e021ef28e-server-conf\") pod \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.965892 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/797b38f5-d9a7-4f82-bd12-e40e021ef28e-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "797b38f5-d9a7-4f82-bd12-e40e021ef28e" (UID: "797b38f5-d9a7-4f82-bd12-e40e021ef28e"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.966146 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\") pod \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.966246 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klrdc\" (UniqueName: \"kubernetes.io/projected/797b38f5-d9a7-4f82-bd12-e40e021ef28e-kube-api-access-klrdc\") pod \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.966302 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/797b38f5-d9a7-4f82-bd12-e40e021ef28e-pod-info\") pod \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.966468 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-plugins\") pod \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\" (UID: \"797b38f5-d9a7-4f82-bd12-e40e021ef28e\") " Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.967358 4973 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/797b38f5-d9a7-4f82-bd12-e40e021ef28e-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.967821 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-plugins" (OuterVolumeSpecName: 
"rabbitmq-plugins") pod "797b38f5-d9a7-4f82-bd12-e40e021ef28e" (UID: "797b38f5-d9a7-4f82-bd12-e40e021ef28e"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.968043 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "797b38f5-d9a7-4f82-bd12-e40e021ef28e" (UID: "797b38f5-d9a7-4f82-bd12-e40e021ef28e"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.972641 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/797b38f5-d9a7-4f82-bd12-e40e021ef28e-pod-info" (OuterVolumeSpecName: "pod-info") pod "797b38f5-d9a7-4f82-bd12-e40e021ef28e" (UID: "797b38f5-d9a7-4f82-bd12-e40e021ef28e"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.981120 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/797b38f5-d9a7-4f82-bd12-e40e021ef28e-kube-api-access-klrdc" (OuterVolumeSpecName: "kube-api-access-klrdc") pod "797b38f5-d9a7-4f82-bd12-e40e021ef28e" (UID: "797b38f5-d9a7-4f82-bd12-e40e021ef28e"). InnerVolumeSpecName "kube-api-access-klrdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.982217 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/797b38f5-d9a7-4f82-bd12-e40e021ef28e-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "797b38f5-d9a7-4f82-bd12-e40e021ef28e" (UID: "797b38f5-d9a7-4f82-bd12-e40e021ef28e"). InnerVolumeSpecName "erlang-cookie-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:28 crc kubenswrapper[4973]: I0320 13:52:28.996567 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "797b38f5-d9a7-4f82-bd12-e40e021ef28e" (UID: "797b38f5-d9a7-4f82-bd12-e40e021ef28e"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.033361 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f" (OuterVolumeSpecName: "persistence") pod "797b38f5-d9a7-4f82-bd12-e40e021ef28e" (UID: "797b38f5-d9a7-4f82-bd12-e40e021ef28e"). InnerVolumeSpecName "pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.043515 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/797b38f5-d9a7-4f82-bd12-e40e021ef28e-config-data" (OuterVolumeSpecName: "config-data") pod "797b38f5-d9a7-4f82-bd12-e40e021ef28e" (UID: "797b38f5-d9a7-4f82-bd12-e40e021ef28e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.076025 4973 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\") on node \"crc\" " Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.076074 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klrdc\" (UniqueName: \"kubernetes.io/projected/797b38f5-d9a7-4f82-bd12-e40e021ef28e-kube-api-access-klrdc\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.076086 4973 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/797b38f5-d9a7-4f82-bd12-e40e021ef28e-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.076098 4973 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.076107 4973 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.076116 4973 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.076126 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/797b38f5-d9a7-4f82-bd12-e40e021ef28e-config-data\") on node \"crc\" DevicePath \"\"" 
Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.076140 4973 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/797b38f5-d9a7-4f82-bd12-e40e021ef28e-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.113705 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/797b38f5-d9a7-4f82-bd12-e40e021ef28e-server-conf" (OuterVolumeSpecName: "server-conf") pod "797b38f5-d9a7-4f82-bd12-e40e021ef28e" (UID: "797b38f5-d9a7-4f82-bd12-e40e021ef28e"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.125179 4973 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.125329 4973 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f") on node "crc" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.178972 4973 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/797b38f5-d9a7-4f82-bd12-e40e021ef28e-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.179326 4973 reconciler_common.go:293] "Volume detached for volume \"pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.205845 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-confd" 
(OuterVolumeSpecName: "rabbitmq-confd") pod "797b38f5-d9a7-4f82-bd12-e40e021ef28e" (UID: "797b38f5-d9a7-4f82-bd12-e40e021ef28e"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.283318 4973 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/797b38f5-d9a7-4f82-bd12-e40e021ef28e-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.415247 4973 generic.go:334] "Generic (PLEG): container finished" podID="797b38f5-d9a7-4f82-bd12-e40e021ef28e" containerID="d6ddcd3aa7658bceaec6103da5780aeff16abb7dbb7606fe4e686cb5378fa9bb" exitCode=0 Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.415298 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"797b38f5-d9a7-4f82-bd12-e40e021ef28e","Type":"ContainerDied","Data":"d6ddcd3aa7658bceaec6103da5780aeff16abb7dbb7606fe4e686cb5378fa9bb"} Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.415320 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.415352 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"797b38f5-d9a7-4f82-bd12-e40e021ef28e","Type":"ContainerDied","Data":"c88911f10d54bcd424c533ed9559a1a1dd6ad0e4525526d2be28da8bb9fb5cb9"} Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.415380 4973 scope.go:117] "RemoveContainer" containerID="d6ddcd3aa7658bceaec6103da5780aeff16abb7dbb7606fe4e686cb5378fa9bb" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.488358 4973 scope.go:117] "RemoveContainer" containerID="43a6b2817e43c4fa48cce3c4bb7ae49ce43b81d1e67941c6e11676db49de2111" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.502514 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.524395 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.545228 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-1"] Mar 20 13:52:29 crc kubenswrapper[4973]: E0320 13:52:29.545880 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="797b38f5-d9a7-4f82-bd12-e40e021ef28e" containerName="setup-container" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.545899 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="797b38f5-d9a7-4f82-bd12-e40e021ef28e" containerName="setup-container" Mar 20 13:52:29 crc kubenswrapper[4973]: E0320 13:52:29.545927 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="797b38f5-d9a7-4f82-bd12-e40e021ef28e" containerName="rabbitmq" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.545935 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="797b38f5-d9a7-4f82-bd12-e40e021ef28e" containerName="rabbitmq" Mar 20 13:52:29 crc kubenswrapper[4973]: 
I0320 13:52:29.546197 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="797b38f5-d9a7-4f82-bd12-e40e021ef28e" containerName="rabbitmq" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.547496 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.558501 4973 scope.go:117] "RemoveContainer" containerID="d6ddcd3aa7658bceaec6103da5780aeff16abb7dbb7606fe4e686cb5378fa9bb" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.558795 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 20 13:52:29 crc kubenswrapper[4973]: E0320 13:52:29.563566 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6ddcd3aa7658bceaec6103da5780aeff16abb7dbb7606fe4e686cb5378fa9bb\": container with ID starting with d6ddcd3aa7658bceaec6103da5780aeff16abb7dbb7606fe4e686cb5378fa9bb not found: ID does not exist" containerID="d6ddcd3aa7658bceaec6103da5780aeff16abb7dbb7606fe4e686cb5378fa9bb" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.563613 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6ddcd3aa7658bceaec6103da5780aeff16abb7dbb7606fe4e686cb5378fa9bb"} err="failed to get container status \"d6ddcd3aa7658bceaec6103da5780aeff16abb7dbb7606fe4e686cb5378fa9bb\": rpc error: code = NotFound desc = could not find container \"d6ddcd3aa7658bceaec6103da5780aeff16abb7dbb7606fe4e686cb5378fa9bb\": container with ID starting with d6ddcd3aa7658bceaec6103da5780aeff16abb7dbb7606fe4e686cb5378fa9bb not found: ID does not exist" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.563641 4973 scope.go:117] "RemoveContainer" containerID="43a6b2817e43c4fa48cce3c4bb7ae49ce43b81d1e67941c6e11676db49de2111" Mar 20 13:52:29 crc kubenswrapper[4973]: E0320 13:52:29.564355 4973 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43a6b2817e43c4fa48cce3c4bb7ae49ce43b81d1e67941c6e11676db49de2111\": container with ID starting with 43a6b2817e43c4fa48cce3c4bb7ae49ce43b81d1e67941c6e11676db49de2111 not found: ID does not exist" containerID="43a6b2817e43c4fa48cce3c4bb7ae49ce43b81d1e67941c6e11676db49de2111" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.564377 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43a6b2817e43c4fa48cce3c4bb7ae49ce43b81d1e67941c6e11676db49de2111"} err="failed to get container status \"43a6b2817e43c4fa48cce3c4bb7ae49ce43b81d1e67941c6e11676db49de2111\": rpc error: code = NotFound desc = could not find container \"43a6b2817e43c4fa48cce3c4bb7ae49ce43b81d1e67941c6e11676db49de2111\": container with ID starting with 43a6b2817e43c4fa48cce3c4bb7ae49ce43b81d1e67941c6e11676db49de2111 not found: ID does not exist" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.700818 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/22bad2d1-9031-42e8-882b-f3cebea8db32-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.701201 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/22bad2d1-9031-42e8-882b-f3cebea8db32-pod-info\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.701242 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22bad2d1-9031-42e8-882b-f3cebea8db32-config-data\") pod 
\"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.701536 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbjvb\" (UniqueName: \"kubernetes.io/projected/22bad2d1-9031-42e8-882b-f3cebea8db32-kube-api-access-tbjvb\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.701660 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/22bad2d1-9031-42e8-882b-f3cebea8db32-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.701921 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.701972 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/22bad2d1-9031-42e8-882b-f3cebea8db32-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.702025 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/22bad2d1-9031-42e8-882b-f3cebea8db32-server-conf\") pod 
\"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.702074 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/22bad2d1-9031-42e8-882b-f3cebea8db32-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.702220 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/22bad2d1-9031-42e8-882b-f3cebea8db32-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.702363 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/22bad2d1-9031-42e8-882b-f3cebea8db32-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.805005 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/22bad2d1-9031-42e8-882b-f3cebea8db32-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.805080 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/22bad2d1-9031-42e8-882b-f3cebea8db32-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " 
pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.805133 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/22bad2d1-9031-42e8-882b-f3cebea8db32-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.805157 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/22bad2d1-9031-42e8-882b-f3cebea8db32-pod-info\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.806020 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22bad2d1-9031-42e8-882b-f3cebea8db32-config-data\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.806178 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbjvb\" (UniqueName: \"kubernetes.io/projected/22bad2d1-9031-42e8-882b-f3cebea8db32-kube-api-access-tbjvb\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.806223 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/22bad2d1-9031-42e8-882b-f3cebea8db32-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.806331 4973 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/22bad2d1-9031-42e8-882b-f3cebea8db32-plugins-conf\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.806368 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.806439 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/22bad2d1-9031-42e8-882b-f3cebea8db32-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.806481 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/22bad2d1-9031-42e8-882b-f3cebea8db32-server-conf\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.806523 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/22bad2d1-9031-42e8-882b-f3cebea8db32-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.806904 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/22bad2d1-9031-42e8-882b-f3cebea8db32-config-data\") pod 
\"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.807253 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/22bad2d1-9031-42e8-882b-f3cebea8db32-rabbitmq-plugins\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.808207 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/22bad2d1-9031-42e8-882b-f3cebea8db32-server-conf\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.808616 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/22bad2d1-9031-42e8-882b-f3cebea8db32-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.810190 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/22bad2d1-9031-42e8-882b-f3cebea8db32-erlang-cookie-secret\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.810416 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.810453 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/70e0bd6723351d45432d0557e984e104e23058730f41a4be8f1efa0e578a3f37/globalmount\"" pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.810664 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/22bad2d1-9031-42e8-882b-f3cebea8db32-rabbitmq-confd\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.812506 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/22bad2d1-9031-42e8-882b-f3cebea8db32-pod-info\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.822649 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/22bad2d1-9031-42e8-882b-f3cebea8db32-rabbitmq-tls\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.824902 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbjvb\" (UniqueName: \"kubernetes.io/projected/22bad2d1-9031-42e8-882b-f3cebea8db32-kube-api-access-tbjvb\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " 
pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.871781 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-c6474f3c-a7e9-419a-baaa-854971e6b73f\") pod \"rabbitmq-server-1\" (UID: \"22bad2d1-9031-42e8-882b-f3cebea8db32\") " pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.914044 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-1" Mar 20 13:52:29 crc kubenswrapper[4973]: I0320 13:52:29.966150 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="797b38f5-d9a7-4f82-bd12-e40e021ef28e" path="/var/lib/kubelet/pods/797b38f5-d9a7-4f82-bd12-e40e021ef28e/volumes" Mar 20 13:52:30 crc kubenswrapper[4973]: I0320 13:52:30.473835 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-1"] Mar 20 13:52:31 crc kubenswrapper[4973]: I0320 13:52:31.438969 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"22bad2d1-9031-42e8-882b-f3cebea8db32","Type":"ContainerStarted","Data":"d2d3079db42889c33ab29f340d4d2e39599df8a0c7e9371f556d1ced2d8abb14"} Mar 20 13:52:31 crc kubenswrapper[4973]: I0320 13:52:31.441684 4973 generic.go:334] "Generic (PLEG): container finished" podID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerID="fef3925f94ce619168301d5782a0cd7e4d4781173dd9337f0878b67b117cb10a" exitCode=0 Mar 20 13:52:31 crc kubenswrapper[4973]: I0320 13:52:31.441711 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6","Type":"ContainerDied","Data":"fef3925f94ce619168301d5782a0cd7e4d4781173dd9337f0878b67b117cb10a"} Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.123872 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.265283 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dppmk\" (UniqueName: \"kubernetes.io/projected/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-kube-api-access-dppmk\") pod \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.265438 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-combined-ca-bundle\") pod \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.265949 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-internal-tls-certs\") pod \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.266054 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-config-data\") pod \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.266411 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-public-tls-certs\") pod \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.266461 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-scripts\") pod \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\" (UID: \"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6\") " Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.287095 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-kube-api-access-dppmk" (OuterVolumeSpecName: "kube-api-access-dppmk") pod "54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" (UID: "54c36469-89aa-4fbc-bc88-22c3dc8b0bc6"). InnerVolumeSpecName "kube-api-access-dppmk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.288485 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-scripts" (OuterVolumeSpecName: "scripts") pod "54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" (UID: "54c36469-89aa-4fbc-bc88-22c3dc8b0bc6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.345602 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" (UID: "54c36469-89aa-4fbc-bc88-22c3dc8b0bc6"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.356379 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" (UID: "54c36469-89aa-4fbc-bc88-22c3dc8b0bc6"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.370466 4973 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.370507 4973 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-scripts\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.370519 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dppmk\" (UniqueName: \"kubernetes.io/projected/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-kube-api-access-dppmk\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.370530 4973 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.423563 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-config-data" (OuterVolumeSpecName: "config-data") pod "54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" (UID: "54c36469-89aa-4fbc-bc88-22c3dc8b0bc6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.446305 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" (UID: "54c36469-89aa-4fbc-bc88-22c3dc8b0bc6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.452832 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"22bad2d1-9031-42e8-882b-f3cebea8db32","Type":"ContainerStarted","Data":"788832a09866e1f2ef2c7ec4670131accb2e7b47eeafa2a1ffe7f04354aae695"} Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.456507 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"54c36469-89aa-4fbc-bc88-22c3dc8b0bc6","Type":"ContainerDied","Data":"1f5e3550820929c0cb49a37397e03409f9a72524e09dd349767ec75d6838e767"} Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.456857 4973 scope.go:117] "RemoveContainer" containerID="fef3925f94ce619168301d5782a0cd7e4d4781173dd9337f0878b67b117cb10a" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.456614 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.473191 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.473217 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.496518 4973 scope.go:117] "RemoveContainer" containerID="b75848de0126f937d24f5fb5efcc89558ec1b45db617456de33c25662975c984" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.532933 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-0"] Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.536234 4973 scope.go:117] "RemoveContainer" 
containerID="c8eac8c778b02c5ab14e1bfc1ce1c5ecce5659d774e5631b2eb49413a548c16d" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.562329 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-0"] Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.577543 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Mar 20 13:52:32 crc kubenswrapper[4973]: E0320 13:52:32.578211 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerName="aodh-api" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.578231 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerName="aodh-api" Mar 20 13:52:32 crc kubenswrapper[4973]: E0320 13:52:32.578246 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerName="aodh-notifier" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.578252 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerName="aodh-notifier" Mar 20 13:52:32 crc kubenswrapper[4973]: E0320 13:52:32.578291 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerName="aodh-evaluator" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.578301 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerName="aodh-evaluator" Mar 20 13:52:32 crc kubenswrapper[4973]: E0320 13:52:32.578332 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerName="aodh-listener" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.578362 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerName="aodh-listener" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.578637 4973 
memory_manager.go:354] "RemoveStaleState removing state" podUID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerName="aodh-listener" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.578658 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerName="aodh-evaluator" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.578676 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerName="aodh-notifier" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.578698 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" containerName="aodh-api" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.605898 4973 scope.go:117] "RemoveContainer" containerID="0f6a751e8f966556681f7f43b659c9eb663f2754d4df283526610a95879c0d5a" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.623723 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.632640 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-internal-svc" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.635606 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-aodh-public-svc" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.636558 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.636813 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.637063 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-x7tj8" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.640726 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.678212 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a105da-e1bf-4c7a-aabb-b81defe003af-combined-ca-bundle\") pod \"aodh-0\" (UID: \"12a105da-e1bf-4c7a-aabb-b81defe003af\") " pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.678253 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a105da-e1bf-4c7a-aabb-b81defe003af-public-tls-certs\") pod \"aodh-0\" (UID: \"12a105da-e1bf-4c7a-aabb-b81defe003af\") " pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.678294 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/12a105da-e1bf-4c7a-aabb-b81defe003af-config-data\") pod \"aodh-0\" (UID: \"12a105da-e1bf-4c7a-aabb-b81defe003af\") " pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.678406 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bz28s\" (UniqueName: \"kubernetes.io/projected/12a105da-e1bf-4c7a-aabb-b81defe003af-kube-api-access-bz28s\") pod \"aodh-0\" (UID: \"12a105da-e1bf-4c7a-aabb-b81defe003af\") " pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.678428 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12a105da-e1bf-4c7a-aabb-b81defe003af-scripts\") pod \"aodh-0\" (UID: \"12a105da-e1bf-4c7a-aabb-b81defe003af\") " pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.678451 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a105da-e1bf-4c7a-aabb-b81defe003af-internal-tls-certs\") pod \"aodh-0\" (UID: \"12a105da-e1bf-4c7a-aabb-b81defe003af\") " pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.780380 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bz28s\" (UniqueName: \"kubernetes.io/projected/12a105da-e1bf-4c7a-aabb-b81defe003af-kube-api-access-bz28s\") pod \"aodh-0\" (UID: \"12a105da-e1bf-4c7a-aabb-b81defe003af\") " pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.780432 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12a105da-e1bf-4c7a-aabb-b81defe003af-scripts\") pod \"aodh-0\" (UID: \"12a105da-e1bf-4c7a-aabb-b81defe003af\") " pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 
13:52:32.780465 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a105da-e1bf-4c7a-aabb-b81defe003af-internal-tls-certs\") pod \"aodh-0\" (UID: \"12a105da-e1bf-4c7a-aabb-b81defe003af\") " pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.780576 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a105da-e1bf-4c7a-aabb-b81defe003af-combined-ca-bundle\") pod \"aodh-0\" (UID: \"12a105da-e1bf-4c7a-aabb-b81defe003af\") " pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.780598 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a105da-e1bf-4c7a-aabb-b81defe003af-public-tls-certs\") pod \"aodh-0\" (UID: \"12a105da-e1bf-4c7a-aabb-b81defe003af\") " pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.780633 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a105da-e1bf-4c7a-aabb-b81defe003af-config-data\") pod \"aodh-0\" (UID: \"12a105da-e1bf-4c7a-aabb-b81defe003af\") " pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.784702 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/12a105da-e1bf-4c7a-aabb-b81defe003af-config-data\") pod \"aodh-0\" (UID: \"12a105da-e1bf-4c7a-aabb-b81defe003af\") " pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.784875 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a105da-e1bf-4c7a-aabb-b81defe003af-public-tls-certs\") pod \"aodh-0\" (UID: \"12a105da-e1bf-4c7a-aabb-b81defe003af\") " 
pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.784998 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/12a105da-e1bf-4c7a-aabb-b81defe003af-internal-tls-certs\") pod \"aodh-0\" (UID: \"12a105da-e1bf-4c7a-aabb-b81defe003af\") " pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.785758 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/12a105da-e1bf-4c7a-aabb-b81defe003af-combined-ca-bundle\") pod \"aodh-0\" (UID: \"12a105da-e1bf-4c7a-aabb-b81defe003af\") " pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.788779 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/12a105da-e1bf-4c7a-aabb-b81defe003af-scripts\") pod \"aodh-0\" (UID: \"12a105da-e1bf-4c7a-aabb-b81defe003af\") " pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.802606 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bz28s\" (UniqueName: \"kubernetes.io/projected/12a105da-e1bf-4c7a-aabb-b81defe003af-kube-api-access-bz28s\") pod \"aodh-0\" (UID: \"12a105da-e1bf-4c7a-aabb-b81defe003af\") " pod="openstack/aodh-0" Mar 20 13:52:32 crc kubenswrapper[4973]: I0320 13:52:32.958793 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Mar 20 13:52:33 crc kubenswrapper[4973]: I0320 13:52:33.501825 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Mar 20 13:52:33 crc kubenswrapper[4973]: I0320 13:52:33.966626 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54c36469-89aa-4fbc-bc88-22c3dc8b0bc6" path="/var/lib/kubelet/pods/54c36469-89aa-4fbc-bc88-22c3dc8b0bc6/volumes" Mar 20 13:52:34 crc kubenswrapper[4973]: I0320 13:52:34.490868 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"12a105da-e1bf-4c7a-aabb-b81defe003af","Type":"ContainerStarted","Data":"0e49221018fc1524d2a739bba9b15ec901e03b47d11ca1587f5f29401d0dcda8"} Mar 20 13:52:34 crc kubenswrapper[4973]: I0320 13:52:34.491318 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"12a105da-e1bf-4c7a-aabb-b81defe003af","Type":"ContainerStarted","Data":"98af99e8dcfb785372c512b5cfb8ace4ba716e81867c4ce158f9c531546e0ae5"} Mar 20 13:52:34 crc kubenswrapper[4973]: I0320 13:52:34.955804 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:52:34 crc kubenswrapper[4973]: E0320 13:52:34.957561 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 13:52:35 crc kubenswrapper[4973]: I0320 13:52:35.503330 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"12a105da-e1bf-4c7a-aabb-b81defe003af","Type":"ContainerStarted","Data":"80fda962cb5e239af433bf29a80aa362e201f3a205246388d33a69e453fef7f5"} Mar 20 
13:52:36 crc kubenswrapper[4973]: I0320 13:52:36.517876 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"12a105da-e1bf-4c7a-aabb-b81defe003af","Type":"ContainerStarted","Data":"c490408dee862ae68dcf2c334d864329220d9eb0ecf2130efa957e3efb9e788b"} Mar 20 13:52:38 crc kubenswrapper[4973]: I0320 13:52:38.546000 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"12a105da-e1bf-4c7a-aabb-b81defe003af","Type":"ContainerStarted","Data":"37f2556082859a448cb8dfffcdfefb2a680666bb03afa622957edf077d2cae63"} Mar 20 13:52:38 crc kubenswrapper[4973]: I0320 13:52:38.575018 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=2.628042686 podStartE2EDuration="6.575000335s" podCreationTimestamp="2026-03-20 13:52:32 +0000 UTC" firstStartedPulling="2026-03-20 13:52:33.524015314 +0000 UTC m=+1874.267685058" lastFinishedPulling="2026-03-20 13:52:37.470972963 +0000 UTC m=+1878.214642707" observedRunningTime="2026-03-20 13:52:38.57076239 +0000 UTC m=+1879.314432134" watchObservedRunningTime="2026-03-20 13:52:38.575000335 +0000 UTC m=+1879.318670079" Mar 20 13:52:47 crc kubenswrapper[4973]: I0320 13:52:47.951678 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:52:47 crc kubenswrapper[4973]: E0320 13:52:47.952605 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 13:52:57 crc kubenswrapper[4973]: I0320 13:52:57.278111 4973 scope.go:117] "RemoveContainer" 
containerID="27abd357ed00f4bfa2fb6b2df72ead92b99d556f8566162efdcac6b4a3e05553" Mar 20 13:52:57 crc kubenswrapper[4973]: I0320 13:52:57.356177 4973 scope.go:117] "RemoveContainer" containerID="131d15b9a05a258c4c680a989df302321e53c7be62c04536bb7336f954905cc5" Mar 20 13:52:57 crc kubenswrapper[4973]: I0320 13:52:57.403296 4973 scope.go:117] "RemoveContainer" containerID="8dc127458f2a429cd82307b7a3a2d526590507915e3dd783c3131ec53c85c887" Mar 20 13:52:57 crc kubenswrapper[4973]: I0320 13:52:57.443594 4973 scope.go:117] "RemoveContainer" containerID="d73daea7a5bcd0d9a89125476231ffd70c496368c325a05bfa846fdbdefd8217" Mar 20 13:52:57 crc kubenswrapper[4973]: I0320 13:52:57.491134 4973 scope.go:117] "RemoveContainer" containerID="81f67f519f04127b155b4be39fa9edd64bf13174df55ce6e45c34ac8abb8482b" Mar 20 13:52:58 crc kubenswrapper[4973]: I0320 13:52:58.951423 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:52:58 crc kubenswrapper[4973]: E0320 13:52:58.952010 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 13:53:04 crc kubenswrapper[4973]: I0320 13:53:04.859777 4973 generic.go:334] "Generic (PLEG): container finished" podID="22bad2d1-9031-42e8-882b-f3cebea8db32" containerID="788832a09866e1f2ef2c7ec4670131accb2e7b47eeafa2a1ffe7f04354aae695" exitCode=0 Mar 20 13:53:04 crc kubenswrapper[4973]: I0320 13:53:04.859853 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" 
event={"ID":"22bad2d1-9031-42e8-882b-f3cebea8db32","Type":"ContainerDied","Data":"788832a09866e1f2ef2c7ec4670131accb2e7b47eeafa2a1ffe7f04354aae695"} Mar 20 13:53:05 crc kubenswrapper[4973]: I0320 13:53:05.876548 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-1" event={"ID":"22bad2d1-9031-42e8-882b-f3cebea8db32","Type":"ContainerStarted","Data":"dc487dc7582eac4f6d57a51b368f062b7fcb8dcfa1d77d67b783f86fbd7daee8"} Mar 20 13:53:05 crc kubenswrapper[4973]: I0320 13:53:05.878578 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-1" Mar 20 13:53:05 crc kubenswrapper[4973]: I0320 13:53:05.922612 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-1" podStartSLOduration=36.922581567 podStartE2EDuration="36.922581567s" podCreationTimestamp="2026-03-20 13:52:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:53:05.914788844 +0000 UTC m=+1906.658458598" watchObservedRunningTime="2026-03-20 13:53:05.922581567 +0000 UTC m=+1906.666251311" Mar 20 13:53:09 crc kubenswrapper[4973]: I0320 13:53:09.961191 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:53:09 crc kubenswrapper[4973]: E0320 13:53:09.962145 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 13:53:19 crc kubenswrapper[4973]: I0320 13:53:19.917597 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-server-1" Mar 20 13:53:20 crc kubenswrapper[4973]: I0320 13:53:20.016675 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 20 13:53:24 crc kubenswrapper[4973]: I0320 13:53:24.234126 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="96de22e2-f61c-4f75-8faa-9a0591aa0f38" containerName="rabbitmq" containerID="cri-o://3a7f3d91e76a50e04bcf5c49224d40c07b200971965316fe7aecb18bfbe0b643" gracePeriod=604796 Mar 20 13:53:24 crc kubenswrapper[4973]: I0320 13:53:24.951712 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:53:25 crc kubenswrapper[4973]: I0320 13:53:25.455084 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="96de22e2-f61c-4f75-8faa-9a0591aa0f38" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.133:5671: connect: connection refused" Mar 20 13:53:26 crc kubenswrapper[4973]: I0320 13:53:26.117677 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"73be12b46987d19974f1cae15f097e9d05c0f53b3e5735ef8e8bb8b42d9ed186"} Mar 20 13:53:30 crc kubenswrapper[4973]: I0320 13:53:30.927296 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.026185 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-tls\") pod \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.026536 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96de22e2-f61c-4f75-8faa-9a0591aa0f38-erlang-cookie-secret\") pod \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.027714 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bc9ff\" (UniqueName: \"kubernetes.io/projected/96de22e2-f61c-4f75-8faa-9a0591aa0f38-kube-api-access-bc9ff\") pod \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.027876 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96de22e2-f61c-4f75-8faa-9a0591aa0f38-pod-info\") pod \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.028017 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-plugins\") pod \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.028146 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-erlang-cookie\") pod \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.028283 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96de22e2-f61c-4f75-8faa-9a0591aa0f38-server-conf\") pod \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.028553 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96de22e2-f61c-4f75-8faa-9a0591aa0f38-config-data\") pod \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.028656 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-confd\") pod \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.028725 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96de22e2-f61c-4f75-8faa-9a0591aa0f38-plugins-conf\") pod \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.029231 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\") pod \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\" (UID: \"96de22e2-f61c-4f75-8faa-9a0591aa0f38\") " Mar 20 
13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.032031 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "96de22e2-f61c-4f75-8faa-9a0591aa0f38" (UID: "96de22e2-f61c-4f75-8faa-9a0591aa0f38"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.033007 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96de22e2-f61c-4f75-8faa-9a0591aa0f38-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "96de22e2-f61c-4f75-8faa-9a0591aa0f38" (UID: "96de22e2-f61c-4f75-8faa-9a0591aa0f38"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.033510 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "96de22e2-f61c-4f75-8faa-9a0591aa0f38" (UID: "96de22e2-f61c-4f75-8faa-9a0591aa0f38"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.035100 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96de22e2-f61c-4f75-8faa-9a0591aa0f38-kube-api-access-bc9ff" (OuterVolumeSpecName: "kube-api-access-bc9ff") pod "96de22e2-f61c-4f75-8faa-9a0591aa0f38" (UID: "96de22e2-f61c-4f75-8faa-9a0591aa0f38"). InnerVolumeSpecName "kube-api-access-bc9ff". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.042229 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/96de22e2-f61c-4f75-8faa-9a0591aa0f38-pod-info" (OuterVolumeSpecName: "pod-info") pod "96de22e2-f61c-4f75-8faa-9a0591aa0f38" (UID: "96de22e2-f61c-4f75-8faa-9a0591aa0f38"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.044861 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "96de22e2-f61c-4f75-8faa-9a0591aa0f38" (UID: "96de22e2-f61c-4f75-8faa-9a0591aa0f38"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.053218 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96de22e2-f61c-4f75-8faa-9a0591aa0f38-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "96de22e2-f61c-4f75-8faa-9a0591aa0f38" (UID: "96de22e2-f61c-4f75-8faa-9a0591aa0f38"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.079842 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d" (OuterVolumeSpecName: "persistence") pod "96de22e2-f61c-4f75-8faa-9a0591aa0f38" (UID: "96de22e2-f61c-4f75-8faa-9a0591aa0f38"). InnerVolumeSpecName "pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.100724 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96de22e2-f61c-4f75-8faa-9a0591aa0f38-config-data" (OuterVolumeSpecName: "config-data") pod "96de22e2-f61c-4f75-8faa-9a0591aa0f38" (UID: "96de22e2-f61c-4f75-8faa-9a0591aa0f38"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.114107 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96de22e2-f61c-4f75-8faa-9a0591aa0f38-server-conf" (OuterVolumeSpecName: "server-conf") pod "96de22e2-f61c-4f75-8faa-9a0591aa0f38" (UID: "96de22e2-f61c-4f75-8faa-9a0591aa0f38"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.134610 4973 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/96de22e2-f61c-4f75-8faa-9a0591aa0f38-pod-info\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.134646 4973 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.134659 4973 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.134668 4973 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/96de22e2-f61c-4f75-8faa-9a0591aa0f38-server-conf\") on node \"crc\" DevicePath \"\"" Mar 20 
13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.134678 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/96de22e2-f61c-4f75-8faa-9a0591aa0f38-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.134686 4973 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/96de22e2-f61c-4f75-8faa-9a0591aa0f38-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.134712 4973 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\") on node \"crc\" " Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.134722 4973 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.134732 4973 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/96de22e2-f61c-4f75-8faa-9a0591aa0f38-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.134741 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bc9ff\" (UniqueName: \"kubernetes.io/projected/96de22e2-f61c-4f75-8faa-9a0591aa0f38-kube-api-access-bc9ff\") on node \"crc\" DevicePath \"\"" Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.183733 4973 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.183917 4973 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d") on node "crc"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.199029 4973 generic.go:334] "Generic (PLEG): container finished" podID="96de22e2-f61c-4f75-8faa-9a0591aa0f38" containerID="3a7f3d91e76a50e04bcf5c49224d40c07b200971965316fe7aecb18bfbe0b643" exitCode=0
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.199082 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"96de22e2-f61c-4f75-8faa-9a0591aa0f38","Type":"ContainerDied","Data":"3a7f3d91e76a50e04bcf5c49224d40c07b200971965316fe7aecb18bfbe0b643"}
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.199114 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"96de22e2-f61c-4f75-8faa-9a0591aa0f38","Type":"ContainerDied","Data":"14149fd1a8a3c68cdbd420943b49e4787cc4084fc7a74204535c3a549168e610"}
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.199131 4973 scope.go:117] "RemoveContainer" containerID="3a7f3d91e76a50e04bcf5c49224d40c07b200971965316fe7aecb18bfbe0b643"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.199126 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.209042 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "96de22e2-f61c-4f75-8faa-9a0591aa0f38" (UID: "96de22e2-f61c-4f75-8faa-9a0591aa0f38"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.229305 4973 scope.go:117] "RemoveContainer" containerID="2f909be8e4577c3906dd8fe036f0cec95e6cfd7584a8f17b0b28bb39aa6d6a30"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.238317 4973 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/96de22e2-f61c-4f75-8faa-9a0591aa0f38-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.238366 4973 reconciler_common.go:293] "Volume detached for volume \"pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\") on node \"crc\" DevicePath \"\""
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.269756 4973 scope.go:117] "RemoveContainer" containerID="3a7f3d91e76a50e04bcf5c49224d40c07b200971965316fe7aecb18bfbe0b643"
Mar 20 13:53:31 crc kubenswrapper[4973]: E0320 13:53:31.270099 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a7f3d91e76a50e04bcf5c49224d40c07b200971965316fe7aecb18bfbe0b643\": container with ID starting with 3a7f3d91e76a50e04bcf5c49224d40c07b200971965316fe7aecb18bfbe0b643 not found: ID does not exist" containerID="3a7f3d91e76a50e04bcf5c49224d40c07b200971965316fe7aecb18bfbe0b643"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.270135 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a7f3d91e76a50e04bcf5c49224d40c07b200971965316fe7aecb18bfbe0b643"} err="failed to get container status \"3a7f3d91e76a50e04bcf5c49224d40c07b200971965316fe7aecb18bfbe0b643\": rpc error: code = NotFound desc = could not find container \"3a7f3d91e76a50e04bcf5c49224d40c07b200971965316fe7aecb18bfbe0b643\": container with ID starting with 3a7f3d91e76a50e04bcf5c49224d40c07b200971965316fe7aecb18bfbe0b643 not found: ID does not exist"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.270164 4973 scope.go:117] "RemoveContainer" containerID="2f909be8e4577c3906dd8fe036f0cec95e6cfd7584a8f17b0b28bb39aa6d6a30"
Mar 20 13:53:31 crc kubenswrapper[4973]: E0320 13:53:31.271641 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f909be8e4577c3906dd8fe036f0cec95e6cfd7584a8f17b0b28bb39aa6d6a30\": container with ID starting with 2f909be8e4577c3906dd8fe036f0cec95e6cfd7584a8f17b0b28bb39aa6d6a30 not found: ID does not exist" containerID="2f909be8e4577c3906dd8fe036f0cec95e6cfd7584a8f17b0b28bb39aa6d6a30"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.271668 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f909be8e4577c3906dd8fe036f0cec95e6cfd7584a8f17b0b28bb39aa6d6a30"} err="failed to get container status \"2f909be8e4577c3906dd8fe036f0cec95e6cfd7584a8f17b0b28bb39aa6d6a30\": rpc error: code = NotFound desc = could not find container \"2f909be8e4577c3906dd8fe036f0cec95e6cfd7584a8f17b0b28bb39aa6d6a30\": container with ID starting with 2f909be8e4577c3906dd8fe036f0cec95e6cfd7584a8f17b0b28bb39aa6d6a30 not found: ID does not exist"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.564732 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.578788 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.609321 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 13:53:31 crc kubenswrapper[4973]: E0320 13:53:31.610018 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96de22e2-f61c-4f75-8faa-9a0591aa0f38" containerName="rabbitmq"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.610045 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="96de22e2-f61c-4f75-8faa-9a0591aa0f38" containerName="rabbitmq"
Mar 20 13:53:31 crc kubenswrapper[4973]: E0320 13:53:31.610061 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96de22e2-f61c-4f75-8faa-9a0591aa0f38" containerName="setup-container"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.610068 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="96de22e2-f61c-4f75-8faa-9a0591aa0f38" containerName="setup-container"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.610390 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="96de22e2-f61c-4f75-8faa-9a0591aa0f38" containerName="rabbitmq"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.611834 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.624398 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.749009 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/10689a54-14e6-456c-b710-e7c24c71016d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.749059 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/10689a54-14e6-456c-b710-e7c24c71016d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.749093 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzdrq\" (UniqueName: \"kubernetes.io/projected/10689a54-14e6-456c-b710-e7c24c71016d-kube-api-access-pzdrq\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.749123 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/10689a54-14e6-456c-b710-e7c24c71016d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.749173 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.749211 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/10689a54-14e6-456c-b710-e7c24c71016d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.749242 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/10689a54-14e6-456c-b710-e7c24c71016d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.749276 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/10689a54-14e6-456c-b710-e7c24c71016d-config-data\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.749314 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/10689a54-14e6-456c-b710-e7c24c71016d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.749331 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/10689a54-14e6-456c-b710-e7c24c71016d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.749376 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/10689a54-14e6-456c-b710-e7c24c71016d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.851552 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/10689a54-14e6-456c-b710-e7c24c71016d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.851623 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/10689a54-14e6-456c-b710-e7c24c71016d-config-data\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.851671 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/10689a54-14e6-456c-b710-e7c24c71016d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.851691 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/10689a54-14e6-456c-b710-e7c24c71016d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.851718 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/10689a54-14e6-456c-b710-e7c24c71016d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.851829 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/10689a54-14e6-456c-b710-e7c24c71016d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.851848 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/10689a54-14e6-456c-b710-e7c24c71016d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.851870 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzdrq\" (UniqueName: \"kubernetes.io/projected/10689a54-14e6-456c-b710-e7c24c71016d-kube-api-access-pzdrq\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.851889 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/10689a54-14e6-456c-b710-e7c24c71016d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.851930 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.851962 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/10689a54-14e6-456c-b710-e7c24c71016d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.852605 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/10689a54-14e6-456c-b710-e7c24c71016d-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.852639 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/10689a54-14e6-456c-b710-e7c24c71016d-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.852815 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/10689a54-14e6-456c-b710-e7c24c71016d-config-data\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.853366 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/10689a54-14e6-456c-b710-e7c24c71016d-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.855029 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/10689a54-14e6-456c-b710-e7c24c71016d-server-conf\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.857504 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/10689a54-14e6-456c-b710-e7c24c71016d-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.857674 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/10689a54-14e6-456c-b710-e7c24c71016d-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.857956 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/10689a54-14e6-456c-b710-e7c24c71016d-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.859078 4973 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.859191 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/65cf3f7a1b4243e3581d40e35123a2951f3bcb7180b938dca1f1d05b08317567/globalmount\"" pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.860198 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/10689a54-14e6-456c-b710-e7c24c71016d-pod-info\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.871620 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzdrq\" (UniqueName: \"kubernetes.io/projected/10689a54-14e6-456c-b710-e7c24c71016d-kube-api-access-pzdrq\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.922144 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7bd4023-c28d-4f2d-85ff-5fa21c10825d\") pod \"rabbitmq-server-0\" (UID: \"10689a54-14e6-456c-b710-e7c24c71016d\") " pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.946925 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Mar 20 13:53:31 crc kubenswrapper[4973]: I0320 13:53:31.967918 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96de22e2-f61c-4f75-8faa-9a0591aa0f38" path="/var/lib/kubelet/pods/96de22e2-f61c-4f75-8faa-9a0591aa0f38/volumes"
Mar 20 13:53:32 crc kubenswrapper[4973]: I0320 13:53:32.490290 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 20 13:53:33 crc kubenswrapper[4973]: I0320 13:53:33.228085 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"10689a54-14e6-456c-b710-e7c24c71016d","Type":"ContainerStarted","Data":"0cfd5aac2640ea10ecf80bc9a8c8de8cd4aa41b4a9f6187c0240ef773096961a"}
Mar 20 13:53:35 crc kubenswrapper[4973]: I0320 13:53:35.249409 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"10689a54-14e6-456c-b710-e7c24c71016d","Type":"ContainerStarted","Data":"07acb22936c8629a73ee1b0704a346383e94bf95a8d408fb07e163b14c33259f"}
Mar 20 13:54:00 crc kubenswrapper[4973]: I0320 13:54:00.144029 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566914-ljvvd"]
Mar 20 13:54:00 crc kubenswrapper[4973]: I0320 13:54:00.146228 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-ljvvd"
Mar 20 13:54:00 crc kubenswrapper[4973]: I0320 13:54:00.148237 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc"
Mar 20 13:54:00 crc kubenswrapper[4973]: I0320 13:54:00.148471 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 13:54:00 crc kubenswrapper[4973]: I0320 13:54:00.148706 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 13:54:00 crc kubenswrapper[4973]: I0320 13:54:00.160544 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-ljvvd"]
Mar 20 13:54:00 crc kubenswrapper[4973]: I0320 13:54:00.271904 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk752\" (UniqueName: \"kubernetes.io/projected/517222fd-ce22-4859-abde-4d45185cd4b0-kube-api-access-jk752\") pod \"auto-csr-approver-29566914-ljvvd\" (UID: \"517222fd-ce22-4859-abde-4d45185cd4b0\") " pod="openshift-infra/auto-csr-approver-29566914-ljvvd"
Mar 20 13:54:00 crc kubenswrapper[4973]: I0320 13:54:00.374180 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk752\" (UniqueName: \"kubernetes.io/projected/517222fd-ce22-4859-abde-4d45185cd4b0-kube-api-access-jk752\") pod \"auto-csr-approver-29566914-ljvvd\" (UID: \"517222fd-ce22-4859-abde-4d45185cd4b0\") " pod="openshift-infra/auto-csr-approver-29566914-ljvvd"
Mar 20 13:54:00 crc kubenswrapper[4973]: I0320 13:54:00.396788 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk752\" (UniqueName: \"kubernetes.io/projected/517222fd-ce22-4859-abde-4d45185cd4b0-kube-api-access-jk752\") pod \"auto-csr-approver-29566914-ljvvd\" (UID: \"517222fd-ce22-4859-abde-4d45185cd4b0\") " pod="openshift-infra/auto-csr-approver-29566914-ljvvd"
Mar 20 13:54:00 crc kubenswrapper[4973]: I0320 13:54:00.517601 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-ljvvd"
Mar 20 13:54:01 crc kubenswrapper[4973]: I0320 13:54:01.068074 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-ljvvd"]
Mar 20 13:54:01 crc kubenswrapper[4973]: I0320 13:54:01.527843 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566914-ljvvd" event={"ID":"517222fd-ce22-4859-abde-4d45185cd4b0","Type":"ContainerStarted","Data":"ccc3c8d00466f512199e1b1535dc21edceb731760f5f40eda75225a17e4a78cb"}
Mar 20 13:54:02 crc kubenswrapper[4973]: I0320 13:54:02.543259 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566914-ljvvd" event={"ID":"517222fd-ce22-4859-abde-4d45185cd4b0","Type":"ContainerStarted","Data":"0d23c3baf1062e9d49890ec6be219164ac05264162d0919a1284f9aaa63d4fc8"}
Mar 20 13:54:02 crc kubenswrapper[4973]: I0320 13:54:02.560423 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566914-ljvvd" podStartSLOduration=1.6461523900000001 podStartE2EDuration="2.560403197s" podCreationTimestamp="2026-03-20 13:54:00 +0000 UTC" firstStartedPulling="2026-03-20 13:54:01.070149645 +0000 UTC m=+1961.813819389" lastFinishedPulling="2026-03-20 13:54:01.984400452 +0000 UTC m=+1962.728070196" observedRunningTime="2026-03-20 13:54:02.558982369 +0000 UTC m=+1963.302652113" watchObservedRunningTime="2026-03-20 13:54:02.560403197 +0000 UTC m=+1963.304072941"
Mar 20 13:54:03 crc kubenswrapper[4973]: I0320 13:54:03.558986 4973 generic.go:334] "Generic (PLEG): container finished" podID="517222fd-ce22-4859-abde-4d45185cd4b0" containerID="0d23c3baf1062e9d49890ec6be219164ac05264162d0919a1284f9aaa63d4fc8" exitCode=0
Mar 20 13:54:03 crc kubenswrapper[4973]: I0320 13:54:03.559072 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566914-ljvvd" event={"ID":"517222fd-ce22-4859-abde-4d45185cd4b0","Type":"ContainerDied","Data":"0d23c3baf1062e9d49890ec6be219164ac05264162d0919a1284f9aaa63d4fc8"}
Mar 20 13:54:05 crc kubenswrapper[4973]: I0320 13:54:05.061648 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-ljvvd"
Mar 20 13:54:05 crc kubenswrapper[4973]: I0320 13:54:05.107372 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jk752\" (UniqueName: \"kubernetes.io/projected/517222fd-ce22-4859-abde-4d45185cd4b0-kube-api-access-jk752\") pod \"517222fd-ce22-4859-abde-4d45185cd4b0\" (UID: \"517222fd-ce22-4859-abde-4d45185cd4b0\") "
Mar 20 13:54:05 crc kubenswrapper[4973]: I0320 13:54:05.144992 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/517222fd-ce22-4859-abde-4d45185cd4b0-kube-api-access-jk752" (OuterVolumeSpecName: "kube-api-access-jk752") pod "517222fd-ce22-4859-abde-4d45185cd4b0" (UID: "517222fd-ce22-4859-abde-4d45185cd4b0"). InnerVolumeSpecName "kube-api-access-jk752". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:54:05 crc kubenswrapper[4973]: I0320 13:54:05.211298 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jk752\" (UniqueName: \"kubernetes.io/projected/517222fd-ce22-4859-abde-4d45185cd4b0-kube-api-access-jk752\") on node \"crc\" DevicePath \"\""
Mar 20 13:54:05 crc kubenswrapper[4973]: I0320 13:54:05.581102 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566914-ljvvd" event={"ID":"517222fd-ce22-4859-abde-4d45185cd4b0","Type":"ContainerDied","Data":"ccc3c8d00466f512199e1b1535dc21edceb731760f5f40eda75225a17e4a78cb"}
Mar 20 13:54:05 crc kubenswrapper[4973]: I0320 13:54:05.581431 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccc3c8d00466f512199e1b1535dc21edceb731760f5f40eda75225a17e4a78cb"
Mar 20 13:54:05 crc kubenswrapper[4973]: I0320 13:54:05.581230 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566914-ljvvd"
Mar 20 13:54:05 crc kubenswrapper[4973]: I0320 13:54:05.639074 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-9w8fc"]
Mar 20 13:54:05 crc kubenswrapper[4973]: I0320 13:54:05.650793 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566908-9w8fc"]
Mar 20 13:54:05 crc kubenswrapper[4973]: I0320 13:54:05.963848 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7ed9fca6-1a1b-4489-9ca6-dab93e0615f3" path="/var/lib/kubelet/pods/7ed9fca6-1a1b-4489-9ca6-dab93e0615f3/volumes"
Mar 20 13:54:06 crc kubenswrapper[4973]: I0320 13:54:06.597733 4973 generic.go:334] "Generic (PLEG): container finished" podID="10689a54-14e6-456c-b710-e7c24c71016d" containerID="07acb22936c8629a73ee1b0704a346383e94bf95a8d408fb07e163b14c33259f" exitCode=0
Mar 20 13:54:06 crc kubenswrapper[4973]: I0320 13:54:06.597807 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"10689a54-14e6-456c-b710-e7c24c71016d","Type":"ContainerDied","Data":"07acb22936c8629a73ee1b0704a346383e94bf95a8d408fb07e163b14c33259f"}
Mar 20 13:54:07 crc kubenswrapper[4973]: I0320 13:54:07.619515 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"10689a54-14e6-456c-b710-e7c24c71016d","Type":"ContainerStarted","Data":"7fa903dab74b34db12b05781b2fc8ca42f4104b58262b57c358d2f1e620a5474"}
Mar 20 13:54:07 crc kubenswrapper[4973]: I0320 13:54:07.620095 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Mar 20 13:54:07 crc kubenswrapper[4973]: I0320 13:54:07.651514 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.651496743 podStartE2EDuration="36.651496743s" podCreationTimestamp="2026-03-20 13:53:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 13:54:07.642127817 +0000 UTC m=+1968.385797561" watchObservedRunningTime="2026-03-20 13:54:07.651496743 +0000 UTC m=+1968.395166477"
Mar 20 13:54:19 crc kubenswrapper[4973]: I0320 13:54:19.033415 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-bkrl2"]
Mar 20 13:54:19 crc kubenswrapper[4973]: I0320 13:54:19.052737 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-db-create-bkrl2"]
Mar 20 13:54:19 crc kubenswrapper[4973]: I0320 13:54:19.971132 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8" path="/var/lib/kubelet/pods/9d6a91ae-acc9-494b-9fac-1c2d05fdc3c8/volumes"
Mar 20 13:54:20 crc kubenswrapper[4973]: I0320 13:54:20.067721 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-0ba4-account-create-update-lcrs9"]
Mar 20 13:54:20 crc kubenswrapper[4973]: I0320 13:54:20.134422 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-xbx75"]
Mar 20 13:54:20 crc kubenswrapper[4973]: I0320 13:54:20.219413 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-0ba4-account-create-update-lcrs9"]
Mar 20 13:54:20 crc kubenswrapper[4973]: I0320 13:54:20.234404 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-68e0-account-create-update-8dkxv"]
Mar 20 13:54:20 crc kubenswrapper[4973]: I0320 13:54:20.248024 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-wvx2d"]
Mar 20 13:54:20 crc kubenswrapper[4973]: I0320 13:54:20.264838 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-68e0-account-create-update-8dkxv"]
Mar 20 13:54:20 crc kubenswrapper[4973]: I0320 13:54:20.277943 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-ee19-account-create-update-jjnq9"]
Mar 20 13:54:20 crc kubenswrapper[4973]: I0320 13:54:20.298823 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-xbx75"]
Mar 20 13:54:20 crc kubenswrapper[4973]: I0320 13:54:20.314410 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-ee19-account-create-update-jjnq9"]
Mar 20 13:54:20 crc kubenswrapper[4973]: I0320 13:54:20.323887 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-wvx2d"]
Mar 20 13:54:21 crc kubenswrapper[4973]: I0320 13:54:21.963492 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1df509cd-ad47-4fa5-86bf-c2dc341259b7" path="/var/lib/kubelet/pods/1df509cd-ad47-4fa5-86bf-c2dc341259b7/volumes"
Mar 20 13:54:21 crc kubenswrapper[4973]: I0320 13:54:21.964120 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33bc7d77-fed7-4177-83a1-6eff4b55ee6d" path="/var/lib/kubelet/pods/33bc7d77-fed7-4177-83a1-6eff4b55ee6d/volumes"
Mar 20 13:54:21 crc kubenswrapper[4973]: I0320 13:54:21.965498 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="707ca787-d762-44eb-932e-0130c091ae6a" path="/var/lib/kubelet/pods/707ca787-d762-44eb-932e-0130c091ae6a/volumes"
Mar 20 13:54:21 crc kubenswrapper[4973]: I0320 13:54:21.966839 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6efe097-248d-4d68-b2e0-172c2d005a17" path="/var/lib/kubelet/pods/a6efe097-248d-4d68-b2e0-172c2d005a17/volumes"
Mar 20 13:54:21 crc kubenswrapper[4973]: I0320 13:54:21.967982 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e554d8c1-24d9-4dff-8495-2fb41208cdad" path="/var/lib/kubelet/pods/e554d8c1-24d9-4dff-8495-2fb41208cdad/volumes"
Mar 20 13:54:21 crc kubenswrapper[4973]: I0320 13:54:21.968727 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 20 13:54:24 crc kubenswrapper[4973]: I0320 13:54:24.031805 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-6399-account-create-update-tkjxp"]
Mar 20 13:54:24 crc kubenswrapper[4973]: I0320 13:54:24.048851 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-v59x5"]
Mar 20 13:54:24 crc kubenswrapper[4973]: I0320 13:54:24.063264 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-6399-account-create-update-tkjxp"]
Mar 20 13:54:24 crc kubenswrapper[4973]: I0320 13:54:24.079988 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-v59x5"]
Mar 20 13:54:25 crc kubenswrapper[4973]: I0320 13:54:25.966921 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d64a6d2-5f83-480e-a594-7e633e0e0586" path="/var/lib/kubelet/pods/3d64a6d2-5f83-480e-a594-7e633e0e0586/volumes"
Mar 20 13:54:25 crc kubenswrapper[4973]: I0320 13:54:25.967949 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99298e82-7167-49cb-ae27-e107a53c57d8" path="/var/lib/kubelet/pods/99298e82-7167-49cb-ae27-e107a53c57d8/volumes"
Mar 20 13:54:28 crc kubenswrapper[4973]: I0320 13:54:28.047257 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-19e0-account-create-update-rkd5n"]
Mar 20 13:54:28 crc kubenswrapper[4973]: I0320 13:54:28.060500 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf"]
Mar 20 13:54:28 crc kubenswrapper[4973]: I0320 13:54:28.072272 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-19e0-account-create-update-rkd5n"]
Mar 20 13:54:28 crc kubenswrapper[4973]: I0320 13:54:28.084494 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mysqld-exporter-openstack-cell1-db-create-7vzxf"]
Mar 20 13:54:29 crc kubenswrapper[4973]: I0320 13:54:29.966767 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="88d9ea96-b45d-44d9-9aa0-60d2668a9a0c" path="/var/lib/kubelet/pods/88d9ea96-b45d-44d9-9aa0-60d2668a9a0c/volumes"
Mar 20 13:54:29 crc kubenswrapper[4973]: I0320 13:54:29.967780 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4ce1c5d-178b-4adc-b4f5-7c354e2914d0" path="/var/lib/kubelet/pods/d4ce1c5d-178b-4adc-b4f5-7c354e2914d0/volumes"
Mar 20 13:54:52 crc kubenswrapper[4973]: I0320 13:54:52.028846 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-bhvvf"]
Mar 20 13:54:52 crc kubenswrapper[4973]: I0320 13:54:52.040828 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-bhvvf"]
Mar 20 13:54:53 crc kubenswrapper[4973]: I0320 13:54:53.964378 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1286f2f0-7c61-4ba2-b2a6-3ec09e602af9" path="/var/lib/kubelet/pods/1286f2f0-7c61-4ba2-b2a6-3ec09e602af9/volumes"
Mar 20 13:54:56 crc kubenswrapper[4973]: I0320 13:54:56.052083 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-c2db-account-create-update-4m67k"]
Mar 20 13:54:56 crc kubenswrapper[4973]: I0320 13:54:56.063908 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-0791-account-create-update-rz2hx"]
Mar 20 13:54:56 crc kubenswrapper[4973]: I0320 13:54:56.075137 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-1972-account-create-update-l6mqj"]
Mar 20 13:54:56 crc kubenswrapper[4973]: I0320 13:54:56.090457 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-8fxhw"]
Mar 20 13:54:56 crc kubenswrapper[4973]: I0320 13:54:56.098216 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-a222-account-create-update-bsrjg"]
Mar 20 13:54:56 crc kubenswrapper[4973]: I0320 13:54:56.138669 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-rwnmf"]
Mar 20 13:54:56 crc kubenswrapper[4973]: I0320 13:54:56.151843 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-wdm6w"]
Mar 20 13:54:56 crc kubenswrapper[4973]: I0320 13:54:56.162692 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-a222-account-create-update-bsrjg"]
Mar 20 13:54:56 crc kubenswrapper[4973]: I0320 13:54:56.176245 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-8fxhw"]
Mar 20 13:54:56 crc kubenswrapper[4973]: I0320 13:54:56.188771 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-0791-account-create-update-rz2hx"]
Mar 20 13:54:56 crc kubenswrapper[4973]: I0320 13:54:56.200776 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-c2db-account-create-update-4m67k"]
Mar 20 13:54:56 crc kubenswrapper[4973]: I0320 13:54:56.211282 4973
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-1972-account-create-update-l6mqj"] Mar 20 13:54:56 crc kubenswrapper[4973]: I0320 13:54:56.221685 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-rwnmf"] Mar 20 13:54:56 crc kubenswrapper[4973]: I0320 13:54:56.233144 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-wdm6w"] Mar 20 13:54:56 crc kubenswrapper[4973]: I0320 13:54:56.247904 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-2k8xz"] Mar 20 13:54:56 crc kubenswrapper[4973]: I0320 13:54:56.258032 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-2k8xz"] Mar 20 13:54:57 crc kubenswrapper[4973]: I0320 13:54:57.718122 4973 scope.go:117] "RemoveContainer" containerID="7d3afabbfc3c755ab5cba1e458d92066c4cc6d813750ceaa81aa3300c1767fa0" Mar 20 13:54:57 crc kubenswrapper[4973]: I0320 13:54:57.758541 4973 scope.go:117] "RemoveContainer" containerID="676970f69fb018b5b2472ce7bc06064b94c37b9a6784143cf9910a3bc985da11" Mar 20 13:54:57 crc kubenswrapper[4973]: I0320 13:54:57.840632 4973 scope.go:117] "RemoveContainer" containerID="c1f1c397ae71d75e71535afe1687540c2e9c8ae5090a077a9a71530258b81724" Mar 20 13:54:57 crc kubenswrapper[4973]: I0320 13:54:57.891130 4973 scope.go:117] "RemoveContainer" containerID="ca1fb82cf3cd4409ba21b65b629fc3bd47925be0d5a2d279d022c6b8dd8b8421" Mar 20 13:54:57 crc kubenswrapper[4973]: I0320 13:54:57.966148 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="024e6737-2a16-43e9-99f4-62d2b39df77b" path="/var/lib/kubelet/pods/024e6737-2a16-43e9-99f4-62d2b39df77b/volumes" Mar 20 13:54:57 crc kubenswrapper[4973]: I0320 13:54:57.967213 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0850cf2e-8755-4161-97ea-a1507ef8f2fb" path="/var/lib/kubelet/pods/0850cf2e-8755-4161-97ea-a1507ef8f2fb/volumes" Mar 20 13:54:57 crc 
kubenswrapper[4973]: I0320 13:54:57.967961 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bb79520-00fa-4626-9fef-c7250fccd210" path="/var/lib/kubelet/pods/0bb79520-00fa-4626-9fef-c7250fccd210/volumes" Mar 20 13:54:57 crc kubenswrapper[4973]: I0320 13:54:57.968696 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d0fe71e-2a81-41d4-86d6-dd4345597c6e" path="/var/lib/kubelet/pods/2d0fe71e-2a81-41d4-86d6-dd4345597c6e/volumes" Mar 20 13:54:57 crc kubenswrapper[4973]: I0320 13:54:57.969978 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a14f07ce-f8ea-4af0-bf01-9ad6b1af917a" path="/var/lib/kubelet/pods/a14f07ce-f8ea-4af0-bf01-9ad6b1af917a/volumes" Mar 20 13:54:57 crc kubenswrapper[4973]: I0320 13:54:57.970802 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a26bab7b-118f-46a6-8809-8f7b8b7a83d4" path="/var/lib/kubelet/pods/a26bab7b-118f-46a6-8809-8f7b8b7a83d4/volumes" Mar 20 13:54:57 crc kubenswrapper[4973]: I0320 13:54:57.971418 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1f86541-a1d0-4a26-80c2-be736fa62eb9" path="/var/lib/kubelet/pods/b1f86541-a1d0-4a26-80c2-be736fa62eb9/volumes" Mar 20 13:54:57 crc kubenswrapper[4973]: I0320 13:54:57.972672 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be6b9024-516f-484d-94ff-391bf79246bf" path="/var/lib/kubelet/pods/be6b9024-516f-484d-94ff-391bf79246bf/volumes" Mar 20 13:54:57 crc kubenswrapper[4973]: I0320 13:54:57.972880 4973 scope.go:117] "RemoveContainer" containerID="1986de820988ad7690038fa641c33b650c37a027712bbbf0656e4aaecddf6afc" Mar 20 13:54:58 crc kubenswrapper[4973]: I0320 13:54:58.011862 4973 scope.go:117] "RemoveContainer" containerID="d244307f50c47c1f04e7c7e3897f5842a02a6df9aa1aef658d0d450bafe87e5f" Mar 20 13:54:58 crc kubenswrapper[4973]: I0320 13:54:58.051662 4973 scope.go:117] "RemoveContainer" 
containerID="8c42b9802a8cc2c4fc62f3ddda1fc8da9a9e7b8a0b4dcf1e86c5ec99f37e570e" Mar 20 13:54:58 crc kubenswrapper[4973]: I0320 13:54:58.112870 4973 scope.go:117] "RemoveContainer" containerID="ce47bf39326f73808f09bc041516cf0ea5cdc99e2bf914c2642e2a3f56df6cd1" Mar 20 13:54:58 crc kubenswrapper[4973]: I0320 13:54:58.137964 4973 scope.go:117] "RemoveContainer" containerID="815cf8addeea77972c3c230d8a330e18ff0fb4a62a7f968fe769530bf6effc27" Mar 20 13:54:58 crc kubenswrapper[4973]: I0320 13:54:58.180597 4973 scope.go:117] "RemoveContainer" containerID="121d3840114c7ee194ce3cdda0b1be9791fbf779b319e5e7474aa6bab50bdb3a" Mar 20 13:54:58 crc kubenswrapper[4973]: I0320 13:54:58.203183 4973 scope.go:117] "RemoveContainer" containerID="32ff66beb5df2f942bd61fe0aae352bad22a5e8ca6e4333bccc86455dcb1e33e" Mar 20 13:54:58 crc kubenswrapper[4973]: I0320 13:54:58.236836 4973 scope.go:117] "RemoveContainer" containerID="891e47aad9dc99c16ff5048519990252c678f6ef510adb6cc13cbbca6ed4996e" Mar 20 13:54:58 crc kubenswrapper[4973]: I0320 13:54:58.263640 4973 scope.go:117] "RemoveContainer" containerID="e271186527aecc254c0001e58f8960ac5a957176ab7d2af16516e252d235f756" Mar 20 13:54:58 crc kubenswrapper[4973]: I0320 13:54:58.296861 4973 scope.go:117] "RemoveContainer" containerID="d7e7456f83f9f22b10988cfe4a6c2c043104d4c1a7ce31cb7f90947f74e32844" Mar 20 13:54:58 crc kubenswrapper[4973]: I0320 13:54:58.318046 4973 scope.go:117] "RemoveContainer" containerID="b166582adfa64767b2638eb7d18411fbcc0b9494239d9daa23730365c03f52ca" Mar 20 13:54:58 crc kubenswrapper[4973]: I0320 13:54:58.340264 4973 scope.go:117] "RemoveContainer" containerID="0d0f4c2ce0083e20437758948379ee9d68025cdd2ef9f47004cd9391cf43fca9" Mar 20 13:54:58 crc kubenswrapper[4973]: I0320 13:54:58.364365 4973 scope.go:117] "RemoveContainer" containerID="7b12a6157627efa83996ef02ace60c25a9a0e0cabd3a6894d5725e76994f7494" Mar 20 13:54:58 crc kubenswrapper[4973]: I0320 13:54:58.387718 4973 scope.go:117] "RemoveContainer" 
containerID="d2e5fe46f3780bfdf625f4f386b47b8f5ae67dd453e25cdd4cb37a9b57cf00b4" Mar 20 13:54:58 crc kubenswrapper[4973]: I0320 13:54:58.420891 4973 scope.go:117] "RemoveContainer" containerID="efd32cc0610733774e53115cb30af510709979d519ff1929472aa92bbe0ffb3d" Mar 20 13:54:58 crc kubenswrapper[4973]: I0320 13:54:58.443669 4973 scope.go:117] "RemoveContainer" containerID="eb741610b3702bfcf6600e5e5bb46025d3e9cdefd0ac87dee15660383f37d5de" Mar 20 13:54:58 crc kubenswrapper[4973]: I0320 13:54:58.472435 4973 scope.go:117] "RemoveContainer" containerID="4d5149ed2679f8e455a19d14a3198c2dac8f0a9b826e28719d86313f43dfb172" Mar 20 13:54:58 crc kubenswrapper[4973]: I0320 13:54:58.501932 4973 scope.go:117] "RemoveContainer" containerID="f997ceb54f445e211dc7001f033ec32f59125665ce183ac0f2b537251562b27e" Mar 20 13:55:02 crc kubenswrapper[4973]: I0320 13:55:02.028896 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-mfb4w"] Mar 20 13:55:02 crc kubenswrapper[4973]: I0320 13:55:02.040850 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-mfb4w"] Mar 20 13:55:03 crc kubenswrapper[4973]: I0320 13:55:03.969424 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="716148fc-095e-4811-8e79-53bf3e2cd53b" path="/var/lib/kubelet/pods/716148fc-095e-4811-8e79-53bf3e2cd53b/volumes" Mar 20 13:55:07 crc kubenswrapper[4973]: I0320 13:55:07.033917 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-n2wvb"] Mar 20 13:55:07 crc kubenswrapper[4973]: I0320 13:55:07.044468 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-n2wvb"] Mar 20 13:55:07 crc kubenswrapper[4973]: I0320 13:55:07.969377 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f86183c-8d71-4d52-860b-9579ba761393" path="/var/lib/kubelet/pods/0f86183c-8d71-4d52-860b-9579ba761393/volumes" Mar 20 13:55:23 crc kubenswrapper[4973]: E0320 13:55:23.156948 4973 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dfa03f4_98c0_4122_8fbb_abeba13439f0.slice/crio-conmon-867206f0de03d87710aaf1e40085adb675e039dc7e9c84bcca844e6fbde582d4.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dfa03f4_98c0_4122_8fbb_abeba13439f0.slice/crio-867206f0de03d87710aaf1e40085adb675e039dc7e9c84bcca844e6fbde582d4.scope\": RecentStats: unable to find data in memory cache]" Mar 20 13:55:23 crc kubenswrapper[4973]: I0320 13:55:23.570690 4973 generic.go:334] "Generic (PLEG): container finished" podID="8dfa03f4-98c0-4122-8fbb-abeba13439f0" containerID="867206f0de03d87710aaf1e40085adb675e039dc7e9c84bcca844e6fbde582d4" exitCode=0 Mar 20 13:55:23 crc kubenswrapper[4973]: I0320 13:55:23.570737 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" event={"ID":"8dfa03f4-98c0-4122-8fbb-abeba13439f0","Type":"ContainerDied","Data":"867206f0de03d87710aaf1e40085adb675e039dc7e9c84bcca844e6fbde582d4"} Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.054922 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.194301 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8dfa03f4-98c0-4122-8fbb-abeba13439f0-ssh-key-openstack-edpm-ipam\") pod \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\" (UID: \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\") " Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.194847 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dfa03f4-98c0-4122-8fbb-abeba13439f0-inventory\") pod \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\" (UID: \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\") " Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.194877 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmlvd\" (UniqueName: \"kubernetes.io/projected/8dfa03f4-98c0-4122-8fbb-abeba13439f0-kube-api-access-nmlvd\") pod \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\" (UID: \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\") " Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.194915 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfa03f4-98c0-4122-8fbb-abeba13439f0-bootstrap-combined-ca-bundle\") pod \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\" (UID: \"8dfa03f4-98c0-4122-8fbb-abeba13439f0\") " Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.200422 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dfa03f4-98c0-4122-8fbb-abeba13439f0-kube-api-access-nmlvd" (OuterVolumeSpecName: "kube-api-access-nmlvd") pod "8dfa03f4-98c0-4122-8fbb-abeba13439f0" (UID: "8dfa03f4-98c0-4122-8fbb-abeba13439f0"). InnerVolumeSpecName "kube-api-access-nmlvd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.202662 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dfa03f4-98c0-4122-8fbb-abeba13439f0-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "8dfa03f4-98c0-4122-8fbb-abeba13439f0" (UID: "8dfa03f4-98c0-4122-8fbb-abeba13439f0"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.229565 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dfa03f4-98c0-4122-8fbb-abeba13439f0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8dfa03f4-98c0-4122-8fbb-abeba13439f0" (UID: "8dfa03f4-98c0-4122-8fbb-abeba13439f0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.266103 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dfa03f4-98c0-4122-8fbb-abeba13439f0-inventory" (OuterVolumeSpecName: "inventory") pod "8dfa03f4-98c0-4122-8fbb-abeba13439f0" (UID: "8dfa03f4-98c0-4122-8fbb-abeba13439f0"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.298573 4973 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8dfa03f4-98c0-4122-8fbb-abeba13439f0-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.298878 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmlvd\" (UniqueName: \"kubernetes.io/projected/8dfa03f4-98c0-4122-8fbb-abeba13439f0-kube-api-access-nmlvd\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.298960 4973 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dfa03f4-98c0-4122-8fbb-abeba13439f0-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.299021 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8dfa03f4-98c0-4122-8fbb-abeba13439f0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.595187 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" event={"ID":"8dfa03f4-98c0-4122-8fbb-abeba13439f0","Type":"ContainerDied","Data":"b56bb6f84793e142dc07ce532e1318b489f92063b3cfbc40abd82171f457803a"} Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.595452 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b56bb6f84793e142dc07ce532e1318b489f92063b3cfbc40abd82171f457803a" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.595265 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.671064 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5"] Mar 20 13:55:25 crc kubenswrapper[4973]: E0320 13:55:25.671729 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dfa03f4-98c0-4122-8fbb-abeba13439f0" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.671754 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dfa03f4-98c0-4122-8fbb-abeba13439f0" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 13:55:25 crc kubenswrapper[4973]: E0320 13:55:25.671778 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="517222fd-ce22-4859-abde-4d45185cd4b0" containerName="oc" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.671787 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="517222fd-ce22-4859-abde-4d45185cd4b0" containerName="oc" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.672090 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="517222fd-ce22-4859-abde-4d45185cd4b0" containerName="oc" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.672132 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dfa03f4-98c0-4122-8fbb-abeba13439f0" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.673114 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.675692 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d4wjc" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.675851 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.675948 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.676118 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.685088 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5"] Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.811697 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08df9ecf-310e-4fee-9ec6-e13e27f1537b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5\" (UID: \"08df9ecf-310e-4fee-9ec6-e13e27f1537b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.812225 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kstct\" (UniqueName: \"kubernetes.io/projected/08df9ecf-310e-4fee-9ec6-e13e27f1537b-kube-api-access-kstct\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5\" (UID: \"08df9ecf-310e-4fee-9ec6-e13e27f1537b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5" Mar 20 13:55:25 crc 
kubenswrapper[4973]: I0320 13:55:25.812314 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08df9ecf-310e-4fee-9ec6-e13e27f1537b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5\" (UID: \"08df9ecf-310e-4fee-9ec6-e13e27f1537b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.914388 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kstct\" (UniqueName: \"kubernetes.io/projected/08df9ecf-310e-4fee-9ec6-e13e27f1537b-kube-api-access-kstct\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5\" (UID: \"08df9ecf-310e-4fee-9ec6-e13e27f1537b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.914448 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08df9ecf-310e-4fee-9ec6-e13e27f1537b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5\" (UID: \"08df9ecf-310e-4fee-9ec6-e13e27f1537b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.914514 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08df9ecf-310e-4fee-9ec6-e13e27f1537b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5\" (UID: \"08df9ecf-310e-4fee-9ec6-e13e27f1537b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.921282 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/08df9ecf-310e-4fee-9ec6-e13e27f1537b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5\" (UID: \"08df9ecf-310e-4fee-9ec6-e13e27f1537b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.922780 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08df9ecf-310e-4fee-9ec6-e13e27f1537b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5\" (UID: \"08df9ecf-310e-4fee-9ec6-e13e27f1537b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5" Mar 20 13:55:25 crc kubenswrapper[4973]: I0320 13:55:25.932126 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kstct\" (UniqueName: \"kubernetes.io/projected/08df9ecf-310e-4fee-9ec6-e13e27f1537b-kube-api-access-kstct\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5\" (UID: \"08df9ecf-310e-4fee-9ec6-e13e27f1537b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5" Mar 20 13:55:26 crc kubenswrapper[4973]: I0320 13:55:26.001512 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5" Mar 20 13:55:26 crc kubenswrapper[4973]: I0320 13:55:26.529800 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5"] Mar 20 13:55:26 crc kubenswrapper[4973]: I0320 13:55:26.605778 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5" event={"ID":"08df9ecf-310e-4fee-9ec6-e13e27f1537b","Type":"ContainerStarted","Data":"09e5d17bb26649da66d502780e57e39ac8b309b340bf3dd14539bd22aafc22e9"} Mar 20 13:55:27 crc kubenswrapper[4973]: I0320 13:55:27.624209 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5" event={"ID":"08df9ecf-310e-4fee-9ec6-e13e27f1537b","Type":"ContainerStarted","Data":"c06f9498f4cbf35dace0cccf0f767d41d3f6ee02d563090c073c2546e04cb56b"} Mar 20 13:55:27 crc kubenswrapper[4973]: I0320 13:55:27.665674 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5" podStartSLOduration=2.237650867 podStartE2EDuration="2.66565445s" podCreationTimestamp="2026-03-20 13:55:25 +0000 UTC" firstStartedPulling="2026-03-20 13:55:26.536643266 +0000 UTC m=+2047.280313020" lastFinishedPulling="2026-03-20 13:55:26.964646859 +0000 UTC m=+2047.708316603" observedRunningTime="2026-03-20 13:55:27.656743217 +0000 UTC m=+2048.400412961" watchObservedRunningTime="2026-03-20 13:55:27.66565445 +0000 UTC m=+2048.409324184" Mar 20 13:55:43 crc kubenswrapper[4973]: I0320 13:55:43.321409 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:55:43 
crc kubenswrapper[4973]: I0320 13:55:43.321990 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:55:58 crc kubenswrapper[4973]: I0320 13:55:58.940110 4973 scope.go:117] "RemoveContainer" containerID="f874485f813e1778c4784fe82ad905aac8cafec2fd4161b9d2d30540d5bf6d1c" Mar 20 13:55:58 crc kubenswrapper[4973]: I0320 13:55:58.969449 4973 scope.go:117] "RemoveContainer" containerID="65371447e17e8ee321413c1c298e0361c8d1fff86683a74c4b2db804417c3dec" Mar 20 13:55:59 crc kubenswrapper[4973]: I0320 13:55:59.043534 4973 scope.go:117] "RemoveContainer" containerID="4f344edff595c94c1d1e635246cd7485e152fc7b7aeabbf8c6b77c50458e3271" Mar 20 13:55:59 crc kubenswrapper[4973]: I0320 13:55:59.070062 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-2wgvh"] Mar 20 13:55:59 crc kubenswrapper[4973]: I0320 13:55:59.086822 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-dr5z9"] Mar 20 13:55:59 crc kubenswrapper[4973]: I0320 13:55:59.095908 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-2wgvh"] Mar 20 13:55:59 crc kubenswrapper[4973]: I0320 13:55:59.103365 4973 scope.go:117] "RemoveContainer" containerID="07e4031eb51a244b66dff2c2092044894ab5856be895615cfe8e14f94283f0d9" Mar 20 13:55:59 crc kubenswrapper[4973]: I0320 13:55:59.106895 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-dr5z9"] Mar 20 13:55:59 crc kubenswrapper[4973]: I0320 13:55:59.150641 4973 scope.go:117] "RemoveContainer" containerID="ea23f03170bee7e6382632749bd595b9a1313dbc7c2fdb63027f117566faa374" Mar 20 13:55:59 crc kubenswrapper[4973]: I0320 13:55:59.966302 4973 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bcf98ea-17cb-432f-8d35-18cc016401ed" path="/var/lib/kubelet/pods/5bcf98ea-17cb-432f-8d35-18cc016401ed/volumes" Mar 20 13:55:59 crc kubenswrapper[4973]: I0320 13:55:59.967253 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bc32f3f-2c79-4331-aa41-47d648fc6499" path="/var/lib/kubelet/pods/9bc32f3f-2c79-4331-aa41-47d648fc6499/volumes" Mar 20 13:56:00 crc kubenswrapper[4973]: I0320 13:56:00.145261 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566916-xjq6l"] Mar 20 13:56:00 crc kubenswrapper[4973]: I0320 13:56:00.147116 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-xjq6l" Mar 20 13:56:00 crc kubenswrapper[4973]: I0320 13:56:00.149453 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:56:00 crc kubenswrapper[4973]: I0320 13:56:00.149657 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 13:56:00 crc kubenswrapper[4973]: I0320 13:56:00.150105 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:56:00 crc kubenswrapper[4973]: I0320 13:56:00.155850 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-xjq6l"] Mar 20 13:56:00 crc kubenswrapper[4973]: I0320 13:56:00.318695 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cb96x\" (UniqueName: \"kubernetes.io/projected/615a8158-d265-4082-b8c2-f342c4b9640e-kube-api-access-cb96x\") pod \"auto-csr-approver-29566916-xjq6l\" (UID: \"615a8158-d265-4082-b8c2-f342c4b9640e\") " pod="openshift-infra/auto-csr-approver-29566916-xjq6l" Mar 20 13:56:00 crc kubenswrapper[4973]: I0320 13:56:00.421148 
4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cb96x\" (UniqueName: \"kubernetes.io/projected/615a8158-d265-4082-b8c2-f342c4b9640e-kube-api-access-cb96x\") pod \"auto-csr-approver-29566916-xjq6l\" (UID: \"615a8158-d265-4082-b8c2-f342c4b9640e\") " pod="openshift-infra/auto-csr-approver-29566916-xjq6l" Mar 20 13:56:00 crc kubenswrapper[4973]: I0320 13:56:00.446542 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cb96x\" (UniqueName: \"kubernetes.io/projected/615a8158-d265-4082-b8c2-f342c4b9640e-kube-api-access-cb96x\") pod \"auto-csr-approver-29566916-xjq6l\" (UID: \"615a8158-d265-4082-b8c2-f342c4b9640e\") " pod="openshift-infra/auto-csr-approver-29566916-xjq6l" Mar 20 13:56:00 crc kubenswrapper[4973]: I0320 13:56:00.472317 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-xjq6l" Mar 20 13:56:00 crc kubenswrapper[4973]: I0320 13:56:00.932356 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-xjq6l"] Mar 20 13:56:00 crc kubenswrapper[4973]: I0320 13:56:00.934956 4973 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 13:56:00 crc kubenswrapper[4973]: I0320 13:56:00.999692 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566916-xjq6l" event={"ID":"615a8158-d265-4082-b8c2-f342c4b9640e","Type":"ContainerStarted","Data":"e06bc5bb435f0bd79ba9d62176dbea1e3d4270237a8f479fd6bc2c17c44a38ca"} Mar 20 13:56:03 crc kubenswrapper[4973]: I0320 13:56:03.023102 4973 generic.go:334] "Generic (PLEG): container finished" podID="615a8158-d265-4082-b8c2-f342c4b9640e" containerID="55139a2b61c755778065606c6b40bb6810f06b83d0eb1937845018b6d353eced" exitCode=0 Mar 20 13:56:03 crc kubenswrapper[4973]: I0320 13:56:03.023283 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29566916-xjq6l" event={"ID":"615a8158-d265-4082-b8c2-f342c4b9640e","Type":"ContainerDied","Data":"55139a2b61c755778065606c6b40bb6810f06b83d0eb1937845018b6d353eced"} Mar 20 13:56:04 crc kubenswrapper[4973]: I0320 13:56:04.495450 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-xjq6l" Mar 20 13:56:04 crc kubenswrapper[4973]: I0320 13:56:04.659579 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cb96x\" (UniqueName: \"kubernetes.io/projected/615a8158-d265-4082-b8c2-f342c4b9640e-kube-api-access-cb96x\") pod \"615a8158-d265-4082-b8c2-f342c4b9640e\" (UID: \"615a8158-d265-4082-b8c2-f342c4b9640e\") " Mar 20 13:56:04 crc kubenswrapper[4973]: I0320 13:56:04.665396 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/615a8158-d265-4082-b8c2-f342c4b9640e-kube-api-access-cb96x" (OuterVolumeSpecName: "kube-api-access-cb96x") pod "615a8158-d265-4082-b8c2-f342c4b9640e" (UID: "615a8158-d265-4082-b8c2-f342c4b9640e"). InnerVolumeSpecName "kube-api-access-cb96x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:56:04 crc kubenswrapper[4973]: I0320 13:56:04.762613 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cb96x\" (UniqueName: \"kubernetes.io/projected/615a8158-d265-4082-b8c2-f342c4b9640e-kube-api-access-cb96x\") on node \"crc\" DevicePath \"\"" Mar 20 13:56:05 crc kubenswrapper[4973]: I0320 13:56:05.048897 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566916-xjq6l" event={"ID":"615a8158-d265-4082-b8c2-f342c4b9640e","Type":"ContainerDied","Data":"e06bc5bb435f0bd79ba9d62176dbea1e3d4270237a8f479fd6bc2c17c44a38ca"} Mar 20 13:56:05 crc kubenswrapper[4973]: I0320 13:56:05.048937 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e06bc5bb435f0bd79ba9d62176dbea1e3d4270237a8f479fd6bc2c17c44a38ca" Mar 20 13:56:05 crc kubenswrapper[4973]: I0320 13:56:05.049098 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566916-xjq6l" Mar 20 13:56:05 crc kubenswrapper[4973]: I0320 13:56:05.570420 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-hm49w"] Mar 20 13:56:05 crc kubenswrapper[4973]: I0320 13:56:05.585744 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566910-hm49w"] Mar 20 13:56:05 crc kubenswrapper[4973]: I0320 13:56:05.964722 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="867c1f31-b30c-48ac-bd37-b433d68230b7" path="/var/lib/kubelet/pods/867c1f31-b30c-48ac-bd37-b433d68230b7/volumes" Mar 20 13:56:13 crc kubenswrapper[4973]: I0320 13:56:13.320951 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 13:56:13 crc kubenswrapper[4973]: I0320 13:56:13.321743 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:56:15 crc kubenswrapper[4973]: I0320 13:56:15.034083 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-lbsl6"] Mar 20 13:56:15 crc kubenswrapper[4973]: I0320 13:56:15.044575 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-lbsl6"] Mar 20 13:56:15 crc kubenswrapper[4973]: I0320 13:56:15.964655 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92ef179e-0d37-4a3d-986f-5a4ea5bc5a23" path="/var/lib/kubelet/pods/92ef179e-0d37-4a3d-986f-5a4ea5bc5a23/volumes" Mar 20 13:56:18 crc kubenswrapper[4973]: I0320 13:56:18.042016 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-vjtzq"] Mar 20 13:56:18 crc kubenswrapper[4973]: I0320 13:56:18.052038 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-zzsmb"] Mar 20 13:56:18 crc kubenswrapper[4973]: I0320 13:56:18.065299 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-vjtzq"] Mar 20 13:56:18 crc kubenswrapper[4973]: I0320 13:56:18.075629 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-zzsmb"] Mar 20 13:56:19 crc kubenswrapper[4973]: I0320 13:56:19.968466 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68" path="/var/lib/kubelet/pods/0e3c2d0b-dbee-4d4d-bc87-6ebb99fd1c68/volumes" Mar 20 13:56:19 crc kubenswrapper[4973]: I0320 13:56:19.970113 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6fec0901-00c6-410f-986c-4dcac4fe1359" path="/var/lib/kubelet/pods/6fec0901-00c6-410f-986c-4dcac4fe1359/volumes" Mar 20 13:56:43 crc kubenswrapper[4973]: I0320 13:56:43.321106 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:56:43 crc kubenswrapper[4973]: I0320 13:56:43.321681 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:56:43 crc kubenswrapper[4973]: I0320 13:56:43.321745 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 13:56:43 crc kubenswrapper[4973]: I0320 13:56:43.322694 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"73be12b46987d19974f1cae15f097e9d05c0f53b3e5735ef8e8bb8b42d9ed186"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 13:56:43 crc kubenswrapper[4973]: I0320 13:56:43.322767 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" containerID="cri-o://73be12b46987d19974f1cae15f097e9d05c0f53b3e5735ef8e8bb8b42d9ed186" gracePeriod=600 Mar 20 13:56:43 crc kubenswrapper[4973]: I0320 13:56:43.471043 4973 generic.go:334] "Generic 
(PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="73be12b46987d19974f1cae15f097e9d05c0f53b3e5735ef8e8bb8b42d9ed186" exitCode=0 Mar 20 13:56:43 crc kubenswrapper[4973]: I0320 13:56:43.471094 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"73be12b46987d19974f1cae15f097e9d05c0f53b3e5735ef8e8bb8b42d9ed186"} Mar 20 13:56:43 crc kubenswrapper[4973]: I0320 13:56:43.471132 4973 scope.go:117] "RemoveContainer" containerID="6a0d4feb1453df04a61cea5a3b5ae32c86a922f7b4fcd6e8df315da6d7f67e5e" Mar 20 13:56:44 crc kubenswrapper[4973]: I0320 13:56:44.483879 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee"} Mar 20 13:56:59 crc kubenswrapper[4973]: I0320 13:56:59.314647 4973 scope.go:117] "RemoveContainer" containerID="da7adef8f2c02307e433d570bf8a3a9ccef94d506092b871aba2526cf05588b9" Mar 20 13:56:59 crc kubenswrapper[4973]: I0320 13:56:59.344668 4973 scope.go:117] "RemoveContainer" containerID="77558af0734689165986f3348fb19eb96f9d02e0d0a7e2e3a8896ef17a83c032" Mar 20 13:56:59 crc kubenswrapper[4973]: I0320 13:56:59.445464 4973 scope.go:117] "RemoveContainer" containerID="43d8c42eff4680a96454959aeea1088a03cbfbd6f3c3956b87bf4b94480a0378" Mar 20 13:56:59 crc kubenswrapper[4973]: I0320 13:56:59.472754 4973 scope.go:117] "RemoveContainer" containerID="cbaf7d0cf8de8e4a6cd3237e1a32f4af31c3690957694832e4bc0a512ce3bd31" Mar 20 13:56:59 crc kubenswrapper[4973]: I0320 13:56:59.543624 4973 scope.go:117] "RemoveContainer" containerID="ef78a08289134992f9c632dad673844b359d1a7ca4b4df90922ece411e9c3da1" Mar 20 13:56:59 crc kubenswrapper[4973]: I0320 13:56:59.603930 4973 scope.go:117] 
"RemoveContainer" containerID="c74802542357b1e2931636b4d8c55aaf2be5116e981fc6e9cacf286b79a634fd" Mar 20 13:57:02 crc kubenswrapper[4973]: I0320 13:57:02.273969 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-h4jz5"] Mar 20 13:57:02 crc kubenswrapper[4973]: E0320 13:57:02.274822 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="615a8158-d265-4082-b8c2-f342c4b9640e" containerName="oc" Mar 20 13:57:02 crc kubenswrapper[4973]: I0320 13:57:02.274838 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="615a8158-d265-4082-b8c2-f342c4b9640e" containerName="oc" Mar 20 13:57:02 crc kubenswrapper[4973]: I0320 13:57:02.275189 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="615a8158-d265-4082-b8c2-f342c4b9640e" containerName="oc" Mar 20 13:57:02 crc kubenswrapper[4973]: I0320 13:57:02.277184 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h4jz5" Mar 20 13:57:02 crc kubenswrapper[4973]: I0320 13:57:02.305016 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h4jz5"] Mar 20 13:57:02 crc kubenswrapper[4973]: I0320 13:57:02.353165 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrwws\" (UniqueName: \"kubernetes.io/projected/0e26365b-1585-4ea4-b2f0-7405f36c38d1-kube-api-access-nrwws\") pod \"redhat-operators-h4jz5\" (UID: \"0e26365b-1585-4ea4-b2f0-7405f36c38d1\") " pod="openshift-marketplace/redhat-operators-h4jz5" Mar 20 13:57:02 crc kubenswrapper[4973]: I0320 13:57:02.353239 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e26365b-1585-4ea4-b2f0-7405f36c38d1-utilities\") pod \"redhat-operators-h4jz5\" (UID: \"0e26365b-1585-4ea4-b2f0-7405f36c38d1\") " 
pod="openshift-marketplace/redhat-operators-h4jz5" Mar 20 13:57:02 crc kubenswrapper[4973]: I0320 13:57:02.353527 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e26365b-1585-4ea4-b2f0-7405f36c38d1-catalog-content\") pod \"redhat-operators-h4jz5\" (UID: \"0e26365b-1585-4ea4-b2f0-7405f36c38d1\") " pod="openshift-marketplace/redhat-operators-h4jz5" Mar 20 13:57:02 crc kubenswrapper[4973]: I0320 13:57:02.455856 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e26365b-1585-4ea4-b2f0-7405f36c38d1-catalog-content\") pod \"redhat-operators-h4jz5\" (UID: \"0e26365b-1585-4ea4-b2f0-7405f36c38d1\") " pod="openshift-marketplace/redhat-operators-h4jz5" Mar 20 13:57:02 crc kubenswrapper[4973]: I0320 13:57:02.456030 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrwws\" (UniqueName: \"kubernetes.io/projected/0e26365b-1585-4ea4-b2f0-7405f36c38d1-kube-api-access-nrwws\") pod \"redhat-operators-h4jz5\" (UID: \"0e26365b-1585-4ea4-b2f0-7405f36c38d1\") " pod="openshift-marketplace/redhat-operators-h4jz5" Mar 20 13:57:02 crc kubenswrapper[4973]: I0320 13:57:02.456054 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e26365b-1585-4ea4-b2f0-7405f36c38d1-utilities\") pod \"redhat-operators-h4jz5\" (UID: \"0e26365b-1585-4ea4-b2f0-7405f36c38d1\") " pod="openshift-marketplace/redhat-operators-h4jz5" Mar 20 13:57:02 crc kubenswrapper[4973]: I0320 13:57:02.456429 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e26365b-1585-4ea4-b2f0-7405f36c38d1-catalog-content\") pod \"redhat-operators-h4jz5\" (UID: \"0e26365b-1585-4ea4-b2f0-7405f36c38d1\") " 
pod="openshift-marketplace/redhat-operators-h4jz5" Mar 20 13:57:02 crc kubenswrapper[4973]: I0320 13:57:02.456548 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e26365b-1585-4ea4-b2f0-7405f36c38d1-utilities\") pod \"redhat-operators-h4jz5\" (UID: \"0e26365b-1585-4ea4-b2f0-7405f36c38d1\") " pod="openshift-marketplace/redhat-operators-h4jz5" Mar 20 13:57:02 crc kubenswrapper[4973]: I0320 13:57:02.480177 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrwws\" (UniqueName: \"kubernetes.io/projected/0e26365b-1585-4ea4-b2f0-7405f36c38d1-kube-api-access-nrwws\") pod \"redhat-operators-h4jz5\" (UID: \"0e26365b-1585-4ea4-b2f0-7405f36c38d1\") " pod="openshift-marketplace/redhat-operators-h4jz5" Mar 20 13:57:02 crc kubenswrapper[4973]: I0320 13:57:02.608127 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-h4jz5" Mar 20 13:57:03 crc kubenswrapper[4973]: I0320 13:57:03.080056 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-p2xzh"] Mar 20 13:57:03 crc kubenswrapper[4973]: I0320 13:57:03.096465 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-p2xzh"] Mar 20 13:57:03 crc kubenswrapper[4973]: I0320 13:57:03.110832 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-4451-account-create-update-pddz5"] Mar 20 13:57:03 crc kubenswrapper[4973]: I0320 13:57:03.120904 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-4451-account-create-update-pddz5"] Mar 20 13:57:03 crc kubenswrapper[4973]: I0320 13:57:03.300720 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-h4jz5"] Mar 20 13:57:03 crc kubenswrapper[4973]: I0320 13:57:03.732537 4973 generic.go:334] "Generic (PLEG): container finished" 
podID="0e26365b-1585-4ea4-b2f0-7405f36c38d1" containerID="8657e8432c53a3c99d36afc935649784c5a54ecb38eabeffb8d38354094d137f" exitCode=0 Mar 20 13:57:03 crc kubenswrapper[4973]: I0320 13:57:03.732745 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4jz5" event={"ID":"0e26365b-1585-4ea4-b2f0-7405f36c38d1","Type":"ContainerDied","Data":"8657e8432c53a3c99d36afc935649784c5a54ecb38eabeffb8d38354094d137f"} Mar 20 13:57:03 crc kubenswrapper[4973]: I0320 13:57:03.733183 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4jz5" event={"ID":"0e26365b-1585-4ea4-b2f0-7405f36c38d1","Type":"ContainerStarted","Data":"abd0dc6a78ef29f6649888df1f8abe73143a280f95eae187d19dde7327a2e82c"} Mar 20 13:57:03 crc kubenswrapper[4973]: I0320 13:57:03.967119 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73458433-b92e-4758-84bd-7aba2f23e1e0" path="/var/lib/kubelet/pods/73458433-b92e-4758-84bd-7aba2f23e1e0/volumes" Mar 20 13:57:03 crc kubenswrapper[4973]: I0320 13:57:03.968255 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bf6849d-dd2b-4a3e-be4f-1b00c1826000" path="/var/lib/kubelet/pods/7bf6849d-dd2b-4a3e-be4f-1b00c1826000/volumes" Mar 20 13:57:04 crc kubenswrapper[4973]: I0320 13:57:04.744436 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4jz5" event={"ID":"0e26365b-1585-4ea4-b2f0-7405f36c38d1","Type":"ContainerStarted","Data":"4d9105e289ccbde76460d09f1ee5653ee31ad849d04b5bc0dfef28f2fee5b83d"} Mar 20 13:57:05 crc kubenswrapper[4973]: I0320 13:57:05.034130 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-eea5-account-create-update-jjhm6"] Mar 20 13:57:05 crc kubenswrapper[4973]: I0320 13:57:05.051034 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-p4rbx"] Mar 20 13:57:05 crc kubenswrapper[4973]: I0320 13:57:05.064721 
4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-1a9f-account-create-update-4nplb"] Mar 20 13:57:05 crc kubenswrapper[4973]: I0320 13:57:05.078769 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-1a9f-account-create-update-4nplb"] Mar 20 13:57:05 crc kubenswrapper[4973]: I0320 13:57:05.090036 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-eea5-account-create-update-jjhm6"] Mar 20 13:57:05 crc kubenswrapper[4973]: I0320 13:57:05.101674 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-p4rbx"] Mar 20 13:57:05 crc kubenswrapper[4973]: I0320 13:57:05.965300 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3da0c070-14fe-41f9-9b97-f76831f43dbc" path="/var/lib/kubelet/pods/3da0c070-14fe-41f9-9b97-f76831f43dbc/volumes" Mar 20 13:57:05 crc kubenswrapper[4973]: I0320 13:57:05.966313 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bb957cd-c6f9-4602-b4a2-56834928aabb" path="/var/lib/kubelet/pods/6bb957cd-c6f9-4602-b4a2-56834928aabb/volumes" Mar 20 13:57:05 crc kubenswrapper[4973]: I0320 13:57:05.966992 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd964d5b-2edc-41ea-8dc9-3f6e71d9da32" path="/var/lib/kubelet/pods/cd964d5b-2edc-41ea-8dc9-3f6e71d9da32/volumes" Mar 20 13:57:06 crc kubenswrapper[4973]: I0320 13:57:06.040729 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-xnwtw"] Mar 20 13:57:06 crc kubenswrapper[4973]: I0320 13:57:06.061051 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-xnwtw"] Mar 20 13:57:07 crc kubenswrapper[4973]: I0320 13:57:07.966608 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="abeee3f9-2831-465f-8c3e-9853954f7087" path="/var/lib/kubelet/pods/abeee3f9-2831-465f-8c3e-9853954f7087/volumes" Mar 20 13:57:10 crc kubenswrapper[4973]: 
I0320 13:57:10.814796 4973 generic.go:334] "Generic (PLEG): container finished" podID="0e26365b-1585-4ea4-b2f0-7405f36c38d1" containerID="4d9105e289ccbde76460d09f1ee5653ee31ad849d04b5bc0dfef28f2fee5b83d" exitCode=0 Mar 20 13:57:10 crc kubenswrapper[4973]: I0320 13:57:10.814863 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4jz5" event={"ID":"0e26365b-1585-4ea4-b2f0-7405f36c38d1","Type":"ContainerDied","Data":"4d9105e289ccbde76460d09f1ee5653ee31ad849d04b5bc0dfef28f2fee5b83d"} Mar 20 13:57:11 crc kubenswrapper[4973]: I0320 13:57:11.857227 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4jz5" event={"ID":"0e26365b-1585-4ea4-b2f0-7405f36c38d1","Type":"ContainerStarted","Data":"4b35c2642b0fbfa8b82fb2979a569540e576282cae497e4886d8ec6acfab2183"} Mar 20 13:57:11 crc kubenswrapper[4973]: I0320 13:57:11.881879 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-h4jz5" podStartSLOduration=2.3567866410000002 podStartE2EDuration="9.881859157s" podCreationTimestamp="2026-03-20 13:57:02 +0000 UTC" firstStartedPulling="2026-03-20 13:57:03.736518698 +0000 UTC m=+2144.480188442" lastFinishedPulling="2026-03-20 13:57:11.261591214 +0000 UTC m=+2152.005260958" observedRunningTime="2026-03-20 13:57:11.87251533 +0000 UTC m=+2152.616185074" watchObservedRunningTime="2026-03-20 13:57:11.881859157 +0000 UTC m=+2152.625528901" Mar 20 13:57:12 crc kubenswrapper[4973]: I0320 13:57:12.608916 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h4jz5" Mar 20 13:57:12 crc kubenswrapper[4973]: I0320 13:57:12.608989 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h4jz5" Mar 20 13:57:13 crc kubenswrapper[4973]: I0320 13:57:13.667165 4973 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-h4jz5" podUID="0e26365b-1585-4ea4-b2f0-7405f36c38d1" containerName="registry-server" probeResult="failure" output=< Mar 20 13:57:13 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 13:57:13 crc kubenswrapper[4973]: > Mar 20 13:57:17 crc kubenswrapper[4973]: I0320 13:57:17.915280 4973 generic.go:334] "Generic (PLEG): container finished" podID="08df9ecf-310e-4fee-9ec6-e13e27f1537b" containerID="c06f9498f4cbf35dace0cccf0f767d41d3f6ee02d563090c073c2546e04cb56b" exitCode=0 Mar 20 13:57:17 crc kubenswrapper[4973]: I0320 13:57:17.915359 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5" event={"ID":"08df9ecf-310e-4fee-9ec6-e13e27f1537b","Type":"ContainerDied","Data":"c06f9498f4cbf35dace0cccf0f767d41d3f6ee02d563090c073c2546e04cb56b"} Mar 20 13:57:19 crc kubenswrapper[4973]: I0320 13:57:19.550817 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5" Mar 20 13:57:19 crc kubenswrapper[4973]: I0320 13:57:19.639932 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kstct\" (UniqueName: \"kubernetes.io/projected/08df9ecf-310e-4fee-9ec6-e13e27f1537b-kube-api-access-kstct\") pod \"08df9ecf-310e-4fee-9ec6-e13e27f1537b\" (UID: \"08df9ecf-310e-4fee-9ec6-e13e27f1537b\") " Mar 20 13:57:19 crc kubenswrapper[4973]: I0320 13:57:19.640008 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08df9ecf-310e-4fee-9ec6-e13e27f1537b-ssh-key-openstack-edpm-ipam\") pod \"08df9ecf-310e-4fee-9ec6-e13e27f1537b\" (UID: \"08df9ecf-310e-4fee-9ec6-e13e27f1537b\") " Mar 20 13:57:19 crc kubenswrapper[4973]: I0320 13:57:19.640303 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08df9ecf-310e-4fee-9ec6-e13e27f1537b-inventory\") pod \"08df9ecf-310e-4fee-9ec6-e13e27f1537b\" (UID: \"08df9ecf-310e-4fee-9ec6-e13e27f1537b\") " Mar 20 13:57:19 crc kubenswrapper[4973]: I0320 13:57:19.649937 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08df9ecf-310e-4fee-9ec6-e13e27f1537b-kube-api-access-kstct" (OuterVolumeSpecName: "kube-api-access-kstct") pod "08df9ecf-310e-4fee-9ec6-e13e27f1537b" (UID: "08df9ecf-310e-4fee-9ec6-e13e27f1537b"). InnerVolumeSpecName "kube-api-access-kstct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:57:19 crc kubenswrapper[4973]: I0320 13:57:19.676670 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08df9ecf-310e-4fee-9ec6-e13e27f1537b-inventory" (OuterVolumeSpecName: "inventory") pod "08df9ecf-310e-4fee-9ec6-e13e27f1537b" (UID: "08df9ecf-310e-4fee-9ec6-e13e27f1537b"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:57:19 crc kubenswrapper[4973]: I0320 13:57:19.680078 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08df9ecf-310e-4fee-9ec6-e13e27f1537b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "08df9ecf-310e-4fee-9ec6-e13e27f1537b" (UID: "08df9ecf-310e-4fee-9ec6-e13e27f1537b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:57:19 crc kubenswrapper[4973]: I0320 13:57:19.744325 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kstct\" (UniqueName: \"kubernetes.io/projected/08df9ecf-310e-4fee-9ec6-e13e27f1537b-kube-api-access-kstct\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:19 crc kubenswrapper[4973]: I0320 13:57:19.744373 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08df9ecf-310e-4fee-9ec6-e13e27f1537b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:19 crc kubenswrapper[4973]: I0320 13:57:19.744386 4973 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/08df9ecf-310e-4fee-9ec6-e13e27f1537b-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:19 crc kubenswrapper[4973]: I0320 13:57:19.937814 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5" event={"ID":"08df9ecf-310e-4fee-9ec6-e13e27f1537b","Type":"ContainerDied","Data":"09e5d17bb26649da66d502780e57e39ac8b309b340bf3dd14539bd22aafc22e9"} Mar 20 13:57:19 crc kubenswrapper[4973]: I0320 13:57:19.938050 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09e5d17bb26649da66d502780e57e39ac8b309b340bf3dd14539bd22aafc22e9" Mar 20 13:57:19 crc kubenswrapper[4973]: I0320 
13:57:19.937946 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5" Mar 20 13:57:20 crc kubenswrapper[4973]: I0320 13:57:20.046494 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7"] Mar 20 13:57:20 crc kubenswrapper[4973]: E0320 13:57:20.047370 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08df9ecf-310e-4fee-9ec6-e13e27f1537b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 13:57:20 crc kubenswrapper[4973]: I0320 13:57:20.047442 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="08df9ecf-310e-4fee-9ec6-e13e27f1537b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 13:57:20 crc kubenswrapper[4973]: I0320 13:57:20.047735 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="08df9ecf-310e-4fee-9ec6-e13e27f1537b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 20 13:57:20 crc kubenswrapper[4973]: I0320 13:57:20.048696 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7" Mar 20 13:57:20 crc kubenswrapper[4973]: I0320 13:57:20.054965 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 13:57:20 crc kubenswrapper[4973]: I0320 13:57:20.055202 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d4wjc" Mar 20 13:57:20 crc kubenswrapper[4973]: I0320 13:57:20.055413 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 13:57:20 crc kubenswrapper[4973]: I0320 13:57:20.055526 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 13:57:20 crc kubenswrapper[4973]: I0320 13:57:20.059556 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7"] Mar 20 13:57:20 crc kubenswrapper[4973]: I0320 13:57:20.158800 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d8302de-98a7-45e4-ab44-fddc83ce2f4b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7\" (UID: \"5d8302de-98a7-45e4-ab44-fddc83ce2f4b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7" Mar 20 13:57:20 crc kubenswrapper[4973]: I0320 13:57:20.158924 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d8302de-98a7-45e4-ab44-fddc83ce2f4b-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7\" (UID: \"5d8302de-98a7-45e4-ab44-fddc83ce2f4b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7" Mar 20 13:57:20 crc 
kubenswrapper[4973]: I0320 13:57:20.158987 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcxlf\" (UniqueName: \"kubernetes.io/projected/5d8302de-98a7-45e4-ab44-fddc83ce2f4b-kube-api-access-fcxlf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7\" (UID: \"5d8302de-98a7-45e4-ab44-fddc83ce2f4b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7" Mar 20 13:57:20 crc kubenswrapper[4973]: I0320 13:57:20.262135 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d8302de-98a7-45e4-ab44-fddc83ce2f4b-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7\" (UID: \"5d8302de-98a7-45e4-ab44-fddc83ce2f4b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7" Mar 20 13:57:20 crc kubenswrapper[4973]: I0320 13:57:20.262250 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcxlf\" (UniqueName: \"kubernetes.io/projected/5d8302de-98a7-45e4-ab44-fddc83ce2f4b-kube-api-access-fcxlf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7\" (UID: \"5d8302de-98a7-45e4-ab44-fddc83ce2f4b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7" Mar 20 13:57:20 crc kubenswrapper[4973]: I0320 13:57:20.262391 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d8302de-98a7-45e4-ab44-fddc83ce2f4b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7\" (UID: \"5d8302de-98a7-45e4-ab44-fddc83ce2f4b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7" Mar 20 13:57:20 crc kubenswrapper[4973]: I0320 13:57:20.267809 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d8302de-98a7-45e4-ab44-fddc83ce2f4b-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7\" (UID: \"5d8302de-98a7-45e4-ab44-fddc83ce2f4b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7" Mar 20 13:57:20 crc kubenswrapper[4973]: I0320 13:57:20.268376 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d8302de-98a7-45e4-ab44-fddc83ce2f4b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7\" (UID: \"5d8302de-98a7-45e4-ab44-fddc83ce2f4b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7" Mar 20 13:57:20 crc kubenswrapper[4973]: I0320 13:57:20.278649 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcxlf\" (UniqueName: \"kubernetes.io/projected/5d8302de-98a7-45e4-ab44-fddc83ce2f4b-kube-api-access-fcxlf\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7\" (UID: \"5d8302de-98a7-45e4-ab44-fddc83ce2f4b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7" Mar 20 13:57:20 crc kubenswrapper[4973]: I0320 13:57:20.380273 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7" Mar 20 13:57:20 crc kubenswrapper[4973]: I0320 13:57:20.933501 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7"] Mar 20 13:57:20 crc kubenswrapper[4973]: W0320 13:57:20.940030 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d8302de_98a7_45e4_ab44_fddc83ce2f4b.slice/crio-3b31d7e215704098959ccd6034506690949d514f8b1413ba3b18895118f32d5c WatchSource:0}: Error finding container 3b31d7e215704098959ccd6034506690949d514f8b1413ba3b18895118f32d5c: Status 404 returned error can't find the container with id 3b31d7e215704098959ccd6034506690949d514f8b1413ba3b18895118f32d5c Mar 20 13:57:21 crc kubenswrapper[4973]: I0320 13:57:21.963461 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7" event={"ID":"5d8302de-98a7-45e4-ab44-fddc83ce2f4b","Type":"ContainerStarted","Data":"9d640e5d866bbab313840486a79a923a61228333975b6857799450af821617c9"} Mar 20 13:57:21 crc kubenswrapper[4973]: I0320 13:57:21.964315 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7" event={"ID":"5d8302de-98a7-45e4-ab44-fddc83ce2f4b","Type":"ContainerStarted","Data":"3b31d7e215704098959ccd6034506690949d514f8b1413ba3b18895118f32d5c"} Mar 20 13:57:21 crc kubenswrapper[4973]: I0320 13:57:21.986267 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7" podStartSLOduration=1.5655801839999999 podStartE2EDuration="1.986245998s" podCreationTimestamp="2026-03-20 13:57:20 +0000 UTC" firstStartedPulling="2026-03-20 13:57:20.942579006 +0000 UTC m=+2161.686248750" lastFinishedPulling="2026-03-20 13:57:21.36324482 
+0000 UTC m=+2162.106914564" observedRunningTime="2026-03-20 13:57:21.97794934 +0000 UTC m=+2162.721619094" watchObservedRunningTime="2026-03-20 13:57:21.986245998 +0000 UTC m=+2162.729915742" Mar 20 13:57:23 crc kubenswrapper[4973]: I0320 13:57:23.665199 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h4jz5" podUID="0e26365b-1585-4ea4-b2f0-7405f36c38d1" containerName="registry-server" probeResult="failure" output=< Mar 20 13:57:23 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 13:57:23 crc kubenswrapper[4973]: > Mar 20 13:57:32 crc kubenswrapper[4973]: I0320 13:57:32.656609 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h4jz5" Mar 20 13:57:32 crc kubenswrapper[4973]: I0320 13:57:32.713913 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h4jz5" Mar 20 13:57:32 crc kubenswrapper[4973]: I0320 13:57:32.901763 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h4jz5"] Mar 20 13:57:34 crc kubenswrapper[4973]: I0320 13:57:34.072777 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h4jz5" podUID="0e26365b-1585-4ea4-b2f0-7405f36c38d1" containerName="registry-server" containerID="cri-o://4b35c2642b0fbfa8b82fb2979a569540e576282cae497e4886d8ec6acfab2183" gracePeriod=2 Mar 20 13:57:34 crc kubenswrapper[4973]: I0320 13:57:34.593879 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4jz5" Mar 20 13:57:34 crc kubenswrapper[4973]: I0320 13:57:34.735528 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrwws\" (UniqueName: \"kubernetes.io/projected/0e26365b-1585-4ea4-b2f0-7405f36c38d1-kube-api-access-nrwws\") pod \"0e26365b-1585-4ea4-b2f0-7405f36c38d1\" (UID: \"0e26365b-1585-4ea4-b2f0-7405f36c38d1\") " Mar 20 13:57:34 crc kubenswrapper[4973]: I0320 13:57:34.735569 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e26365b-1585-4ea4-b2f0-7405f36c38d1-utilities\") pod \"0e26365b-1585-4ea4-b2f0-7405f36c38d1\" (UID: \"0e26365b-1585-4ea4-b2f0-7405f36c38d1\") " Mar 20 13:57:34 crc kubenswrapper[4973]: I0320 13:57:34.735650 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e26365b-1585-4ea4-b2f0-7405f36c38d1-catalog-content\") pod \"0e26365b-1585-4ea4-b2f0-7405f36c38d1\" (UID: \"0e26365b-1585-4ea4-b2f0-7405f36c38d1\") " Mar 20 13:57:34 crc kubenswrapper[4973]: I0320 13:57:34.736236 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e26365b-1585-4ea4-b2f0-7405f36c38d1-utilities" (OuterVolumeSpecName: "utilities") pod "0e26365b-1585-4ea4-b2f0-7405f36c38d1" (UID: "0e26365b-1585-4ea4-b2f0-7405f36c38d1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:57:34 crc kubenswrapper[4973]: I0320 13:57:34.736561 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e26365b-1585-4ea4-b2f0-7405f36c38d1-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:34 crc kubenswrapper[4973]: I0320 13:57:34.742994 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e26365b-1585-4ea4-b2f0-7405f36c38d1-kube-api-access-nrwws" (OuterVolumeSpecName: "kube-api-access-nrwws") pod "0e26365b-1585-4ea4-b2f0-7405f36c38d1" (UID: "0e26365b-1585-4ea4-b2f0-7405f36c38d1"). InnerVolumeSpecName "kube-api-access-nrwws". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:57:34 crc kubenswrapper[4973]: I0320 13:57:34.838572 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrwws\" (UniqueName: \"kubernetes.io/projected/0e26365b-1585-4ea4-b2f0-7405f36c38d1-kube-api-access-nrwws\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:34 crc kubenswrapper[4973]: I0320 13:57:34.865510 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e26365b-1585-4ea4-b2f0-7405f36c38d1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e26365b-1585-4ea4-b2f0-7405f36c38d1" (UID: "0e26365b-1585-4ea4-b2f0-7405f36c38d1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 13:57:34 crc kubenswrapper[4973]: I0320 13:57:34.940418 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e26365b-1585-4ea4-b2f0-7405f36c38d1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 13:57:35 crc kubenswrapper[4973]: I0320 13:57:35.086359 4973 generic.go:334] "Generic (PLEG): container finished" podID="0e26365b-1585-4ea4-b2f0-7405f36c38d1" containerID="4b35c2642b0fbfa8b82fb2979a569540e576282cae497e4886d8ec6acfab2183" exitCode=0 Mar 20 13:57:35 crc kubenswrapper[4973]: I0320 13:57:35.086399 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4jz5" event={"ID":"0e26365b-1585-4ea4-b2f0-7405f36c38d1","Type":"ContainerDied","Data":"4b35c2642b0fbfa8b82fb2979a569540e576282cae497e4886d8ec6acfab2183"} Mar 20 13:57:35 crc kubenswrapper[4973]: I0320 13:57:35.086423 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h4jz5" event={"ID":"0e26365b-1585-4ea4-b2f0-7405f36c38d1","Type":"ContainerDied","Data":"abd0dc6a78ef29f6649888df1f8abe73143a280f95eae187d19dde7327a2e82c"} Mar 20 13:57:35 crc kubenswrapper[4973]: I0320 13:57:35.086444 4973 scope.go:117] "RemoveContainer" containerID="4b35c2642b0fbfa8b82fb2979a569540e576282cae497e4886d8ec6acfab2183" Mar 20 13:57:35 crc kubenswrapper[4973]: I0320 13:57:35.086559 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-h4jz5" Mar 20 13:57:35 crc kubenswrapper[4973]: I0320 13:57:35.112075 4973 scope.go:117] "RemoveContainer" containerID="4d9105e289ccbde76460d09f1ee5653ee31ad849d04b5bc0dfef28f2fee5b83d" Mar 20 13:57:35 crc kubenswrapper[4973]: I0320 13:57:35.129783 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-h4jz5"] Mar 20 13:57:35 crc kubenswrapper[4973]: I0320 13:57:35.139784 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-h4jz5"] Mar 20 13:57:35 crc kubenswrapper[4973]: I0320 13:57:35.139901 4973 scope.go:117] "RemoveContainer" containerID="8657e8432c53a3c99d36afc935649784c5a54ecb38eabeffb8d38354094d137f" Mar 20 13:57:35 crc kubenswrapper[4973]: I0320 13:57:35.190749 4973 scope.go:117] "RemoveContainer" containerID="4b35c2642b0fbfa8b82fb2979a569540e576282cae497e4886d8ec6acfab2183" Mar 20 13:57:35 crc kubenswrapper[4973]: E0320 13:57:35.191592 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b35c2642b0fbfa8b82fb2979a569540e576282cae497e4886d8ec6acfab2183\": container with ID starting with 4b35c2642b0fbfa8b82fb2979a569540e576282cae497e4886d8ec6acfab2183 not found: ID does not exist" containerID="4b35c2642b0fbfa8b82fb2979a569540e576282cae497e4886d8ec6acfab2183" Mar 20 13:57:35 crc kubenswrapper[4973]: I0320 13:57:35.191623 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b35c2642b0fbfa8b82fb2979a569540e576282cae497e4886d8ec6acfab2183"} err="failed to get container status \"4b35c2642b0fbfa8b82fb2979a569540e576282cae497e4886d8ec6acfab2183\": rpc error: code = NotFound desc = could not find container \"4b35c2642b0fbfa8b82fb2979a569540e576282cae497e4886d8ec6acfab2183\": container with ID starting with 4b35c2642b0fbfa8b82fb2979a569540e576282cae497e4886d8ec6acfab2183 not found: ID does 
not exist" Mar 20 13:57:35 crc kubenswrapper[4973]: I0320 13:57:35.191663 4973 scope.go:117] "RemoveContainer" containerID="4d9105e289ccbde76460d09f1ee5653ee31ad849d04b5bc0dfef28f2fee5b83d" Mar 20 13:57:35 crc kubenswrapper[4973]: E0320 13:57:35.191964 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d9105e289ccbde76460d09f1ee5653ee31ad849d04b5bc0dfef28f2fee5b83d\": container with ID starting with 4d9105e289ccbde76460d09f1ee5653ee31ad849d04b5bc0dfef28f2fee5b83d not found: ID does not exist" containerID="4d9105e289ccbde76460d09f1ee5653ee31ad849d04b5bc0dfef28f2fee5b83d" Mar 20 13:57:35 crc kubenswrapper[4973]: I0320 13:57:35.192015 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d9105e289ccbde76460d09f1ee5653ee31ad849d04b5bc0dfef28f2fee5b83d"} err="failed to get container status \"4d9105e289ccbde76460d09f1ee5653ee31ad849d04b5bc0dfef28f2fee5b83d\": rpc error: code = NotFound desc = could not find container \"4d9105e289ccbde76460d09f1ee5653ee31ad849d04b5bc0dfef28f2fee5b83d\": container with ID starting with 4d9105e289ccbde76460d09f1ee5653ee31ad849d04b5bc0dfef28f2fee5b83d not found: ID does not exist" Mar 20 13:57:35 crc kubenswrapper[4973]: I0320 13:57:35.192047 4973 scope.go:117] "RemoveContainer" containerID="8657e8432c53a3c99d36afc935649784c5a54ecb38eabeffb8d38354094d137f" Mar 20 13:57:35 crc kubenswrapper[4973]: E0320 13:57:35.192383 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8657e8432c53a3c99d36afc935649784c5a54ecb38eabeffb8d38354094d137f\": container with ID starting with 8657e8432c53a3c99d36afc935649784c5a54ecb38eabeffb8d38354094d137f not found: ID does not exist" containerID="8657e8432c53a3c99d36afc935649784c5a54ecb38eabeffb8d38354094d137f" Mar 20 13:57:35 crc kubenswrapper[4973]: I0320 13:57:35.192445 4973 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8657e8432c53a3c99d36afc935649784c5a54ecb38eabeffb8d38354094d137f"} err="failed to get container status \"8657e8432c53a3c99d36afc935649784c5a54ecb38eabeffb8d38354094d137f\": rpc error: code = NotFound desc = could not find container \"8657e8432c53a3c99d36afc935649784c5a54ecb38eabeffb8d38354094d137f\": container with ID starting with 8657e8432c53a3c99d36afc935649784c5a54ecb38eabeffb8d38354094d137f not found: ID does not exist" Mar 20 13:57:35 crc kubenswrapper[4973]: I0320 13:57:35.963268 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e26365b-1585-4ea4-b2f0-7405f36c38d1" path="/var/lib/kubelet/pods/0e26365b-1585-4ea4-b2f0-7405f36c38d1/volumes" Mar 20 13:57:52 crc kubenswrapper[4973]: I0320 13:57:52.047767 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cl4pf"] Mar 20 13:57:52 crc kubenswrapper[4973]: I0320 13:57:52.060735 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-cl4pf"] Mar 20 13:57:53 crc kubenswrapper[4973]: I0320 13:57:53.967907 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="066916b9-4270-42bd-bcd1-3fd26bd65a9e" path="/var/lib/kubelet/pods/066916b9-4270-42bd-bcd1-3fd26bd65a9e/volumes" Mar 20 13:57:59 crc kubenswrapper[4973]: I0320 13:57:59.777001 4973 scope.go:117] "RemoveContainer" containerID="d7d22475246e6e4e91c8c765cc857280240fcdc8df652e20dc5511e8c4bf93a1" Mar 20 13:57:59 crc kubenswrapper[4973]: I0320 13:57:59.804950 4973 scope.go:117] "RemoveContainer" containerID="0c1369a0c6c79678a9256215b93c6834c97a2669821cd309f83bf0f0db642d27" Mar 20 13:57:59 crc kubenswrapper[4973]: I0320 13:57:59.883292 4973 scope.go:117] "RemoveContainer" containerID="210d5c181fb4f04e97bfc605675b4d3f7ca4c477d78dfbb351f46f194c573ca4" Mar 20 13:57:59 crc kubenswrapper[4973]: I0320 13:57:59.942299 4973 scope.go:117] "RemoveContainer" 
containerID="3b8c4124bd0aef0f3c36277c16d5fcfeb94a2c582b7e38567d2feae6b95ee4b2" Mar 20 13:57:59 crc kubenswrapper[4973]: I0320 13:57:59.990080 4973 scope.go:117] "RemoveContainer" containerID="ae3d8aa69626632f565d96c23915d440edfa3fe8a10884a7f681f436b737ab0c" Mar 20 13:58:00 crc kubenswrapper[4973]: I0320 13:58:00.057658 4973 scope.go:117] "RemoveContainer" containerID="5cf9ce1eea02e37bf7bdaa07a9c80d5983c431baf6d4859bcd48ce739ee58526" Mar 20 13:58:00 crc kubenswrapper[4973]: I0320 13:58:00.116381 4973 scope.go:117] "RemoveContainer" containerID="594b59f55856a3ae9b944997d88058dcedcd548ee1690cbd564fb3a702344490" Mar 20 13:58:00 crc kubenswrapper[4973]: I0320 13:58:00.154970 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566918-hj4mt"] Mar 20 13:58:00 crc kubenswrapper[4973]: E0320 13:58:00.155585 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e26365b-1585-4ea4-b2f0-7405f36c38d1" containerName="extract-utilities" Mar 20 13:58:00 crc kubenswrapper[4973]: I0320 13:58:00.155607 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e26365b-1585-4ea4-b2f0-7405f36c38d1" containerName="extract-utilities" Mar 20 13:58:00 crc kubenswrapper[4973]: E0320 13:58:00.155619 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e26365b-1585-4ea4-b2f0-7405f36c38d1" containerName="registry-server" Mar 20 13:58:00 crc kubenswrapper[4973]: I0320 13:58:00.155627 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e26365b-1585-4ea4-b2f0-7405f36c38d1" containerName="registry-server" Mar 20 13:58:00 crc kubenswrapper[4973]: E0320 13:58:00.155696 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e26365b-1585-4ea4-b2f0-7405f36c38d1" containerName="extract-content" Mar 20 13:58:00 crc kubenswrapper[4973]: I0320 13:58:00.155705 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e26365b-1585-4ea4-b2f0-7405f36c38d1" containerName="extract-content" Mar 20 13:58:00 
crc kubenswrapper[4973]: I0320 13:58:00.155996 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e26365b-1585-4ea4-b2f0-7405f36c38d1" containerName="registry-server" Mar 20 13:58:00 crc kubenswrapper[4973]: I0320 13:58:00.157009 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-hj4mt" Mar 20 13:58:00 crc kubenswrapper[4973]: I0320 13:58:00.159949 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 13:58:00 crc kubenswrapper[4973]: I0320 13:58:00.159959 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 13:58:00 crc kubenswrapper[4973]: I0320 13:58:00.160021 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 13:58:00 crc kubenswrapper[4973]: I0320 13:58:00.177931 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566918-hj4mt"] Mar 20 13:58:00 crc kubenswrapper[4973]: I0320 13:58:00.348297 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mmbl\" (UniqueName: \"kubernetes.io/projected/f20225fb-dbbe-455f-8125-634ba343eae7-kube-api-access-2mmbl\") pod \"auto-csr-approver-29566918-hj4mt\" (UID: \"f20225fb-dbbe-455f-8125-634ba343eae7\") " pod="openshift-infra/auto-csr-approver-29566918-hj4mt" Mar 20 13:58:00 crc kubenswrapper[4973]: I0320 13:58:00.450981 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mmbl\" (UniqueName: \"kubernetes.io/projected/f20225fb-dbbe-455f-8125-634ba343eae7-kube-api-access-2mmbl\") pod \"auto-csr-approver-29566918-hj4mt\" (UID: \"f20225fb-dbbe-455f-8125-634ba343eae7\") " pod="openshift-infra/auto-csr-approver-29566918-hj4mt" Mar 20 13:58:00 crc kubenswrapper[4973]: I0320 
13:58:00.470446 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mmbl\" (UniqueName: \"kubernetes.io/projected/f20225fb-dbbe-455f-8125-634ba343eae7-kube-api-access-2mmbl\") pod \"auto-csr-approver-29566918-hj4mt\" (UID: \"f20225fb-dbbe-455f-8125-634ba343eae7\") " pod="openshift-infra/auto-csr-approver-29566918-hj4mt" Mar 20 13:58:00 crc kubenswrapper[4973]: I0320 13:58:00.488834 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-hj4mt" Mar 20 13:58:01 crc kubenswrapper[4973]: I0320 13:58:01.001382 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566918-hj4mt"] Mar 20 13:58:01 crc kubenswrapper[4973]: W0320 13:58:01.001908 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf20225fb_dbbe_455f_8125_634ba343eae7.slice/crio-e4b5273b66ab5fff12dc919c08365d07bbfebb52a24986df51eba111b4544157 WatchSource:0}: Error finding container e4b5273b66ab5fff12dc919c08365d07bbfebb52a24986df51eba111b4544157: Status 404 returned error can't find the container with id e4b5273b66ab5fff12dc919c08365d07bbfebb52a24986df51eba111b4544157 Mar 20 13:58:01 crc kubenswrapper[4973]: I0320 13:58:01.367079 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566918-hj4mt" event={"ID":"f20225fb-dbbe-455f-8125-634ba343eae7","Type":"ContainerStarted","Data":"e4b5273b66ab5fff12dc919c08365d07bbfebb52a24986df51eba111b4544157"} Mar 20 13:58:03 crc kubenswrapper[4973]: I0320 13:58:03.407145 4973 generic.go:334] "Generic (PLEG): container finished" podID="f20225fb-dbbe-455f-8125-634ba343eae7" containerID="8864e77ac616f05259cc17aefd544daa37850026e094a5d998a441eebfcc8ed9" exitCode=0 Mar 20 13:58:03 crc kubenswrapper[4973]: I0320 13:58:03.408380 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29566918-hj4mt" event={"ID":"f20225fb-dbbe-455f-8125-634ba343eae7","Type":"ContainerDied","Data":"8864e77ac616f05259cc17aefd544daa37850026e094a5d998a441eebfcc8ed9"} Mar 20 13:58:04 crc kubenswrapper[4973]: I0320 13:58:04.891125 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-hj4mt" Mar 20 13:58:04 crc kubenswrapper[4973]: I0320 13:58:04.961520 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mmbl\" (UniqueName: \"kubernetes.io/projected/f20225fb-dbbe-455f-8125-634ba343eae7-kube-api-access-2mmbl\") pod \"f20225fb-dbbe-455f-8125-634ba343eae7\" (UID: \"f20225fb-dbbe-455f-8125-634ba343eae7\") " Mar 20 13:58:04 crc kubenswrapper[4973]: I0320 13:58:04.970755 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f20225fb-dbbe-455f-8125-634ba343eae7-kube-api-access-2mmbl" (OuterVolumeSpecName: "kube-api-access-2mmbl") pod "f20225fb-dbbe-455f-8125-634ba343eae7" (UID: "f20225fb-dbbe-455f-8125-634ba343eae7"). InnerVolumeSpecName "kube-api-access-2mmbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:58:05 crc kubenswrapper[4973]: I0320 13:58:05.073464 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mmbl\" (UniqueName: \"kubernetes.io/projected/f20225fb-dbbe-455f-8125-634ba343eae7-kube-api-access-2mmbl\") on node \"crc\" DevicePath \"\"" Mar 20 13:58:05 crc kubenswrapper[4973]: I0320 13:58:05.430072 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566918-hj4mt" event={"ID":"f20225fb-dbbe-455f-8125-634ba343eae7","Type":"ContainerDied","Data":"e4b5273b66ab5fff12dc919c08365d07bbfebb52a24986df51eba111b4544157"} Mar 20 13:58:05 crc kubenswrapper[4973]: I0320 13:58:05.430418 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4b5273b66ab5fff12dc919c08365d07bbfebb52a24986df51eba111b4544157" Mar 20 13:58:05 crc kubenswrapper[4973]: I0320 13:58:05.430159 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566918-hj4mt" Mar 20 13:58:05 crc kubenswrapper[4973]: I0320 13:58:05.989195 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-gjjsx"] Mar 20 13:58:06 crc kubenswrapper[4973]: I0320 13:58:06.003379 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566912-gjjsx"] Mar 20 13:58:07 crc kubenswrapper[4973]: I0320 13:58:07.962773 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f64f4919-8d92-441b-be1d-84a900cbe013" path="/var/lib/kubelet/pods/f64f4919-8d92-441b-be1d-84a900cbe013/volumes" Mar 20 13:58:15 crc kubenswrapper[4973]: I0320 13:58:15.043808 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-eeca-account-create-update-q7rsh"] Mar 20 13:58:15 crc kubenswrapper[4973]: I0320 13:58:15.056537 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/aodh-db-create-f7kbp"] Mar 20 13:58:15 crc kubenswrapper[4973]: I0320 13:58:15.071017 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-eeca-account-create-update-q7rsh"] Mar 20 13:58:15 crc kubenswrapper[4973]: I0320 13:58:15.080683 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-f7kbp"] Mar 20 13:58:15 crc kubenswrapper[4973]: I0320 13:58:15.963973 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cc72d03-7a34-42ab-ad05-54dea5adfa04" path="/var/lib/kubelet/pods/4cc72d03-7a34-42ab-ad05-54dea5adfa04/volumes" Mar 20 13:58:15 crc kubenswrapper[4973]: I0320 13:58:15.964947 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0c659f5-9f0d-4133-a245-7def431f0b1a" path="/var/lib/kubelet/pods/e0c659f5-9f0d-4133-a245-7def431f0b1a/volumes" Mar 20 13:58:26 crc kubenswrapper[4973]: I0320 13:58:26.647580 4973 generic.go:334] "Generic (PLEG): container finished" podID="5d8302de-98a7-45e4-ab44-fddc83ce2f4b" containerID="9d640e5d866bbab313840486a79a923a61228333975b6857799450af821617c9" exitCode=0 Mar 20 13:58:26 crc kubenswrapper[4973]: I0320 13:58:26.647761 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7" event={"ID":"5d8302de-98a7-45e4-ab44-fddc83ce2f4b","Type":"ContainerDied","Data":"9d640e5d866bbab313840486a79a923a61228333975b6857799450af821617c9"} Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.246198 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.357505 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d8302de-98a7-45e4-ab44-fddc83ce2f4b-ssh-key-openstack-edpm-ipam\") pod \"5d8302de-98a7-45e4-ab44-fddc83ce2f4b\" (UID: \"5d8302de-98a7-45e4-ab44-fddc83ce2f4b\") " Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.357644 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcxlf\" (UniqueName: \"kubernetes.io/projected/5d8302de-98a7-45e4-ab44-fddc83ce2f4b-kube-api-access-fcxlf\") pod \"5d8302de-98a7-45e4-ab44-fddc83ce2f4b\" (UID: \"5d8302de-98a7-45e4-ab44-fddc83ce2f4b\") " Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.357776 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d8302de-98a7-45e4-ab44-fddc83ce2f4b-inventory\") pod \"5d8302de-98a7-45e4-ab44-fddc83ce2f4b\" (UID: \"5d8302de-98a7-45e4-ab44-fddc83ce2f4b\") " Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.363893 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d8302de-98a7-45e4-ab44-fddc83ce2f4b-kube-api-access-fcxlf" (OuterVolumeSpecName: "kube-api-access-fcxlf") pod "5d8302de-98a7-45e4-ab44-fddc83ce2f4b" (UID: "5d8302de-98a7-45e4-ab44-fddc83ce2f4b"). InnerVolumeSpecName "kube-api-access-fcxlf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.397886 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8302de-98a7-45e4-ab44-fddc83ce2f4b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5d8302de-98a7-45e4-ab44-fddc83ce2f4b" (UID: "5d8302de-98a7-45e4-ab44-fddc83ce2f4b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.401784 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d8302de-98a7-45e4-ab44-fddc83ce2f4b-inventory" (OuterVolumeSpecName: "inventory") pod "5d8302de-98a7-45e4-ab44-fddc83ce2f4b" (UID: "5d8302de-98a7-45e4-ab44-fddc83ce2f4b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.461189 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5d8302de-98a7-45e4-ab44-fddc83ce2f4b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.461222 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcxlf\" (UniqueName: \"kubernetes.io/projected/5d8302de-98a7-45e4-ab44-fddc83ce2f4b-kube-api-access-fcxlf\") on node \"crc\" DevicePath \"\"" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.461235 4973 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5d8302de-98a7-45e4-ab44-fddc83ce2f4b-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.675163 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7" 
event={"ID":"5d8302de-98a7-45e4-ab44-fddc83ce2f4b","Type":"ContainerDied","Data":"3b31d7e215704098959ccd6034506690949d514f8b1413ba3b18895118f32d5c"} Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.675230 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b31d7e215704098959ccd6034506690949d514f8b1413ba3b18895118f32d5c" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.675310 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.781050 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj"] Mar 20 13:58:28 crc kubenswrapper[4973]: E0320 13:58:28.782095 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d8302de-98a7-45e4-ab44-fddc83ce2f4b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.782119 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d8302de-98a7-45e4-ab44-fddc83ce2f4b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 13:58:28 crc kubenswrapper[4973]: E0320 13:58:28.782162 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f20225fb-dbbe-455f-8125-634ba343eae7" containerName="oc" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.782172 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f20225fb-dbbe-455f-8125-634ba343eae7" containerName="oc" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.787545 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="f20225fb-dbbe-455f-8125-634ba343eae7" containerName="oc" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.787658 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d8302de-98a7-45e4-ab44-fddc83ce2f4b" 
containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.791527 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.802442 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.802760 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d4wjc" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.805998 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.806958 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.831503 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj"] Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.977089 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/72622eaf-f7fc-44a6-9700-4de40de09009-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj\" (UID: \"72622eaf-f7fc-44a6-9700-4de40de09009\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.977486 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72622eaf-f7fc-44a6-9700-4de40de09009-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj\" (UID: \"72622eaf-f7fc-44a6-9700-4de40de09009\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj" Mar 20 13:58:28 crc kubenswrapper[4973]: I0320 13:58:28.977557 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-829dz\" (UniqueName: \"kubernetes.io/projected/72622eaf-f7fc-44a6-9700-4de40de09009-kube-api-access-829dz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj\" (UID: \"72622eaf-f7fc-44a6-9700-4de40de09009\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj" Mar 20 13:58:29 crc kubenswrapper[4973]: I0320 13:58:29.082208 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-829dz\" (UniqueName: \"kubernetes.io/projected/72622eaf-f7fc-44a6-9700-4de40de09009-kube-api-access-829dz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj\" (UID: \"72622eaf-f7fc-44a6-9700-4de40de09009\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj" Mar 20 13:58:29 crc kubenswrapper[4973]: I0320 13:58:29.082310 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/72622eaf-f7fc-44a6-9700-4de40de09009-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj\" (UID: \"72622eaf-f7fc-44a6-9700-4de40de09009\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj" Mar 20 13:58:29 crc kubenswrapper[4973]: I0320 13:58:29.082649 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72622eaf-f7fc-44a6-9700-4de40de09009-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj\" (UID: \"72622eaf-f7fc-44a6-9700-4de40de09009\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj" Mar 20 13:58:29 crc kubenswrapper[4973]: I0320 13:58:29.088941 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/72622eaf-f7fc-44a6-9700-4de40de09009-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj\" (UID: \"72622eaf-f7fc-44a6-9700-4de40de09009\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj" Mar 20 13:58:29 crc kubenswrapper[4973]: I0320 13:58:29.092676 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72622eaf-f7fc-44a6-9700-4de40de09009-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj\" (UID: \"72622eaf-f7fc-44a6-9700-4de40de09009\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj" Mar 20 13:58:29 crc kubenswrapper[4973]: I0320 13:58:29.101437 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-829dz\" (UniqueName: \"kubernetes.io/projected/72622eaf-f7fc-44a6-9700-4de40de09009-kube-api-access-829dz\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj\" (UID: \"72622eaf-f7fc-44a6-9700-4de40de09009\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj" Mar 20 13:58:29 crc kubenswrapper[4973]: I0320 13:58:29.128498 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj" Mar 20 13:58:29 crc kubenswrapper[4973]: I0320 13:58:29.735584 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj"] Mar 20 13:58:30 crc kubenswrapper[4973]: I0320 13:58:30.694256 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj" event={"ID":"72622eaf-f7fc-44a6-9700-4de40de09009","Type":"ContainerStarted","Data":"8ea3ecbc32ea140833b1b751edb0c6f71209bbc968bc66094414f1da9d4b878a"} Mar 20 13:58:30 crc kubenswrapper[4973]: I0320 13:58:30.694624 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj" event={"ID":"72622eaf-f7fc-44a6-9700-4de40de09009","Type":"ContainerStarted","Data":"4a7b42dcb1708a69108a1c1c1183426ce92ded2108a54392431f095342de6719"} Mar 20 13:58:30 crc kubenswrapper[4973]: I0320 13:58:30.715101 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj" podStartSLOduration=2.138654436 podStartE2EDuration="2.715084094s" podCreationTimestamp="2026-03-20 13:58:28 +0000 UTC" firstStartedPulling="2026-03-20 13:58:29.746815532 +0000 UTC m=+2230.490485276" lastFinishedPulling="2026-03-20 13:58:30.3232452 +0000 UTC m=+2231.066914934" observedRunningTime="2026-03-20 13:58:30.711100895 +0000 UTC m=+2231.454770649" watchObservedRunningTime="2026-03-20 13:58:30.715084094 +0000 UTC m=+2231.458753828" Mar 20 13:58:35 crc kubenswrapper[4973]: I0320 13:58:35.756510 4973 generic.go:334] "Generic (PLEG): container finished" podID="72622eaf-f7fc-44a6-9700-4de40de09009" containerID="8ea3ecbc32ea140833b1b751edb0c6f71209bbc968bc66094414f1da9d4b878a" exitCode=0 Mar 20 13:58:35 crc kubenswrapper[4973]: I0320 13:58:35.756591 4973 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj" event={"ID":"72622eaf-f7fc-44a6-9700-4de40de09009","Type":"ContainerDied","Data":"8ea3ecbc32ea140833b1b751edb0c6f71209bbc968bc66094414f1da9d4b878a"} Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.363897 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj" Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.434251 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/72622eaf-f7fc-44a6-9700-4de40de09009-ssh-key-openstack-edpm-ipam\") pod \"72622eaf-f7fc-44a6-9700-4de40de09009\" (UID: \"72622eaf-f7fc-44a6-9700-4de40de09009\") " Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.434591 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72622eaf-f7fc-44a6-9700-4de40de09009-inventory\") pod \"72622eaf-f7fc-44a6-9700-4de40de09009\" (UID: \"72622eaf-f7fc-44a6-9700-4de40de09009\") " Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.434650 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-829dz\" (UniqueName: \"kubernetes.io/projected/72622eaf-f7fc-44a6-9700-4de40de09009-kube-api-access-829dz\") pod \"72622eaf-f7fc-44a6-9700-4de40de09009\" (UID: \"72622eaf-f7fc-44a6-9700-4de40de09009\") " Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.446890 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72622eaf-f7fc-44a6-9700-4de40de09009-kube-api-access-829dz" (OuterVolumeSpecName: "kube-api-access-829dz") pod "72622eaf-f7fc-44a6-9700-4de40de09009" (UID: "72622eaf-f7fc-44a6-9700-4de40de09009"). InnerVolumeSpecName "kube-api-access-829dz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.467469 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72622eaf-f7fc-44a6-9700-4de40de09009-inventory" (OuterVolumeSpecName: "inventory") pod "72622eaf-f7fc-44a6-9700-4de40de09009" (UID: "72622eaf-f7fc-44a6-9700-4de40de09009"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.467895 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72622eaf-f7fc-44a6-9700-4de40de09009-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "72622eaf-f7fc-44a6-9700-4de40de09009" (UID: "72622eaf-f7fc-44a6-9700-4de40de09009"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.537407 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/72622eaf-f7fc-44a6-9700-4de40de09009-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.537451 4973 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/72622eaf-f7fc-44a6-9700-4de40de09009-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.537464 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-829dz\" (UniqueName: \"kubernetes.io/projected/72622eaf-f7fc-44a6-9700-4de40de09009-kube-api-access-829dz\") on node \"crc\" DevicePath \"\"" Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.821826 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj" 
event={"ID":"72622eaf-f7fc-44a6-9700-4de40de09009","Type":"ContainerDied","Data":"4a7b42dcb1708a69108a1c1c1183426ce92ded2108a54392431f095342de6719"} Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.822135 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a7b42dcb1708a69108a1c1c1183426ce92ded2108a54392431f095342de6719" Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.822192 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj" Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.874549 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq"] Mar 20 13:58:37 crc kubenswrapper[4973]: E0320 13:58:37.875199 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72622eaf-f7fc-44a6-9700-4de40de09009" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.875225 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="72622eaf-f7fc-44a6-9700-4de40de09009" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.875671 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="72622eaf-f7fc-44a6-9700-4de40de09009" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.876715 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq" Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.880615 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d4wjc" Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.881010 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.881171 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.881758 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.895889 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq"] Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.947959 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-528zb\" (UniqueName: \"kubernetes.io/projected/9456a1ee-7677-4176-9dd8-ec10621b434f-kube-api-access-528zb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-76xjq\" (UID: \"9456a1ee-7677-4176-9dd8-ec10621b434f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq" Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 13:58:37.948004 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9456a1ee-7677-4176-9dd8-ec10621b434f-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-76xjq\" (UID: \"9456a1ee-7677-4176-9dd8-ec10621b434f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq" Mar 20 13:58:37 crc kubenswrapper[4973]: I0320 
13:58:37.948173 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9456a1ee-7677-4176-9dd8-ec10621b434f-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-76xjq\" (UID: \"9456a1ee-7677-4176-9dd8-ec10621b434f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq" Mar 20 13:58:38 crc kubenswrapper[4973]: I0320 13:58:38.050248 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9456a1ee-7677-4176-9dd8-ec10621b434f-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-76xjq\" (UID: \"9456a1ee-7677-4176-9dd8-ec10621b434f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq" Mar 20 13:58:38 crc kubenswrapper[4973]: I0320 13:58:38.050456 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-528zb\" (UniqueName: \"kubernetes.io/projected/9456a1ee-7677-4176-9dd8-ec10621b434f-kube-api-access-528zb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-76xjq\" (UID: \"9456a1ee-7677-4176-9dd8-ec10621b434f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq" Mar 20 13:58:38 crc kubenswrapper[4973]: I0320 13:58:38.050518 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9456a1ee-7677-4176-9dd8-ec10621b434f-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-76xjq\" (UID: \"9456a1ee-7677-4176-9dd8-ec10621b434f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq" Mar 20 13:58:38 crc kubenswrapper[4973]: I0320 13:58:38.055802 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/9456a1ee-7677-4176-9dd8-ec10621b434f-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-76xjq\" (UID: \"9456a1ee-7677-4176-9dd8-ec10621b434f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq" Mar 20 13:58:38 crc kubenswrapper[4973]: I0320 13:58:38.067083 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9456a1ee-7677-4176-9dd8-ec10621b434f-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-76xjq\" (UID: \"9456a1ee-7677-4176-9dd8-ec10621b434f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq" Mar 20 13:58:38 crc kubenswrapper[4973]: I0320 13:58:38.067602 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-528zb\" (UniqueName: \"kubernetes.io/projected/9456a1ee-7677-4176-9dd8-ec10621b434f-kube-api-access-528zb\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-76xjq\" (UID: \"9456a1ee-7677-4176-9dd8-ec10621b434f\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq" Mar 20 13:58:38 crc kubenswrapper[4973]: I0320 13:58:38.221956 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq" Mar 20 13:58:38 crc kubenswrapper[4973]: I0320 13:58:38.866019 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq"] Mar 20 13:58:39 crc kubenswrapper[4973]: I0320 13:58:39.843363 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq" event={"ID":"9456a1ee-7677-4176-9dd8-ec10621b434f","Type":"ContainerStarted","Data":"c8afe42d2157828d53deaeff30b9d36f4276758d66bf2a7de2ce207136f83802"} Mar 20 13:58:39 crc kubenswrapper[4973]: I0320 13:58:39.843674 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq" event={"ID":"9456a1ee-7677-4176-9dd8-ec10621b434f","Type":"ContainerStarted","Data":"cf502699779ee954dd14855f86515ff1246eb0a4d823d3eb78ac2614e23c9902"} Mar 20 13:58:39 crc kubenswrapper[4973]: I0320 13:58:39.869313 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq" podStartSLOduration=2.403125636 podStartE2EDuration="2.869291609s" podCreationTimestamp="2026-03-20 13:58:37 +0000 UTC" firstStartedPulling="2026-03-20 13:58:38.876690649 +0000 UTC m=+2239.620360393" lastFinishedPulling="2026-03-20 13:58:39.342856622 +0000 UTC m=+2240.086526366" observedRunningTime="2026-03-20 13:58:39.868176218 +0000 UTC m=+2240.611845962" watchObservedRunningTime="2026-03-20 13:58:39.869291609 +0000 UTC m=+2240.612961353" Mar 20 13:58:43 crc kubenswrapper[4973]: I0320 13:58:43.320221 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:58:43 crc 
kubenswrapper[4973]: I0320 13:58:43.320902 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:58:57 crc kubenswrapper[4973]: I0320 13:58:57.060491 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-xm9xj"] Mar 20 13:58:57 crc kubenswrapper[4973]: I0320 13:58:57.074230 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-xm9xj"] Mar 20 13:58:57 crc kubenswrapper[4973]: I0320 13:58:57.963959 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b9f729e-dda0-4ad0-a8fc-3f0365b27947" path="/var/lib/kubelet/pods/6b9f729e-dda0-4ad0-a8fc-3f0365b27947/volumes" Mar 20 13:58:58 crc kubenswrapper[4973]: I0320 13:58:58.039151 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wwf69"] Mar 20 13:58:58 crc kubenswrapper[4973]: I0320 13:58:58.050122 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-wwf69"] Mar 20 13:58:59 crc kubenswrapper[4973]: I0320 13:58:59.963644 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65" path="/var/lib/kubelet/pods/90d5b5b7-ec1d-42bd-bd12-3cbf50d09e65/volumes" Mar 20 13:59:00 crc kubenswrapper[4973]: I0320 13:59:00.321953 4973 scope.go:117] "RemoveContainer" containerID="4ac0de1008deec27af56feed7d6e25375cde44a11d80e888ce389f842527df4c" Mar 20 13:59:00 crc kubenswrapper[4973]: I0320 13:59:00.350581 4973 scope.go:117] "RemoveContainer" containerID="8d7d87ff2805313825bce4cad26271bdd381c5ca0531450eebf153c6a88d38d9" Mar 20 13:59:00 crc kubenswrapper[4973]: I0320 13:59:00.418416 4973 scope.go:117] 
"RemoveContainer" containerID="4eb493f0158eb6368e3711b22e4fdbccb247f467bce476b683654be02cff9320" Mar 20 13:59:00 crc kubenswrapper[4973]: I0320 13:59:00.494130 4973 scope.go:117] "RemoveContainer" containerID="9421d4a8f747a4ff43be42f3ae0f3390f9170da9651133851f6efb1a308c24e5" Mar 20 13:59:00 crc kubenswrapper[4973]: I0320 13:59:00.549632 4973 scope.go:117] "RemoveContainer" containerID="7280559e6e57805be6a95dcd864b8a1d929a3706a84959bbc75bf834c7986217" Mar 20 13:59:03 crc kubenswrapper[4973]: I0320 13:59:03.779591 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8ktr2"] Mar 20 13:59:03 crc kubenswrapper[4973]: I0320 13:59:03.782984 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ktr2" Mar 20 13:59:03 crc kubenswrapper[4973]: I0320 13:59:03.797678 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ktr2"] Mar 20 13:59:03 crc kubenswrapper[4973]: I0320 13:59:03.885612 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28ff54ef-d0a1-4008-9c24-977c6c8c7680-catalog-content\") pod \"redhat-marketplace-8ktr2\" (UID: \"28ff54ef-d0a1-4008-9c24-977c6c8c7680\") " pod="openshift-marketplace/redhat-marketplace-8ktr2" Mar 20 13:59:03 crc kubenswrapper[4973]: I0320 13:59:03.885962 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28ff54ef-d0a1-4008-9c24-977c6c8c7680-utilities\") pod \"redhat-marketplace-8ktr2\" (UID: \"28ff54ef-d0a1-4008-9c24-977c6c8c7680\") " pod="openshift-marketplace/redhat-marketplace-8ktr2" Mar 20 13:59:03 crc kubenswrapper[4973]: I0320 13:59:03.886081 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvhzx\" 
(UniqueName: \"kubernetes.io/projected/28ff54ef-d0a1-4008-9c24-977c6c8c7680-kube-api-access-zvhzx\") pod \"redhat-marketplace-8ktr2\" (UID: \"28ff54ef-d0a1-4008-9c24-977c6c8c7680\") " pod="openshift-marketplace/redhat-marketplace-8ktr2" Mar 20 13:59:03 crc kubenswrapper[4973]: I0320 13:59:03.990200 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28ff54ef-d0a1-4008-9c24-977c6c8c7680-catalog-content\") pod \"redhat-marketplace-8ktr2\" (UID: \"28ff54ef-d0a1-4008-9c24-977c6c8c7680\") " pod="openshift-marketplace/redhat-marketplace-8ktr2" Mar 20 13:59:03 crc kubenswrapper[4973]: I0320 13:59:03.990407 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28ff54ef-d0a1-4008-9c24-977c6c8c7680-utilities\") pod \"redhat-marketplace-8ktr2\" (UID: \"28ff54ef-d0a1-4008-9c24-977c6c8c7680\") " pod="openshift-marketplace/redhat-marketplace-8ktr2" Mar 20 13:59:03 crc kubenswrapper[4973]: I0320 13:59:03.990484 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvhzx\" (UniqueName: \"kubernetes.io/projected/28ff54ef-d0a1-4008-9c24-977c6c8c7680-kube-api-access-zvhzx\") pod \"redhat-marketplace-8ktr2\" (UID: \"28ff54ef-d0a1-4008-9c24-977c6c8c7680\") " pod="openshift-marketplace/redhat-marketplace-8ktr2" Mar 20 13:59:03 crc kubenswrapper[4973]: I0320 13:59:03.990817 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28ff54ef-d0a1-4008-9c24-977c6c8c7680-catalog-content\") pod \"redhat-marketplace-8ktr2\" (UID: \"28ff54ef-d0a1-4008-9c24-977c6c8c7680\") " pod="openshift-marketplace/redhat-marketplace-8ktr2" Mar 20 13:59:03 crc kubenswrapper[4973]: I0320 13:59:03.990858 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/28ff54ef-d0a1-4008-9c24-977c6c8c7680-utilities\") pod \"redhat-marketplace-8ktr2\" (UID: \"28ff54ef-d0a1-4008-9c24-977c6c8c7680\") " pod="openshift-marketplace/redhat-marketplace-8ktr2" Mar 20 13:59:04 crc kubenswrapper[4973]: I0320 13:59:04.017268 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvhzx\" (UniqueName: \"kubernetes.io/projected/28ff54ef-d0a1-4008-9c24-977c6c8c7680-kube-api-access-zvhzx\") pod \"redhat-marketplace-8ktr2\" (UID: \"28ff54ef-d0a1-4008-9c24-977c6c8c7680\") " pod="openshift-marketplace/redhat-marketplace-8ktr2" Mar 20 13:59:04 crc kubenswrapper[4973]: I0320 13:59:04.116077 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ktr2" Mar 20 13:59:04 crc kubenswrapper[4973]: I0320 13:59:04.648483 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ktr2"] Mar 20 13:59:05 crc kubenswrapper[4973]: I0320 13:59:05.088738 4973 generic.go:334] "Generic (PLEG): container finished" podID="28ff54ef-d0a1-4008-9c24-977c6c8c7680" containerID="a20440c8066a1e971a8ad2df2402f2070e5af9a4c1ff9abba1b4ee7e5d505fab" exitCode=0 Mar 20 13:59:05 crc kubenswrapper[4973]: I0320 13:59:05.088835 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ktr2" event={"ID":"28ff54ef-d0a1-4008-9c24-977c6c8c7680","Type":"ContainerDied","Data":"a20440c8066a1e971a8ad2df2402f2070e5af9a4c1ff9abba1b4ee7e5d505fab"} Mar 20 13:59:05 crc kubenswrapper[4973]: I0320 13:59:05.089077 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ktr2" event={"ID":"28ff54ef-d0a1-4008-9c24-977c6c8c7680","Type":"ContainerStarted","Data":"6b9703ac4adceeb9c77681648f855a849923b841a08e28d68386f19b1303e0f9"} Mar 20 13:59:06 crc kubenswrapper[4973]: I0320 13:59:06.100643 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-8ktr2" event={"ID":"28ff54ef-d0a1-4008-9c24-977c6c8c7680","Type":"ContainerStarted","Data":"ec11a5c492c85996eee861fc6685326cc4e265f58f9e1452853b3f2197e4afb5"} Mar 20 13:59:07 crc kubenswrapper[4973]: I0320 13:59:07.110931 4973 generic.go:334] "Generic (PLEG): container finished" podID="28ff54ef-d0a1-4008-9c24-977c6c8c7680" containerID="ec11a5c492c85996eee861fc6685326cc4e265f58f9e1452853b3f2197e4afb5" exitCode=0 Mar 20 13:59:07 crc kubenswrapper[4973]: I0320 13:59:07.110982 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ktr2" event={"ID":"28ff54ef-d0a1-4008-9c24-977c6c8c7680","Type":"ContainerDied","Data":"ec11a5c492c85996eee861fc6685326cc4e265f58f9e1452853b3f2197e4afb5"} Mar 20 13:59:08 crc kubenswrapper[4973]: I0320 13:59:08.131679 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ktr2" event={"ID":"28ff54ef-d0a1-4008-9c24-977c6c8c7680","Type":"ContainerStarted","Data":"2f47e887535ec8f4317e6663c17568cc516fefb18728842fcbc21f99bc91e26e"} Mar 20 13:59:08 crc kubenswrapper[4973]: I0320 13:59:08.155510 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8ktr2" podStartSLOduration=2.733773549 podStartE2EDuration="5.15548954s" podCreationTimestamp="2026-03-20 13:59:03 +0000 UTC" firstStartedPulling="2026-03-20 13:59:05.091325608 +0000 UTC m=+2265.834995352" lastFinishedPulling="2026-03-20 13:59:07.513041609 +0000 UTC m=+2268.256711343" observedRunningTime="2026-03-20 13:59:08.153926676 +0000 UTC m=+2268.897596420" watchObservedRunningTime="2026-03-20 13:59:08.15548954 +0000 UTC m=+2268.899159284" Mar 20 13:59:13 crc kubenswrapper[4973]: I0320 13:59:13.320601 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 13:59:13 crc kubenswrapper[4973]: I0320 13:59:13.321176 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 13:59:14 crc kubenswrapper[4973]: I0320 13:59:14.117014 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8ktr2" Mar 20 13:59:14 crc kubenswrapper[4973]: I0320 13:59:14.117383 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8ktr2" Mar 20 13:59:14 crc kubenswrapper[4973]: I0320 13:59:14.165940 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8ktr2" Mar 20 13:59:14 crc kubenswrapper[4973]: I0320 13:59:14.249687 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8ktr2" Mar 20 13:59:14 crc kubenswrapper[4973]: I0320 13:59:14.409245 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ktr2"] Mar 20 13:59:16 crc kubenswrapper[4973]: I0320 13:59:16.212942 4973 generic.go:334] "Generic (PLEG): container finished" podID="9456a1ee-7677-4176-9dd8-ec10621b434f" containerID="c8afe42d2157828d53deaeff30b9d36f4276758d66bf2a7de2ce207136f83802" exitCode=0 Mar 20 13:59:16 crc kubenswrapper[4973]: I0320 13:59:16.213534 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8ktr2" podUID="28ff54ef-d0a1-4008-9c24-977c6c8c7680" containerName="registry-server" 
containerID="cri-o://2f47e887535ec8f4317e6663c17568cc516fefb18728842fcbc21f99bc91e26e" gracePeriod=2 Mar 20 13:59:16 crc kubenswrapper[4973]: I0320 13:59:16.213048 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq" event={"ID":"9456a1ee-7677-4176-9dd8-ec10621b434f","Type":"ContainerDied","Data":"c8afe42d2157828d53deaeff30b9d36f4276758d66bf2a7de2ce207136f83802"} Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.224488 4973 generic.go:334] "Generic (PLEG): container finished" podID="28ff54ef-d0a1-4008-9c24-977c6c8c7680" containerID="2f47e887535ec8f4317e6663c17568cc516fefb18728842fcbc21f99bc91e26e" exitCode=0 Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.224556 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ktr2" event={"ID":"28ff54ef-d0a1-4008-9c24-977c6c8c7680","Type":"ContainerDied","Data":"2f47e887535ec8f4317e6663c17568cc516fefb18728842fcbc21f99bc91e26e"} Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.225415 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8ktr2" event={"ID":"28ff54ef-d0a1-4008-9c24-977c6c8c7680","Type":"ContainerDied","Data":"6b9703ac4adceeb9c77681648f855a849923b841a08e28d68386f19b1303e0f9"} Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.225448 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b9703ac4adceeb9c77681648f855a849923b841a08e28d68386f19b1303e0f9" Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.257193 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ktr2"
Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.359154 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28ff54ef-d0a1-4008-9c24-977c6c8c7680-catalog-content\") pod \"28ff54ef-d0a1-4008-9c24-977c6c8c7680\" (UID: \"28ff54ef-d0a1-4008-9c24-977c6c8c7680\") "
Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.359359 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zvhzx\" (UniqueName: \"kubernetes.io/projected/28ff54ef-d0a1-4008-9c24-977c6c8c7680-kube-api-access-zvhzx\") pod \"28ff54ef-d0a1-4008-9c24-977c6c8c7680\" (UID: \"28ff54ef-d0a1-4008-9c24-977c6c8c7680\") "
Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.359395 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28ff54ef-d0a1-4008-9c24-977c6c8c7680-utilities\") pod \"28ff54ef-d0a1-4008-9c24-977c6c8c7680\" (UID: \"28ff54ef-d0a1-4008-9c24-977c6c8c7680\") "
Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.360744 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28ff54ef-d0a1-4008-9c24-977c6c8c7680-utilities" (OuterVolumeSpecName: "utilities") pod "28ff54ef-d0a1-4008-9c24-977c6c8c7680" (UID: "28ff54ef-d0a1-4008-9c24-977c6c8c7680"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.373564 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28ff54ef-d0a1-4008-9c24-977c6c8c7680-kube-api-access-zvhzx" (OuterVolumeSpecName: "kube-api-access-zvhzx") pod "28ff54ef-d0a1-4008-9c24-977c6c8c7680" (UID: "28ff54ef-d0a1-4008-9c24-977c6c8c7680"). InnerVolumeSpecName "kube-api-access-zvhzx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.404332 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28ff54ef-d0a1-4008-9c24-977c6c8c7680-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "28ff54ef-d0a1-4008-9c24-977c6c8c7680" (UID: "28ff54ef-d0a1-4008-9c24-977c6c8c7680"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.462718 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/28ff54ef-d0a1-4008-9c24-977c6c8c7680-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.462750 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zvhzx\" (UniqueName: \"kubernetes.io/projected/28ff54ef-d0a1-4008-9c24-977c6c8c7680-kube-api-access-zvhzx\") on node \"crc\" DevicePath \"\""
Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.462763 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/28ff54ef-d0a1-4008-9c24-977c6c8c7680-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.703217 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq"
Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.870615 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9456a1ee-7677-4176-9dd8-ec10621b434f-ssh-key-openstack-edpm-ipam\") pod \"9456a1ee-7677-4176-9dd8-ec10621b434f\" (UID: \"9456a1ee-7677-4176-9dd8-ec10621b434f\") "
Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.870665 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9456a1ee-7677-4176-9dd8-ec10621b434f-inventory\") pod \"9456a1ee-7677-4176-9dd8-ec10621b434f\" (UID: \"9456a1ee-7677-4176-9dd8-ec10621b434f\") "
Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.870731 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-528zb\" (UniqueName: \"kubernetes.io/projected/9456a1ee-7677-4176-9dd8-ec10621b434f-kube-api-access-528zb\") pod \"9456a1ee-7677-4176-9dd8-ec10621b434f\" (UID: \"9456a1ee-7677-4176-9dd8-ec10621b434f\") "
Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.876587 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9456a1ee-7677-4176-9dd8-ec10621b434f-kube-api-access-528zb" (OuterVolumeSpecName: "kube-api-access-528zb") pod "9456a1ee-7677-4176-9dd8-ec10621b434f" (UID: "9456a1ee-7677-4176-9dd8-ec10621b434f"). InnerVolumeSpecName "kube-api-access-528zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.905696 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9456a1ee-7677-4176-9dd8-ec10621b434f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9456a1ee-7677-4176-9dd8-ec10621b434f" (UID: "9456a1ee-7677-4176-9dd8-ec10621b434f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.922710 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9456a1ee-7677-4176-9dd8-ec10621b434f-inventory" (OuterVolumeSpecName: "inventory") pod "9456a1ee-7677-4176-9dd8-ec10621b434f" (UID: "9456a1ee-7677-4176-9dd8-ec10621b434f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.974485 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9456a1ee-7677-4176-9dd8-ec10621b434f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.974529 4973 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9456a1ee-7677-4176-9dd8-ec10621b434f-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 13:59:17 crc kubenswrapper[4973]: I0320 13:59:17.974544 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-528zb\" (UniqueName: \"kubernetes.io/projected/9456a1ee-7677-4176-9dd8-ec10621b434f-kube-api-access-528zb\") on node \"crc\" DevicePath \"\""
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.238934 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8ktr2"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.238934 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.238726 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-76xjq" event={"ID":"9456a1ee-7677-4176-9dd8-ec10621b434f","Type":"ContainerDied","Data":"cf502699779ee954dd14855f86515ff1246eb0a4d823d3eb78ac2614e23c9902"}
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.239321 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cf502699779ee954dd14855f86515ff1246eb0a4d823d3eb78ac2614e23c9902"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.275382 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ktr2"]
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.290755 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8ktr2"]
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.330021 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69"]
Mar 20 13:59:18 crc kubenswrapper[4973]: E0320 13:59:18.330985 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ff54ef-d0a1-4008-9c24-977c6c8c7680" containerName="extract-content"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.331005 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ff54ef-d0a1-4008-9c24-977c6c8c7680" containerName="extract-content"
Mar 20 13:59:18 crc kubenswrapper[4973]: E0320 13:59:18.331029 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ff54ef-d0a1-4008-9c24-977c6c8c7680" containerName="registry-server"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.331035 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ff54ef-d0a1-4008-9c24-977c6c8c7680" containerName="registry-server"
Mar 20 13:59:18 crc kubenswrapper[4973]: E0320 13:59:18.331045 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="28ff54ef-d0a1-4008-9c24-977c6c8c7680" containerName="extract-utilities"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.331054 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="28ff54ef-d0a1-4008-9c24-977c6c8c7680" containerName="extract-utilities"
Mar 20 13:59:18 crc kubenswrapper[4973]: E0320 13:59:18.331077 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9456a1ee-7677-4176-9dd8-ec10621b434f" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.331084 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="9456a1ee-7677-4176-9dd8-ec10621b434f" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.331296 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="28ff54ef-d0a1-4008-9c24-977c6c8c7680" containerName="registry-server"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.331327 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="9456a1ee-7677-4176-9dd8-ec10621b434f" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.332130 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.335261 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d4wjc"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.336588 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.336955 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.337137 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.355532 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69"]
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.487458 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkj65\" (UniqueName: \"kubernetes.io/projected/42ef6035-8917-4665-aaab-67b6c8e74ca7-kube-api-access-qkj65\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-trb69\" (UID: \"42ef6035-8917-4665-aaab-67b6c8e74ca7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.487620 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42ef6035-8917-4665-aaab-67b6c8e74ca7-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-trb69\" (UID: \"42ef6035-8917-4665-aaab-67b6c8e74ca7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.487653 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/42ef6035-8917-4665-aaab-67b6c8e74ca7-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-trb69\" (UID: \"42ef6035-8917-4665-aaab-67b6c8e74ca7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.590109 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42ef6035-8917-4665-aaab-67b6c8e74ca7-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-trb69\" (UID: \"42ef6035-8917-4665-aaab-67b6c8e74ca7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.590176 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/42ef6035-8917-4665-aaab-67b6c8e74ca7-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-trb69\" (UID: \"42ef6035-8917-4665-aaab-67b6c8e74ca7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.590396 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkj65\" (UniqueName: \"kubernetes.io/projected/42ef6035-8917-4665-aaab-67b6c8e74ca7-kube-api-access-qkj65\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-trb69\" (UID: \"42ef6035-8917-4665-aaab-67b6c8e74ca7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.603013 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/42ef6035-8917-4665-aaab-67b6c8e74ca7-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-trb69\" (UID: \"42ef6035-8917-4665-aaab-67b6c8e74ca7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.606886 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42ef6035-8917-4665-aaab-67b6c8e74ca7-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-trb69\" (UID: \"42ef6035-8917-4665-aaab-67b6c8e74ca7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.627253 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkj65\" (UniqueName: \"kubernetes.io/projected/42ef6035-8917-4665-aaab-67b6c8e74ca7-kube-api-access-qkj65\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-trb69\" (UID: \"42ef6035-8917-4665-aaab-67b6c8e74ca7\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69"
Mar 20 13:59:18 crc kubenswrapper[4973]: I0320 13:59:18.654754 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69"
Mar 20 13:59:19 crc kubenswrapper[4973]: I0320 13:59:19.380037 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69"]
Mar 20 13:59:19 crc kubenswrapper[4973]: I0320 13:59:19.973270 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28ff54ef-d0a1-4008-9c24-977c6c8c7680" path="/var/lib/kubelet/pods/28ff54ef-d0a1-4008-9c24-977c6c8c7680/volumes"
Mar 20 13:59:20 crc kubenswrapper[4973]: I0320 13:59:20.095842 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 13:59:20 crc kubenswrapper[4973]: I0320 13:59:20.259305 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69" event={"ID":"42ef6035-8917-4665-aaab-67b6c8e74ca7","Type":"ContainerStarted","Data":"9398a97c90c7dee83d22a26464251986e345dadd8feb1f241ca5b90a1bdd71be"}
Mar 20 13:59:21 crc kubenswrapper[4973]: I0320 13:59:21.255902 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rmwkt"]
Mar 20 13:59:21 crc kubenswrapper[4973]: I0320 13:59:21.259696 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rmwkt"
Mar 20 13:59:21 crc kubenswrapper[4973]: I0320 13:59:21.266632 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rmwkt"]
Mar 20 13:59:21 crc kubenswrapper[4973]: I0320 13:59:21.299027 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69" event={"ID":"42ef6035-8917-4665-aaab-67b6c8e74ca7","Type":"ContainerStarted","Data":"ac40fda0038a8d732f18194ae5a62ff93e7597eb30640e86491f40742f404541"}
Mar 20 13:59:21 crc kubenswrapper[4973]: I0320 13:59:21.333842 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69" podStartSLOduration=2.625897824 podStartE2EDuration="3.333823203s" podCreationTimestamp="2026-03-20 13:59:18 +0000 UTC" firstStartedPulling="2026-03-20 13:59:19.383870328 +0000 UTC m=+2280.127540072" lastFinishedPulling="2026-03-20 13:59:20.091795707 +0000 UTC m=+2280.835465451" observedRunningTime="2026-03-20 13:59:21.320787815 +0000 UTC m=+2282.064457559" watchObservedRunningTime="2026-03-20 13:59:21.333823203 +0000 UTC m=+2282.077492947"
Mar 20 13:59:21 crc kubenswrapper[4973]: I0320 13:59:21.362526 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbldv\" (UniqueName: \"kubernetes.io/projected/dbdea6e4-12dd-42d4-8ffe-37b38cbccadc-kube-api-access-bbldv\") pod \"certified-operators-rmwkt\" (UID: \"dbdea6e4-12dd-42d4-8ffe-37b38cbccadc\") " pod="openshift-marketplace/certified-operators-rmwkt"
Mar 20 13:59:21 crc kubenswrapper[4973]: I0320 13:59:21.362661 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbdea6e4-12dd-42d4-8ffe-37b38cbccadc-utilities\") pod \"certified-operators-rmwkt\" (UID: \"dbdea6e4-12dd-42d4-8ffe-37b38cbccadc\") " pod="openshift-marketplace/certified-operators-rmwkt"
Mar 20 13:59:21 crc kubenswrapper[4973]: I0320 13:59:21.362693 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbdea6e4-12dd-42d4-8ffe-37b38cbccadc-catalog-content\") pod \"certified-operators-rmwkt\" (UID: \"dbdea6e4-12dd-42d4-8ffe-37b38cbccadc\") " pod="openshift-marketplace/certified-operators-rmwkt"
Mar 20 13:59:21 crc kubenswrapper[4973]: I0320 13:59:21.465102 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbdea6e4-12dd-42d4-8ffe-37b38cbccadc-utilities\") pod \"certified-operators-rmwkt\" (UID: \"dbdea6e4-12dd-42d4-8ffe-37b38cbccadc\") " pod="openshift-marketplace/certified-operators-rmwkt"
Mar 20 13:59:21 crc kubenswrapper[4973]: I0320 13:59:21.465170 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbdea6e4-12dd-42d4-8ffe-37b38cbccadc-catalog-content\") pod \"certified-operators-rmwkt\" (UID: \"dbdea6e4-12dd-42d4-8ffe-37b38cbccadc\") " pod="openshift-marketplace/certified-operators-rmwkt"
Mar 20 13:59:21 crc kubenswrapper[4973]: I0320 13:59:21.465497 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bbldv\" (UniqueName: \"kubernetes.io/projected/dbdea6e4-12dd-42d4-8ffe-37b38cbccadc-kube-api-access-bbldv\") pod \"certified-operators-rmwkt\" (UID: \"dbdea6e4-12dd-42d4-8ffe-37b38cbccadc\") " pod="openshift-marketplace/certified-operators-rmwkt"
Mar 20 13:59:21 crc kubenswrapper[4973]: I0320 13:59:21.465803 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dbdea6e4-12dd-42d4-8ffe-37b38cbccadc-utilities\") pod \"certified-operators-rmwkt\" (UID: \"dbdea6e4-12dd-42d4-8ffe-37b38cbccadc\") " pod="openshift-marketplace/certified-operators-rmwkt"
Mar 20 13:59:21 crc kubenswrapper[4973]: I0320 13:59:21.465923 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dbdea6e4-12dd-42d4-8ffe-37b38cbccadc-catalog-content\") pod \"certified-operators-rmwkt\" (UID: \"dbdea6e4-12dd-42d4-8ffe-37b38cbccadc\") " pod="openshift-marketplace/certified-operators-rmwkt"
Mar 20 13:59:21 crc kubenswrapper[4973]: I0320 13:59:21.483629 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbldv\" (UniqueName: \"kubernetes.io/projected/dbdea6e4-12dd-42d4-8ffe-37b38cbccadc-kube-api-access-bbldv\") pod \"certified-operators-rmwkt\" (UID: \"dbdea6e4-12dd-42d4-8ffe-37b38cbccadc\") " pod="openshift-marketplace/certified-operators-rmwkt"
Mar 20 13:59:21 crc kubenswrapper[4973]: I0320 13:59:21.601638 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rmwkt"
Mar 20 13:59:22 crc kubenswrapper[4973]: I0320 13:59:22.124118 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rmwkt"]
Mar 20 13:59:22 crc kubenswrapper[4973]: I0320 13:59:22.310110 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmwkt" event={"ID":"dbdea6e4-12dd-42d4-8ffe-37b38cbccadc","Type":"ContainerStarted","Data":"0b2a7767825dec0ce320c6e5cdbeda578a94e63153ab78de06819f077e3c28a3"}
Mar 20 13:59:23 crc kubenswrapper[4973]: I0320 13:59:23.320414 4973 generic.go:334] "Generic (PLEG): container finished" podID="dbdea6e4-12dd-42d4-8ffe-37b38cbccadc" containerID="e236c52aba53d6eb991414c98278d146e72f064b1b1b5825a66fe967480ce9fa" exitCode=0
Mar 20 13:59:23 crc kubenswrapper[4973]: I0320 13:59:23.320518 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmwkt" event={"ID":"dbdea6e4-12dd-42d4-8ffe-37b38cbccadc","Type":"ContainerDied","Data":"e236c52aba53d6eb991414c98278d146e72f064b1b1b5825a66fe967480ce9fa"}
Mar 20 13:59:30 crc kubenswrapper[4973]: I0320 13:59:30.393155 4973 generic.go:334] "Generic (PLEG): container finished" podID="dbdea6e4-12dd-42d4-8ffe-37b38cbccadc" containerID="7e55b5265a03c02de1f08e1c43dcab00a7f1b70993997cbdf6ceb12db1e62a7f" exitCode=0
Mar 20 13:59:30 crc kubenswrapper[4973]: I0320 13:59:30.393255 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmwkt" event={"ID":"dbdea6e4-12dd-42d4-8ffe-37b38cbccadc","Type":"ContainerDied","Data":"7e55b5265a03c02de1f08e1c43dcab00a7f1b70993997cbdf6ceb12db1e62a7f"}
Mar 20 13:59:31 crc kubenswrapper[4973]: I0320 13:59:31.406622 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rmwkt" event={"ID":"dbdea6e4-12dd-42d4-8ffe-37b38cbccadc","Type":"ContainerStarted","Data":"a0258addbcaca4cc2274e75b8e3a931937b039cb4bf4e0856507219fee51fd28"}
Mar 20 13:59:31 crc kubenswrapper[4973]: I0320 13:59:31.426874 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rmwkt" podStartSLOduration=2.909380536 podStartE2EDuration="10.426855434s" podCreationTimestamp="2026-03-20 13:59:21 +0000 UTC" firstStartedPulling="2026-03-20 13:59:23.322634893 +0000 UTC m=+2284.066304637" lastFinishedPulling="2026-03-20 13:59:30.840109791 +0000 UTC m=+2291.583779535" observedRunningTime="2026-03-20 13:59:31.424619113 +0000 UTC m=+2292.168288857" watchObservedRunningTime="2026-03-20 13:59:31.426855434 +0000 UTC m=+2292.170525178"
Mar 20 13:59:31 crc kubenswrapper[4973]: I0320 13:59:31.602551 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rmwkt"
Mar 20 13:59:31 crc kubenswrapper[4973]: I0320 13:59:31.602717 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rmwkt"
Mar 20 13:59:32 crc kubenswrapper[4973]: I0320 13:59:32.664154 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-rmwkt" podUID="dbdea6e4-12dd-42d4-8ffe-37b38cbccadc" containerName="registry-server" probeResult="failure" output=<
Mar 20 13:59:32 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s
Mar 20 13:59:32 crc kubenswrapper[4973]: >
Mar 20 13:59:41 crc kubenswrapper[4973]: I0320 13:59:41.706248 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rmwkt"
Mar 20 13:59:41 crc kubenswrapper[4973]: I0320 13:59:41.800947 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rmwkt"
Mar 20 13:59:41 crc kubenswrapper[4973]: I0320 13:59:41.897143 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rmwkt"]
Mar 20 13:59:41 crc kubenswrapper[4973]: I0320 13:59:41.973247 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gmpxj"]
Mar 20 13:59:41 crc kubenswrapper[4973]: I0320 13:59:41.973852 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gmpxj" podUID="23fa52a0-b5df-4056-bd9e-4f42e8e9893c" containerName="registry-server" containerID="cri-o://1d42cde7d7c77acf9457af58faffda33fadd57f8e7ffbc97bc752cffe7eec0cb" gracePeriod=2
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.060327 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-7tq9v"]
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.074033 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-7tq9v"]
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.490084 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gmpxj"
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.528252 4973 generic.go:334] "Generic (PLEG): container finished" podID="23fa52a0-b5df-4056-bd9e-4f42e8e9893c" containerID="1d42cde7d7c77acf9457af58faffda33fadd57f8e7ffbc97bc752cffe7eec0cb" exitCode=0
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.529421 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gmpxj" event={"ID":"23fa52a0-b5df-4056-bd9e-4f42e8e9893c","Type":"ContainerDied","Data":"1d42cde7d7c77acf9457af58faffda33fadd57f8e7ffbc97bc752cffe7eec0cb"}
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.529448 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gmpxj"
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.529526 4973 scope.go:117] "RemoveContainer" containerID="1d42cde7d7c77acf9457af58faffda33fadd57f8e7ffbc97bc752cffe7eec0cb"
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.529502 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gmpxj" event={"ID":"23fa52a0-b5df-4056-bd9e-4f42e8e9893c","Type":"ContainerDied","Data":"1268d233be1df167395850251a3e75516770c833b565925cd1f518def416dcae"}
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.538869 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23fa52a0-b5df-4056-bd9e-4f42e8e9893c-catalog-content\") pod \"23fa52a0-b5df-4056-bd9e-4f42e8e9893c\" (UID: \"23fa52a0-b5df-4056-bd9e-4f42e8e9893c\") "
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.539072 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2prcc\" (UniqueName: \"kubernetes.io/projected/23fa52a0-b5df-4056-bd9e-4f42e8e9893c-kube-api-access-2prcc\") pod \"23fa52a0-b5df-4056-bd9e-4f42e8e9893c\" (UID: \"23fa52a0-b5df-4056-bd9e-4f42e8e9893c\") "
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.539346 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23fa52a0-b5df-4056-bd9e-4f42e8e9893c-utilities\") pod \"23fa52a0-b5df-4056-bd9e-4f42e8e9893c\" (UID: \"23fa52a0-b5df-4056-bd9e-4f42e8e9893c\") "
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.552870 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23fa52a0-b5df-4056-bd9e-4f42e8e9893c-utilities" (OuterVolumeSpecName: "utilities") pod "23fa52a0-b5df-4056-bd9e-4f42e8e9893c" (UID: "23fa52a0-b5df-4056-bd9e-4f42e8e9893c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.564441 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23fa52a0-b5df-4056-bd9e-4f42e8e9893c-kube-api-access-2prcc" (OuterVolumeSpecName: "kube-api-access-2prcc") pod "23fa52a0-b5df-4056-bd9e-4f42e8e9893c" (UID: "23fa52a0-b5df-4056-bd9e-4f42e8e9893c"). InnerVolumeSpecName "kube-api-access-2prcc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.620810 4973 scope.go:117] "RemoveContainer" containerID="9c159be98e56a648697890f346ca405a98b5515e7f47bdbee25c1eb0fdc5d592"
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.641706 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23fa52a0-b5df-4056-bd9e-4f42e8e9893c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23fa52a0-b5df-4056-bd9e-4f42e8e9893c" (UID: "23fa52a0-b5df-4056-bd9e-4f42e8e9893c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.642540 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2prcc\" (UniqueName: \"kubernetes.io/projected/23fa52a0-b5df-4056-bd9e-4f42e8e9893c-kube-api-access-2prcc\") on node \"crc\" DevicePath \"\""
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.642574 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23fa52a0-b5df-4056-bd9e-4f42e8e9893c-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.642584 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23fa52a0-b5df-4056-bd9e-4f42e8e9893c-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.663534 4973 scope.go:117] "RemoveContainer" containerID="6a9655491cb133886fe94e4ee47ee5de6400fcd5d3e9c098c3effbc46501c942"
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.728020 4973 scope.go:117] "RemoveContainer" containerID="1d42cde7d7c77acf9457af58faffda33fadd57f8e7ffbc97bc752cffe7eec0cb"
Mar 20 13:59:42 crc kubenswrapper[4973]: E0320 13:59:42.728596 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d42cde7d7c77acf9457af58faffda33fadd57f8e7ffbc97bc752cffe7eec0cb\": container with ID starting with 1d42cde7d7c77acf9457af58faffda33fadd57f8e7ffbc97bc752cffe7eec0cb not found: ID does not exist" containerID="1d42cde7d7c77acf9457af58faffda33fadd57f8e7ffbc97bc752cffe7eec0cb"
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.728641 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d42cde7d7c77acf9457af58faffda33fadd57f8e7ffbc97bc752cffe7eec0cb"} err="failed to get container status \"1d42cde7d7c77acf9457af58faffda33fadd57f8e7ffbc97bc752cffe7eec0cb\": rpc error: code = NotFound desc = could not find container \"1d42cde7d7c77acf9457af58faffda33fadd57f8e7ffbc97bc752cffe7eec0cb\": container with ID starting with 1d42cde7d7c77acf9457af58faffda33fadd57f8e7ffbc97bc752cffe7eec0cb not found: ID does not exist"
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.728668 4973 scope.go:117] "RemoveContainer" containerID="9c159be98e56a648697890f346ca405a98b5515e7f47bdbee25c1eb0fdc5d592"
Mar 20 13:59:42 crc kubenswrapper[4973]: E0320 13:59:42.729030 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c159be98e56a648697890f346ca405a98b5515e7f47bdbee25c1eb0fdc5d592\": container with ID starting with 9c159be98e56a648697890f346ca405a98b5515e7f47bdbee25c1eb0fdc5d592 not found: ID does not exist" containerID="9c159be98e56a648697890f346ca405a98b5515e7f47bdbee25c1eb0fdc5d592"
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.729083 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c159be98e56a648697890f346ca405a98b5515e7f47bdbee25c1eb0fdc5d592"} err="failed to get container status \"9c159be98e56a648697890f346ca405a98b5515e7f47bdbee25c1eb0fdc5d592\": rpc error: code = NotFound desc = could not find container \"9c159be98e56a648697890f346ca405a98b5515e7f47bdbee25c1eb0fdc5d592\": container with ID starting with 9c159be98e56a648697890f346ca405a98b5515e7f47bdbee25c1eb0fdc5d592 not found: ID does not exist"
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.729120 4973 scope.go:117] "RemoveContainer" containerID="6a9655491cb133886fe94e4ee47ee5de6400fcd5d3e9c098c3effbc46501c942"
Mar 20 13:59:42 crc kubenswrapper[4973]: E0320 13:59:42.729494 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a9655491cb133886fe94e4ee47ee5de6400fcd5d3e9c098c3effbc46501c942\": container with ID starting with 6a9655491cb133886fe94e4ee47ee5de6400fcd5d3e9c098c3effbc46501c942 not found: ID does not exist" containerID="6a9655491cb133886fe94e4ee47ee5de6400fcd5d3e9c098c3effbc46501c942"
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.729535 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a9655491cb133886fe94e4ee47ee5de6400fcd5d3e9c098c3effbc46501c942"} err="failed to get container status \"6a9655491cb133886fe94e4ee47ee5de6400fcd5d3e9c098c3effbc46501c942\": rpc error: code = NotFound desc = could not find container \"6a9655491cb133886fe94e4ee47ee5de6400fcd5d3e9c098c3effbc46501c942\": container with ID starting with 6a9655491cb133886fe94e4ee47ee5de6400fcd5d3e9c098c3effbc46501c942 not found: ID does not exist"
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.869032 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gmpxj"]
Mar 20 13:59:42 crc kubenswrapper[4973]: I0320 13:59:42.881618 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gmpxj"]
Mar 20 13:59:43 crc kubenswrapper[4973]: I0320 13:59:43.320864 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 13:59:43 crc kubenswrapper[4973]: I0320 13:59:43.320941 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 13:59:43 crc kubenswrapper[4973]: I0320 13:59:43.320997 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx"
Mar 20 13:59:43 crc kubenswrapper[4973]: I0320 13:59:43.322097 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 13:59:43 crc kubenswrapper[4973]: I0320 13:59:43.322167 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" containerID="cri-o://8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee" gracePeriod=600
Mar 20 13:59:43 crc kubenswrapper[4973]: E0320 13:59:43.456327 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 13:59:43 crc kubenswrapper[4973]: I0320 13:59:43.549948 4973 generic.go:334] "Generic (PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee" exitCode=0
Mar 20 13:59:43 crc kubenswrapper[4973]: I0320 13:59:43.550001 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee"}
Mar 20 13:59:43 crc kubenswrapper[4973]: I0320 13:59:43.550078 4973 scope.go:117] "RemoveContainer" containerID="73be12b46987d19974f1cae15f097e9d05c0f53b3e5735ef8e8bb8b42d9ed186"
Mar 20 13:59:43 crc kubenswrapper[4973]: I0320 13:59:43.551805 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee"
Mar 20 13:59:43 crc kubenswrapper[4973]: E0320 13:59:43.552981 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 13:59:43 crc kubenswrapper[4973]: I0320 13:59:43.964900 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23fa52a0-b5df-4056-bd9e-4f42e8e9893c" path="/var/lib/kubelet/pods/23fa52a0-b5df-4056-bd9e-4f42e8e9893c/volumes"
Mar 20 13:59:43 crc kubenswrapper[4973]: I0320 13:59:43.965669 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87d2fb90-f8cc-4942-b86b-457b21b9790d" path="/var/lib/kubelet/pods/87d2fb90-f8cc-4942-b86b-457b21b9790d/volumes"
Mar 20 13:59:55 crc kubenswrapper[4973]: I0320 13:59:55.951406 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee"
Mar 20 13:59:55 crc kubenswrapper[4973]: E0320 13:59:55.952365 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon
pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.148851 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566920-dtwt4"] Mar 20 14:00:00 crc kubenswrapper[4973]: E0320 14:00:00.150061 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23fa52a0-b5df-4056-bd9e-4f42e8e9893c" containerName="registry-server" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.150080 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="23fa52a0-b5df-4056-bd9e-4f42e8e9893c" containerName="registry-server" Mar 20 14:00:00 crc kubenswrapper[4973]: E0320 14:00:00.150091 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23fa52a0-b5df-4056-bd9e-4f42e8e9893c" containerName="extract-utilities" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.150099 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="23fa52a0-b5df-4056-bd9e-4f42e8e9893c" containerName="extract-utilities" Mar 20 14:00:00 crc kubenswrapper[4973]: E0320 14:00:00.150130 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23fa52a0-b5df-4056-bd9e-4f42e8e9893c" containerName="extract-content" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.150140 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="23fa52a0-b5df-4056-bd9e-4f42e8e9893c" containerName="extract-content" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.150441 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="23fa52a0-b5df-4056-bd9e-4f42e8e9893c" containerName="registry-server" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.151419 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-dtwt4" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.156291 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.157687 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.158877 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.169444 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt"] Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.175491 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.188242 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.188654 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.207130 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566920-dtwt4"] Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.222967 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg645\" (UniqueName: \"kubernetes.io/projected/eb0d8010-577f-4749-a2b2-d0cf211ca0ec-kube-api-access-kg645\") pod \"collect-profiles-29566920-hh7rt\" (UID: \"eb0d8010-577f-4749-a2b2-d0cf211ca0ec\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.224067 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb0d8010-577f-4749-a2b2-d0cf211ca0ec-config-volume\") pod \"collect-profiles-29566920-hh7rt\" (UID: \"eb0d8010-577f-4749-a2b2-d0cf211ca0ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.224332 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb0d8010-577f-4749-a2b2-d0cf211ca0ec-secret-volume\") pod \"collect-profiles-29566920-hh7rt\" (UID: \"eb0d8010-577f-4749-a2b2-d0cf211ca0ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.231987 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt"] Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.330048 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-njc5k\" (UniqueName: \"kubernetes.io/projected/6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b-kube-api-access-njc5k\") pod \"auto-csr-approver-29566920-dtwt4\" (UID: \"6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b\") " pod="openshift-infra/auto-csr-approver-29566920-dtwt4" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.330108 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg645\" (UniqueName: \"kubernetes.io/projected/eb0d8010-577f-4749-a2b2-d0cf211ca0ec-kube-api-access-kg645\") pod \"collect-profiles-29566920-hh7rt\" (UID: \"eb0d8010-577f-4749-a2b2-d0cf211ca0ec\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.330652 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb0d8010-577f-4749-a2b2-d0cf211ca0ec-config-volume\") pod \"collect-profiles-29566920-hh7rt\" (UID: \"eb0d8010-577f-4749-a2b2-d0cf211ca0ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.330712 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb0d8010-577f-4749-a2b2-d0cf211ca0ec-secret-volume\") pod \"collect-profiles-29566920-hh7rt\" (UID: \"eb0d8010-577f-4749-a2b2-d0cf211ca0ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.332244 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb0d8010-577f-4749-a2b2-d0cf211ca0ec-config-volume\") pod \"collect-profiles-29566920-hh7rt\" (UID: \"eb0d8010-577f-4749-a2b2-d0cf211ca0ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.336620 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb0d8010-577f-4749-a2b2-d0cf211ca0ec-secret-volume\") pod \"collect-profiles-29566920-hh7rt\" (UID: \"eb0d8010-577f-4749-a2b2-d0cf211ca0ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.358168 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg645\" (UniqueName: \"kubernetes.io/projected/eb0d8010-577f-4749-a2b2-d0cf211ca0ec-kube-api-access-kg645\") pod 
\"collect-profiles-29566920-hh7rt\" (UID: \"eb0d8010-577f-4749-a2b2-d0cf211ca0ec\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.433263 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njc5k\" (UniqueName: \"kubernetes.io/projected/6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b-kube-api-access-njc5k\") pod \"auto-csr-approver-29566920-dtwt4\" (UID: \"6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b\") " pod="openshift-infra/auto-csr-approver-29566920-dtwt4" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.450464 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njc5k\" (UniqueName: \"kubernetes.io/projected/6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b-kube-api-access-njc5k\") pod \"auto-csr-approver-29566920-dtwt4\" (UID: \"6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b\") " pod="openshift-infra/auto-csr-approver-29566920-dtwt4" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.494257 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-dtwt4" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.521880 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt" Mar 20 14:00:00 crc kubenswrapper[4973]: I0320 14:00:00.720431 4973 scope.go:117] "RemoveContainer" containerID="ad9e3dedb9addd4bbdb9a1ca6d267f476bbd2183f98c5b6606156a79252a673a" Mar 20 14:00:01 crc kubenswrapper[4973]: I0320 14:00:01.042233 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566920-dtwt4"] Mar 20 14:00:01 crc kubenswrapper[4973]: W0320 14:00:01.045815 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a9c0f80_a3b9_45c7_bd8b_de8af58b2a1b.slice/crio-a5df845f959347c55e6b6db889ca405746ed409ce0758789bc6021e035217f46 WatchSource:0}: Error finding container a5df845f959347c55e6b6db889ca405746ed409ce0758789bc6021e035217f46: Status 404 returned error can't find the container with id a5df845f959347c55e6b6db889ca405746ed409ce0758789bc6021e035217f46 Mar 20 14:00:01 crc kubenswrapper[4973]: W0320 14:00:01.161770 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb0d8010_577f_4749_a2b2_d0cf211ca0ec.slice/crio-46b5c23594c6bd715ea8498b5f59759e96b23517e229ca07d1526e05fcc19c31 WatchSource:0}: Error finding container 46b5c23594c6bd715ea8498b5f59759e96b23517e229ca07d1526e05fcc19c31: Status 404 returned error can't find the container with id 46b5c23594c6bd715ea8498b5f59759e96b23517e229ca07d1526e05fcc19c31 Mar 20 14:00:01 crc kubenswrapper[4973]: I0320 14:00:01.163151 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt"] Mar 20 14:00:01 crc kubenswrapper[4973]: I0320 14:00:01.775177 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566920-dtwt4" 
event={"ID":"6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b","Type":"ContainerStarted","Data":"a5df845f959347c55e6b6db889ca405746ed409ce0758789bc6021e035217f46"} Mar 20 14:00:01 crc kubenswrapper[4973]: I0320 14:00:01.781706 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt" event={"ID":"eb0d8010-577f-4749-a2b2-d0cf211ca0ec","Type":"ContainerStarted","Data":"9de663864ca341f6e12eacddc91f098517dc595a608e7452438ef1cb10ff127a"} Mar 20 14:00:01 crc kubenswrapper[4973]: I0320 14:00:01.781767 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt" event={"ID":"eb0d8010-577f-4749-a2b2-d0cf211ca0ec","Type":"ContainerStarted","Data":"46b5c23594c6bd715ea8498b5f59759e96b23517e229ca07d1526e05fcc19c31"} Mar 20 14:00:01 crc kubenswrapper[4973]: I0320 14:00:01.823392 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt" podStartSLOduration=1.8233754690000001 podStartE2EDuration="1.823375469s" podCreationTimestamp="2026-03-20 14:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:00:01.806586288 +0000 UTC m=+2322.550256022" watchObservedRunningTime="2026-03-20 14:00:01.823375469 +0000 UTC m=+2322.567045213" Mar 20 14:00:02 crc kubenswrapper[4973]: I0320 14:00:02.793371 4973 generic.go:334] "Generic (PLEG): container finished" podID="eb0d8010-577f-4749-a2b2-d0cf211ca0ec" containerID="9de663864ca341f6e12eacddc91f098517dc595a608e7452438ef1cb10ff127a" exitCode=0 Mar 20 14:00:02 crc kubenswrapper[4973]: I0320 14:00:02.793514 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt" 
event={"ID":"eb0d8010-577f-4749-a2b2-d0cf211ca0ec","Type":"ContainerDied","Data":"9de663864ca341f6e12eacddc91f098517dc595a608e7452438ef1cb10ff127a"} Mar 20 14:00:04 crc kubenswrapper[4973]: I0320 14:00:04.275107 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt" Mar 20 14:00:04 crc kubenswrapper[4973]: I0320 14:00:04.444230 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb0d8010-577f-4749-a2b2-d0cf211ca0ec-config-volume\") pod \"eb0d8010-577f-4749-a2b2-d0cf211ca0ec\" (UID: \"eb0d8010-577f-4749-a2b2-d0cf211ca0ec\") " Mar 20 14:00:04 crc kubenswrapper[4973]: I0320 14:00:04.444445 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kg645\" (UniqueName: \"kubernetes.io/projected/eb0d8010-577f-4749-a2b2-d0cf211ca0ec-kube-api-access-kg645\") pod \"eb0d8010-577f-4749-a2b2-d0cf211ca0ec\" (UID: \"eb0d8010-577f-4749-a2b2-d0cf211ca0ec\") " Mar 20 14:00:04 crc kubenswrapper[4973]: I0320 14:00:04.444565 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb0d8010-577f-4749-a2b2-d0cf211ca0ec-secret-volume\") pod \"eb0d8010-577f-4749-a2b2-d0cf211ca0ec\" (UID: \"eb0d8010-577f-4749-a2b2-d0cf211ca0ec\") " Mar 20 14:00:04 crc kubenswrapper[4973]: I0320 14:00:04.445529 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb0d8010-577f-4749-a2b2-d0cf211ca0ec-config-volume" (OuterVolumeSpecName: "config-volume") pod "eb0d8010-577f-4749-a2b2-d0cf211ca0ec" (UID: "eb0d8010-577f-4749-a2b2-d0cf211ca0ec"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:00:04 crc kubenswrapper[4973]: I0320 14:00:04.446174 4973 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/eb0d8010-577f-4749-a2b2-d0cf211ca0ec-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:04 crc kubenswrapper[4973]: I0320 14:00:04.453726 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb0d8010-577f-4749-a2b2-d0cf211ca0ec-kube-api-access-kg645" (OuterVolumeSpecName: "kube-api-access-kg645") pod "eb0d8010-577f-4749-a2b2-d0cf211ca0ec" (UID: "eb0d8010-577f-4749-a2b2-d0cf211ca0ec"). InnerVolumeSpecName "kube-api-access-kg645". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:00:04 crc kubenswrapper[4973]: I0320 14:00:04.463631 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb0d8010-577f-4749-a2b2-d0cf211ca0ec-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "eb0d8010-577f-4749-a2b2-d0cf211ca0ec" (UID: "eb0d8010-577f-4749-a2b2-d0cf211ca0ec"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:00:04 crc kubenswrapper[4973]: I0320 14:00:04.547988 4973 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/eb0d8010-577f-4749-a2b2-d0cf211ca0ec-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:04 crc kubenswrapper[4973]: I0320 14:00:04.548025 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kg645\" (UniqueName: \"kubernetes.io/projected/eb0d8010-577f-4749-a2b2-d0cf211ca0ec-kube-api-access-kg645\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:04 crc kubenswrapper[4973]: I0320 14:00:04.820195 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt" event={"ID":"eb0d8010-577f-4749-a2b2-d0cf211ca0ec","Type":"ContainerDied","Data":"46b5c23594c6bd715ea8498b5f59759e96b23517e229ca07d1526e05fcc19c31"} Mar 20 14:00:04 crc kubenswrapper[4973]: I0320 14:00:04.820234 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46b5c23594c6bd715ea8498b5f59759e96b23517e229ca07d1526e05fcc19c31" Mar 20 14:00:04 crc kubenswrapper[4973]: I0320 14:00:04.820286 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt" Mar 20 14:00:05 crc kubenswrapper[4973]: I0320 14:00:05.368207 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd"] Mar 20 14:00:05 crc kubenswrapper[4973]: I0320 14:00:05.381934 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566875-2dssd"] Mar 20 14:00:05 crc kubenswrapper[4973]: I0320 14:00:05.967738 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9d6f338-6a7a-4ba2-b1c9-12485bb30937" path="/var/lib/kubelet/pods/b9d6f338-6a7a-4ba2-b1c9-12485bb30937/volumes" Mar 20 14:00:09 crc kubenswrapper[4973]: I0320 14:00:09.960920 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee" Mar 20 14:00:09 crc kubenswrapper[4973]: E0320 14:00:09.961618 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:00:10 crc kubenswrapper[4973]: E0320 14:00:10.430370 4973 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a9c0f80_a3b9_45c7_bd8b_de8af58b2a1b.slice/crio-a713851d3065a685db42e5e476ea526be4fccefa9e1391fffa39c5435b796fd5.scope\": RecentStats: unable to find data in memory cache]" Mar 20 14:00:10 crc kubenswrapper[4973]: I0320 14:00:10.888313 4973 generic.go:334] "Generic (PLEG): container finished" podID="42ef6035-8917-4665-aaab-67b6c8e74ca7" 
containerID="ac40fda0038a8d732f18194ae5a62ff93e7597eb30640e86491f40742f404541" exitCode=0 Mar 20 14:00:10 crc kubenswrapper[4973]: I0320 14:00:10.888418 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69" event={"ID":"42ef6035-8917-4665-aaab-67b6c8e74ca7","Type":"ContainerDied","Data":"ac40fda0038a8d732f18194ae5a62ff93e7597eb30640e86491f40742f404541"} Mar 20 14:00:10 crc kubenswrapper[4973]: I0320 14:00:10.890250 4973 generic.go:334] "Generic (PLEG): container finished" podID="6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b" containerID="a713851d3065a685db42e5e476ea526be4fccefa9e1391fffa39c5435b796fd5" exitCode=0 Mar 20 14:00:10 crc kubenswrapper[4973]: I0320 14:00:10.890286 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566920-dtwt4" event={"ID":"6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b","Type":"ContainerDied","Data":"a713851d3065a685db42e5e476ea526be4fccefa9e1391fffa39c5435b796fd5"} Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.408485 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-dtwt4" Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.415192 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69" Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.569739 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/42ef6035-8917-4665-aaab-67b6c8e74ca7-ssh-key-openstack-edpm-ipam\") pod \"42ef6035-8917-4665-aaab-67b6c8e74ca7\" (UID: \"42ef6035-8917-4665-aaab-67b6c8e74ca7\") " Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.569796 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkj65\" (UniqueName: \"kubernetes.io/projected/42ef6035-8917-4665-aaab-67b6c8e74ca7-kube-api-access-qkj65\") pod \"42ef6035-8917-4665-aaab-67b6c8e74ca7\" (UID: \"42ef6035-8917-4665-aaab-67b6c8e74ca7\") " Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.569924 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njc5k\" (UniqueName: \"kubernetes.io/projected/6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b-kube-api-access-njc5k\") pod \"6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b\" (UID: \"6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b\") " Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.570148 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42ef6035-8917-4665-aaab-67b6c8e74ca7-inventory\") pod \"42ef6035-8917-4665-aaab-67b6c8e74ca7\" (UID: \"42ef6035-8917-4665-aaab-67b6c8e74ca7\") " Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.575285 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42ef6035-8917-4665-aaab-67b6c8e74ca7-kube-api-access-qkj65" (OuterVolumeSpecName: "kube-api-access-qkj65") pod "42ef6035-8917-4665-aaab-67b6c8e74ca7" (UID: "42ef6035-8917-4665-aaab-67b6c8e74ca7"). InnerVolumeSpecName "kube-api-access-qkj65". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.578673 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b-kube-api-access-njc5k" (OuterVolumeSpecName: "kube-api-access-njc5k") pod "6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b" (UID: "6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b"). InnerVolumeSpecName "kube-api-access-njc5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.605602 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42ef6035-8917-4665-aaab-67b6c8e74ca7-inventory" (OuterVolumeSpecName: "inventory") pod "42ef6035-8917-4665-aaab-67b6c8e74ca7" (UID: "42ef6035-8917-4665-aaab-67b6c8e74ca7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.607672 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/42ef6035-8917-4665-aaab-67b6c8e74ca7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "42ef6035-8917-4665-aaab-67b6c8e74ca7" (UID: "42ef6035-8917-4665-aaab-67b6c8e74ca7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.673964 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/42ef6035-8917-4665-aaab-67b6c8e74ca7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.674006 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkj65\" (UniqueName: \"kubernetes.io/projected/42ef6035-8917-4665-aaab-67b6c8e74ca7-kube-api-access-qkj65\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.674023 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njc5k\" (UniqueName: \"kubernetes.io/projected/6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b-kube-api-access-njc5k\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.674036 4973 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/42ef6035-8917-4665-aaab-67b6c8e74ca7-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.918303 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566920-dtwt4" event={"ID":"6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b","Type":"ContainerDied","Data":"a5df845f959347c55e6b6db889ca405746ed409ce0758789bc6021e035217f46"} Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.918364 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5df845f959347c55e6b6db889ca405746ed409ce0758789bc6021e035217f46" Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.918407 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566920-dtwt4"
Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.920160 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69" event={"ID":"42ef6035-8917-4665-aaab-67b6c8e74ca7","Type":"ContainerDied","Data":"9398a97c90c7dee83d22a26464251986e345dadd8feb1f241ca5b90a1bdd71be"}
Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.920213 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9398a97c90c7dee83d22a26464251986e345dadd8feb1f241ca5b90a1bdd71be"
Mar 20 14:00:12 crc kubenswrapper[4973]: I0320 14:00:12.920291 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-trb69"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.003136 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cvmvr"]
Mar 20 14:00:13 crc kubenswrapper[4973]: E0320 14:00:13.003686 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb0d8010-577f-4749-a2b2-d0cf211ca0ec" containerName="collect-profiles"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.003702 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb0d8010-577f-4749-a2b2-d0cf211ca0ec" containerName="collect-profiles"
Mar 20 14:00:13 crc kubenswrapper[4973]: E0320 14:00:13.003724 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b" containerName="oc"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.003730 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b" containerName="oc"
Mar 20 14:00:13 crc kubenswrapper[4973]: E0320 14:00:13.003755 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42ef6035-8917-4665-aaab-67b6c8e74ca7" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.003763 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="42ef6035-8917-4665-aaab-67b6c8e74ca7" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.003999 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="42ef6035-8917-4665-aaab-67b6c8e74ca7" containerName="configure-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.004023 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb0d8010-577f-4749-a2b2-d0cf211ca0ec" containerName="collect-profiles"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.004046 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b" containerName="oc"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.004898 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cvmvr"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.007359 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.007494 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.007597 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d4wjc"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.007736 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.027375 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cvmvr"]
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.188324 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02613b7a-06ef-4a92-8d7e-55dd12481786-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cvmvr\" (UID: \"02613b7a-06ef-4a92-8d7e-55dd12481786\") " pod="openstack/ssh-known-hosts-edpm-deployment-cvmvr"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.188501 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2llfn\" (UniqueName: \"kubernetes.io/projected/02613b7a-06ef-4a92-8d7e-55dd12481786-kube-api-access-2llfn\") pod \"ssh-known-hosts-edpm-deployment-cvmvr\" (UID: \"02613b7a-06ef-4a92-8d7e-55dd12481786\") " pod="openstack/ssh-known-hosts-edpm-deployment-cvmvr"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.188790 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02613b7a-06ef-4a92-8d7e-55dd12481786-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cvmvr\" (UID: \"02613b7a-06ef-4a92-8d7e-55dd12481786\") " pod="openstack/ssh-known-hosts-edpm-deployment-cvmvr"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.290654 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02613b7a-06ef-4a92-8d7e-55dd12481786-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cvmvr\" (UID: \"02613b7a-06ef-4a92-8d7e-55dd12481786\") " pod="openstack/ssh-known-hosts-edpm-deployment-cvmvr"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.290762 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02613b7a-06ef-4a92-8d7e-55dd12481786-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cvmvr\" (UID: \"02613b7a-06ef-4a92-8d7e-55dd12481786\") " pod="openstack/ssh-known-hosts-edpm-deployment-cvmvr"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.290866 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2llfn\" (UniqueName: \"kubernetes.io/projected/02613b7a-06ef-4a92-8d7e-55dd12481786-kube-api-access-2llfn\") pod \"ssh-known-hosts-edpm-deployment-cvmvr\" (UID: \"02613b7a-06ef-4a92-8d7e-55dd12481786\") " pod="openstack/ssh-known-hosts-edpm-deployment-cvmvr"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.295454 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02613b7a-06ef-4a92-8d7e-55dd12481786-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-cvmvr\" (UID: \"02613b7a-06ef-4a92-8d7e-55dd12481786\") " pod="openstack/ssh-known-hosts-edpm-deployment-cvmvr"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.309714 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02613b7a-06ef-4a92-8d7e-55dd12481786-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-cvmvr\" (UID: \"02613b7a-06ef-4a92-8d7e-55dd12481786\") " pod="openstack/ssh-known-hosts-edpm-deployment-cvmvr"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.317389 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2llfn\" (UniqueName: \"kubernetes.io/projected/02613b7a-06ef-4a92-8d7e-55dd12481786-kube-api-access-2llfn\") pod \"ssh-known-hosts-edpm-deployment-cvmvr\" (UID: \"02613b7a-06ef-4a92-8d7e-55dd12481786\") " pod="openstack/ssh-known-hosts-edpm-deployment-cvmvr"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.335760 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cvmvr"
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.483725 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-ljvvd"]
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.499945 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566914-ljvvd"]
Mar 20 14:00:13 crc kubenswrapper[4973]: I0320 14:00:13.964951 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="517222fd-ce22-4859-abde-4d45185cd4b0" path="/var/lib/kubelet/pods/517222fd-ce22-4859-abde-4d45185cd4b0/volumes"
Mar 20 14:00:14 crc kubenswrapper[4973]: I0320 14:00:14.040358 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-cvmvr"]
Mar 20 14:00:14 crc kubenswrapper[4973]: W0320 14:00:14.042892 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02613b7a_06ef_4a92_8d7e_55dd12481786.slice/crio-4d68fd37e6fc1d0b91bb60c2ff5c0f08207d8eac9a567f5fc692de9f65a3b50f WatchSource:0}: Error finding container 4d68fd37e6fc1d0b91bb60c2ff5c0f08207d8eac9a567f5fc692de9f65a3b50f: Status 404 returned error can't find the container with id 4d68fd37e6fc1d0b91bb60c2ff5c0f08207d8eac9a567f5fc692de9f65a3b50f
Mar 20 14:00:14 crc kubenswrapper[4973]: I0320 14:00:14.943888 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cvmvr" event={"ID":"02613b7a-06ef-4a92-8d7e-55dd12481786","Type":"ContainerStarted","Data":"58d6de5d26f932408626d0039766801a647df081614471951385631fbf46d8f2"}
Mar 20 14:00:14 crc kubenswrapper[4973]: I0320 14:00:14.944272 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cvmvr" event={"ID":"02613b7a-06ef-4a92-8d7e-55dd12481786","Type":"ContainerStarted","Data":"4d68fd37e6fc1d0b91bb60c2ff5c0f08207d8eac9a567f5fc692de9f65a3b50f"}
Mar 20 14:00:14 crc kubenswrapper[4973]: I0320 14:00:14.971749 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-cvmvr" podStartSLOduration=2.533469642 podStartE2EDuration="2.971731619s" podCreationTimestamp="2026-03-20 14:00:12 +0000 UTC" firstStartedPulling="2026-03-20 14:00:14.045792408 +0000 UTC m=+2334.789462152" lastFinishedPulling="2026-03-20 14:00:14.484054385 +0000 UTC m=+2335.227724129" observedRunningTime="2026-03-20 14:00:14.963194524 +0000 UTC m=+2335.706864288" watchObservedRunningTime="2026-03-20 14:00:14.971731619 +0000 UTC m=+2335.715401363"
Mar 20 14:00:21 crc kubenswrapper[4973]: I0320 14:00:21.951446 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee"
Mar 20 14:00:21 crc kubenswrapper[4973]: E0320 14:00:21.952155 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:00:22 crc kubenswrapper[4973]: I0320 14:00:22.022006 4973 generic.go:334] "Generic (PLEG): container finished" podID="02613b7a-06ef-4a92-8d7e-55dd12481786" containerID="58d6de5d26f932408626d0039766801a647df081614471951385631fbf46d8f2" exitCode=0
Mar 20 14:00:22 crc kubenswrapper[4973]: I0320 14:00:22.022058 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cvmvr" event={"ID":"02613b7a-06ef-4a92-8d7e-55dd12481786","Type":"ContainerDied","Data":"58d6de5d26f932408626d0039766801a647df081614471951385631fbf46d8f2"}
Mar 20 14:00:23 crc kubenswrapper[4973]: I0320 14:00:23.536301 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cvmvr"
Mar 20 14:00:23 crc kubenswrapper[4973]: I0320 14:00:23.669226 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02613b7a-06ef-4a92-8d7e-55dd12481786-ssh-key-openstack-edpm-ipam\") pod \"02613b7a-06ef-4a92-8d7e-55dd12481786\" (UID: \"02613b7a-06ef-4a92-8d7e-55dd12481786\") "
Mar 20 14:00:23 crc kubenswrapper[4973]: I0320 14:00:23.669302 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02613b7a-06ef-4a92-8d7e-55dd12481786-inventory-0\") pod \"02613b7a-06ef-4a92-8d7e-55dd12481786\" (UID: \"02613b7a-06ef-4a92-8d7e-55dd12481786\") "
Mar 20 14:00:23 crc kubenswrapper[4973]: I0320 14:00:23.669356 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2llfn\" (UniqueName: \"kubernetes.io/projected/02613b7a-06ef-4a92-8d7e-55dd12481786-kube-api-access-2llfn\") pod \"02613b7a-06ef-4a92-8d7e-55dd12481786\" (UID: \"02613b7a-06ef-4a92-8d7e-55dd12481786\") "
Mar 20 14:00:23 crc kubenswrapper[4973]: I0320 14:00:23.679461 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02613b7a-06ef-4a92-8d7e-55dd12481786-kube-api-access-2llfn" (OuterVolumeSpecName: "kube-api-access-2llfn") pod "02613b7a-06ef-4a92-8d7e-55dd12481786" (UID: "02613b7a-06ef-4a92-8d7e-55dd12481786"). InnerVolumeSpecName "kube-api-access-2llfn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:00:23 crc kubenswrapper[4973]: I0320 14:00:23.704860 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02613b7a-06ef-4a92-8d7e-55dd12481786-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "02613b7a-06ef-4a92-8d7e-55dd12481786" (UID: "02613b7a-06ef-4a92-8d7e-55dd12481786"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:00:23 crc kubenswrapper[4973]: I0320 14:00:23.712125 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02613b7a-06ef-4a92-8d7e-55dd12481786-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "02613b7a-06ef-4a92-8d7e-55dd12481786" (UID: "02613b7a-06ef-4a92-8d7e-55dd12481786"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:00:23 crc kubenswrapper[4973]: I0320 14:00:23.773681 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02613b7a-06ef-4a92-8d7e-55dd12481786-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 14:00:23 crc kubenswrapper[4973]: I0320 14:00:23.773724 4973 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/02613b7a-06ef-4a92-8d7e-55dd12481786-inventory-0\") on node \"crc\" DevicePath \"\""
Mar 20 14:00:23 crc kubenswrapper[4973]: I0320 14:00:23.773738 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2llfn\" (UniqueName: \"kubernetes.io/projected/02613b7a-06ef-4a92-8d7e-55dd12481786-kube-api-access-2llfn\") on node \"crc\" DevicePath \"\""
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.044482 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-cvmvr" event={"ID":"02613b7a-06ef-4a92-8d7e-55dd12481786","Type":"ContainerDied","Data":"4d68fd37e6fc1d0b91bb60c2ff5c0f08207d8eac9a567f5fc692de9f65a3b50f"}
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.044532 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4d68fd37e6fc1d0b91bb60c2ff5c0f08207d8eac9a567f5fc692de9f65a3b50f"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.044543 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-cvmvr"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.120994 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6"]
Mar 20 14:00:24 crc kubenswrapper[4973]: E0320 14:00:24.121915 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02613b7a-06ef-4a92-8d7e-55dd12481786" containerName="ssh-known-hosts-edpm-deployment"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.121936 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="02613b7a-06ef-4a92-8d7e-55dd12481786" containerName="ssh-known-hosts-edpm-deployment"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.122155 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="02613b7a-06ef-4a92-8d7e-55dd12481786" containerName="ssh-known-hosts-edpm-deployment"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.122976 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.128906 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.129031 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.129289 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.135755 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6"]
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.137518 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d4wjc"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.285681 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da2fdea4-15c3-4312-9e6f-bbff469b8feb-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gkhr6\" (UID: \"da2fdea4-15c3-4312-9e6f-bbff469b8feb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.285723 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkmpz\" (UniqueName: \"kubernetes.io/projected/da2fdea4-15c3-4312-9e6f-bbff469b8feb-kube-api-access-hkmpz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gkhr6\" (UID: \"da2fdea4-15c3-4312-9e6f-bbff469b8feb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.285847 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da2fdea4-15c3-4312-9e6f-bbff469b8feb-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gkhr6\" (UID: \"da2fdea4-15c3-4312-9e6f-bbff469b8feb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.387898 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da2fdea4-15c3-4312-9e6f-bbff469b8feb-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gkhr6\" (UID: \"da2fdea4-15c3-4312-9e6f-bbff469b8feb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.388160 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hkmpz\" (UniqueName: \"kubernetes.io/projected/da2fdea4-15c3-4312-9e6f-bbff469b8feb-kube-api-access-hkmpz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gkhr6\" (UID: \"da2fdea4-15c3-4312-9e6f-bbff469b8feb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.388361 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da2fdea4-15c3-4312-9e6f-bbff469b8feb-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gkhr6\" (UID: \"da2fdea4-15c3-4312-9e6f-bbff469b8feb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.391958 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da2fdea4-15c3-4312-9e6f-bbff469b8feb-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gkhr6\" (UID: \"da2fdea4-15c3-4312-9e6f-bbff469b8feb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.392793 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da2fdea4-15c3-4312-9e6f-bbff469b8feb-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gkhr6\" (UID: \"da2fdea4-15c3-4312-9e6f-bbff469b8feb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.409417 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkmpz\" (UniqueName: \"kubernetes.io/projected/da2fdea4-15c3-4312-9e6f-bbff469b8feb-kube-api-access-hkmpz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-gkhr6\" (UID: \"da2fdea4-15c3-4312-9e6f-bbff469b8feb\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.438699 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6"
Mar 20 14:00:24 crc kubenswrapper[4973]: I0320 14:00:24.977745 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6"]
Mar 20 14:00:25 crc kubenswrapper[4973]: I0320 14:00:25.054417 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6" event={"ID":"da2fdea4-15c3-4312-9e6f-bbff469b8feb","Type":"ContainerStarted","Data":"d6e443b16c9909b52bf76cda323bd620c3b6f4e23934eece7b6aa611286dc7c4"}
Mar 20 14:00:26 crc kubenswrapper[4973]: I0320 14:00:26.065990 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6" event={"ID":"da2fdea4-15c3-4312-9e6f-bbff469b8feb","Type":"ContainerStarted","Data":"ae391977ecb4fc439b45d569e5367e972fda575be1d8254d1a4640109e89fb92"}
Mar 20 14:00:26 crc kubenswrapper[4973]: I0320 14:00:26.084458 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6" podStartSLOduration=1.436713167 podStartE2EDuration="2.084440503s" podCreationTimestamp="2026-03-20 14:00:24 +0000 UTC" firstStartedPulling="2026-03-20 14:00:24.980527548 +0000 UTC m=+2345.724197292" lastFinishedPulling="2026-03-20 14:00:25.628254884 +0000 UTC m=+2346.371924628" observedRunningTime="2026-03-20 14:00:26.079905228 +0000 UTC m=+2346.823574972" watchObservedRunningTime="2026-03-20 14:00:26.084440503 +0000 UTC m=+2346.828110247"
Mar 20 14:00:34 crc kubenswrapper[4973]: I0320 14:00:34.145021 4973 generic.go:334] "Generic (PLEG): container finished" podID="da2fdea4-15c3-4312-9e6f-bbff469b8feb" containerID="ae391977ecb4fc439b45d569e5367e972fda575be1d8254d1a4640109e89fb92" exitCode=0
Mar 20 14:00:34 crc kubenswrapper[4973]: I0320 14:00:34.145133 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6" event={"ID":"da2fdea4-15c3-4312-9e6f-bbff469b8feb","Type":"ContainerDied","Data":"ae391977ecb4fc439b45d569e5367e972fda575be1d8254d1a4640109e89fb92"}
Mar 20 14:00:35 crc kubenswrapper[4973]: I0320 14:00:35.617296 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6"
Mar 20 14:00:35 crc kubenswrapper[4973]: I0320 14:00:35.762727 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da2fdea4-15c3-4312-9e6f-bbff469b8feb-inventory\") pod \"da2fdea4-15c3-4312-9e6f-bbff469b8feb\" (UID: \"da2fdea4-15c3-4312-9e6f-bbff469b8feb\") "
Mar 20 14:00:35 crc kubenswrapper[4973]: I0320 14:00:35.762792 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hkmpz\" (UniqueName: \"kubernetes.io/projected/da2fdea4-15c3-4312-9e6f-bbff469b8feb-kube-api-access-hkmpz\") pod \"da2fdea4-15c3-4312-9e6f-bbff469b8feb\" (UID: \"da2fdea4-15c3-4312-9e6f-bbff469b8feb\") "
Mar 20 14:00:35 crc kubenswrapper[4973]: I0320 14:00:35.762997 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da2fdea4-15c3-4312-9e6f-bbff469b8feb-ssh-key-openstack-edpm-ipam\") pod \"da2fdea4-15c3-4312-9e6f-bbff469b8feb\" (UID: \"da2fdea4-15c3-4312-9e6f-bbff469b8feb\") "
Mar 20 14:00:35 crc kubenswrapper[4973]: I0320 14:00:35.769425 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da2fdea4-15c3-4312-9e6f-bbff469b8feb-kube-api-access-hkmpz" (OuterVolumeSpecName: "kube-api-access-hkmpz") pod "da2fdea4-15c3-4312-9e6f-bbff469b8feb" (UID: "da2fdea4-15c3-4312-9e6f-bbff469b8feb"). InnerVolumeSpecName "kube-api-access-hkmpz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:00:35 crc kubenswrapper[4973]: I0320 14:00:35.804660 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2fdea4-15c3-4312-9e6f-bbff469b8feb-inventory" (OuterVolumeSpecName: "inventory") pod "da2fdea4-15c3-4312-9e6f-bbff469b8feb" (UID: "da2fdea4-15c3-4312-9e6f-bbff469b8feb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:00:35 crc kubenswrapper[4973]: I0320 14:00:35.809323 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da2fdea4-15c3-4312-9e6f-bbff469b8feb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "da2fdea4-15c3-4312-9e6f-bbff469b8feb" (UID: "da2fdea4-15c3-4312-9e6f-bbff469b8feb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:00:35 crc kubenswrapper[4973]: I0320 14:00:35.867463 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/da2fdea4-15c3-4312-9e6f-bbff469b8feb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 14:00:35 crc kubenswrapper[4973]: I0320 14:00:35.867511 4973 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/da2fdea4-15c3-4312-9e6f-bbff469b8feb-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 14:00:35 crc kubenswrapper[4973]: I0320 14:00:35.867524 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hkmpz\" (UniqueName: \"kubernetes.io/projected/da2fdea4-15c3-4312-9e6f-bbff469b8feb-kube-api-access-hkmpz\") on node \"crc\" DevicePath \"\""
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.208778 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6" event={"ID":"da2fdea4-15c3-4312-9e6f-bbff469b8feb","Type":"ContainerDied","Data":"d6e443b16c9909b52bf76cda323bd620c3b6f4e23934eece7b6aa611286dc7c4"}
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.208841 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d6e443b16c9909b52bf76cda323bd620c3b6f4e23934eece7b6aa611286dc7c4"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.208844 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-gkhr6"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.353733 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7"]
Mar 20 14:00:36 crc kubenswrapper[4973]: E0320 14:00:36.355331 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da2fdea4-15c3-4312-9e6f-bbff469b8feb" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.355381 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="da2fdea4-15c3-4312-9e6f-bbff469b8feb" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.355721 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="da2fdea4-15c3-4312-9e6f-bbff469b8feb" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.357250 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.361233 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.361591 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d4wjc"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.361625 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.361789 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.371051 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7"]
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.408974 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94pwv\" (UniqueName: \"kubernetes.io/projected/d26727be-6f6f-4044-b0ac-771e58cb8641-kube-api-access-94pwv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7\" (UID: \"d26727be-6f6f-4044-b0ac-771e58cb8641\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.409354 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d26727be-6f6f-4044-b0ac-771e58cb8641-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7\" (UID: \"d26727be-6f6f-4044-b0ac-771e58cb8641\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.409712 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d26727be-6f6f-4044-b0ac-771e58cb8641-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7\" (UID: \"d26727be-6f6f-4044-b0ac-771e58cb8641\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.513760 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d26727be-6f6f-4044-b0ac-771e58cb8641-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7\" (UID: \"d26727be-6f6f-4044-b0ac-771e58cb8641\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.514464 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94pwv\" (UniqueName: \"kubernetes.io/projected/d26727be-6f6f-4044-b0ac-771e58cb8641-kube-api-access-94pwv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7\" (UID: \"d26727be-6f6f-4044-b0ac-771e58cb8641\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.514679 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d26727be-6f6f-4044-b0ac-771e58cb8641-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7\" (UID: \"d26727be-6f6f-4044-b0ac-771e58cb8641\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.522714 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d26727be-6f6f-4044-b0ac-771e58cb8641-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7\" (UID: \"d26727be-6f6f-4044-b0ac-771e58cb8641\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.527258 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d26727be-6f6f-4044-b0ac-771e58cb8641-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7\" (UID: \"d26727be-6f6f-4044-b0ac-771e58cb8641\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.535337 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94pwv\" (UniqueName: \"kubernetes.io/projected/d26727be-6f6f-4044-b0ac-771e58cb8641-kube-api-access-94pwv\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7\" (UID: \"d26727be-6f6f-4044-b0ac-771e58cb8641\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.714892 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7"
Mar 20 14:00:36 crc kubenswrapper[4973]: I0320 14:00:36.951278 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee"
Mar 20 14:00:36 crc kubenswrapper[4973]: E0320 14:00:36.951926 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:00:37 crc kubenswrapper[4973]: I0320 14:00:37.317300 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7"]
Mar 20 14:00:38 crc kubenswrapper[4973]: I0320 14:00:38.236082 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7" event={"ID":"d26727be-6f6f-4044-b0ac-771e58cb8641","Type":"ContainerStarted","Data":"5c931a14212ab6f4ad39d4e3785c36ee0890b185d007fd1f2991b6338b5e4f01"}
Mar 20 14:00:38 crc kubenswrapper[4973]: I0320 14:00:38.236762 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7" event={"ID":"d26727be-6f6f-4044-b0ac-771e58cb8641","Type":"ContainerStarted","Data":"2afbfcd8b512f89ec1d3e570ff6803eac746ea8951fd80ae02f16a0acbc4cad2"}
Mar 20 14:00:47 crc kubenswrapper[4973]: I0320 14:00:47.331566 4973 generic.go:334] "Generic (PLEG): container finished" podID="d26727be-6f6f-4044-b0ac-771e58cb8641" containerID="5c931a14212ab6f4ad39d4e3785c36ee0890b185d007fd1f2991b6338b5e4f01" exitCode=0
Mar 20 14:00:47 crc kubenswrapper[4973]: I0320 14:00:47.336597 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7" event={"ID":"d26727be-6f6f-4044-b0ac-771e58cb8641","Type":"ContainerDied","Data":"5c931a14212ab6f4ad39d4e3785c36ee0890b185d007fd1f2991b6338b5e4f01"}
Mar 20 14:00:48 crc kubenswrapper[4973]: I0320 14:00:48.835009 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7"
Mar 20 14:00:48 crc kubenswrapper[4973]: I0320 14:00:48.872585 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d26727be-6f6f-4044-b0ac-771e58cb8641-ssh-key-openstack-edpm-ipam\") pod \"d26727be-6f6f-4044-b0ac-771e58cb8641\" (UID: \"d26727be-6f6f-4044-b0ac-771e58cb8641\") "
Mar 20 14:00:48 crc kubenswrapper[4973]: I0320 14:00:48.873827 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d26727be-6f6f-4044-b0ac-771e58cb8641-inventory\") pod \"d26727be-6f6f-4044-b0ac-771e58cb8641\" (UID: \"d26727be-6f6f-4044-b0ac-771e58cb8641\") "
Mar 20 14:00:48 crc kubenswrapper[4973]: I0320 14:00:48.874006 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-94pwv\" (UniqueName: \"kubernetes.io/projected/d26727be-6f6f-4044-b0ac-771e58cb8641-kube-api-access-94pwv\") pod \"d26727be-6f6f-4044-b0ac-771e58cb8641\" (UID: \"d26727be-6f6f-4044-b0ac-771e58cb8641\") "
Mar 20 14:00:48 crc kubenswrapper[4973]: I0320 14:00:48.881524 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d26727be-6f6f-4044-b0ac-771e58cb8641-kube-api-access-94pwv" (OuterVolumeSpecName: "kube-api-access-94pwv") pod "d26727be-6f6f-4044-b0ac-771e58cb8641" (UID: "d26727be-6f6f-4044-b0ac-771e58cb8641"). InnerVolumeSpecName "kube-api-access-94pwv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:00:48 crc kubenswrapper[4973]: E0320 14:00:48.911304 4973 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d26727be-6f6f-4044-b0ac-771e58cb8641-inventory podName:d26727be-6f6f-4044-b0ac-771e58cb8641 nodeName:}" failed. No retries permitted until 2026-03-20 14:00:49.411265747 +0000 UTC m=+2370.154935481 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "inventory" (UniqueName: "kubernetes.io/secret/d26727be-6f6f-4044-b0ac-771e58cb8641-inventory") pod "d26727be-6f6f-4044-b0ac-771e58cb8641" (UID: "d26727be-6f6f-4044-b0ac-771e58cb8641") : error deleting /var/lib/kubelet/pods/d26727be-6f6f-4044-b0ac-771e58cb8641/volume-subpaths: remove /var/lib/kubelet/pods/d26727be-6f6f-4044-b0ac-771e58cb8641/volume-subpaths: no such file or directory
Mar 20 14:00:48 crc kubenswrapper[4973]: I0320 14:00:48.914624 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d26727be-6f6f-4044-b0ac-771e58cb8641-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "d26727be-6f6f-4044-b0ac-771e58cb8641" (UID: "d26727be-6f6f-4044-b0ac-771e58cb8641"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:00:48 crc kubenswrapper[4973]: I0320 14:00:48.951810 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee" Mar 20 14:00:48 crc kubenswrapper[4973]: E0320 14:00:48.952057 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:00:48 crc kubenswrapper[4973]: I0320 14:00:48.979479 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/d26727be-6f6f-4044-b0ac-771e58cb8641-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:48 crc kubenswrapper[4973]: I0320 14:00:48.979549 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-94pwv\" (UniqueName: \"kubernetes.io/projected/d26727be-6f6f-4044-b0ac-771e58cb8641-kube-api-access-94pwv\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.354432 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7" event={"ID":"d26727be-6f6f-4044-b0ac-771e58cb8641","Type":"ContainerDied","Data":"2afbfcd8b512f89ec1d3e570ff6803eac746ea8951fd80ae02f16a0acbc4cad2"} Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.354767 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2afbfcd8b512f89ec1d3e570ff6803eac746ea8951fd80ae02f16a0acbc4cad2" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.354479 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.440791 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs"] Mar 20 14:00:49 crc kubenswrapper[4973]: E0320 14:00:49.441315 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d26727be-6f6f-4044-b0ac-771e58cb8641" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.441351 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="d26727be-6f6f-4044-b0ac-771e58cb8641" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.441617 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="d26727be-6f6f-4044-b0ac-771e58cb8641" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.442535 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.444956 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.444986 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.445006 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.445236 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.445313 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.465855 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs"] Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.491921 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/d26727be-6f6f-4044-b0ac-771e58cb8641-inventory\") pod \"d26727be-6f6f-4044-b0ac-771e58cb8641\" (UID: \"d26727be-6f6f-4044-b0ac-771e58cb8641\") " Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.492309 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: 
\"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.492383 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.492419 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.492444 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.492495 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: 
\"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.492519 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk8hq\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-kube-api-access-hk8hq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.492541 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.492573 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.492684 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: 
\"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.492720 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.492747 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.493004 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.493169 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: 
\"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.493238 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.493461 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.493518 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.496718 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d26727be-6f6f-4044-b0ac-771e58cb8641-inventory" (OuterVolumeSpecName: "inventory") pod "d26727be-6f6f-4044-b0ac-771e58cb8641" (UID: "d26727be-6f6f-4044-b0ac-771e58cb8641"). 
InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.595936 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.596212 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.596299 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.596419 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.596539 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.596642 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk8hq\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-kube-api-access-hk8hq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.596721 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.596810 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 
20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.596967 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.597082 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.597177 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.597281 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.597412 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.597517 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.597668 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.597755 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.597901 4973 reconciler_common.go:293] "Volume detached for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/d26727be-6f6f-4044-b0ac-771e58cb8641-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.603061 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.603355 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.603685 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.603987 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.604084 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.604302 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.604361 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.604921 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.605759 4973 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-telemetry-power-monitoring-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.605821 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.606399 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.606478 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.607029 4973 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.607371 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.611598 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.614293 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk8hq\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-kube-api-access-hk8hq\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:49 crc kubenswrapper[4973]: I0320 14:00:49.765172 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:00:50 crc kubenswrapper[4973]: I0320 14:00:50.358475 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs"] Mar 20 14:00:51 crc kubenswrapper[4973]: I0320 14:00:51.391031 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" event={"ID":"fbb12599-731a-426e-9032-80f723072a75","Type":"ContainerStarted","Data":"bdd4f6c1d12102f5464420bb926818a0bb8c53024b8a0983dc92fc872113be7f"} Mar 20 14:00:51 crc kubenswrapper[4973]: I0320 14:00:51.391459 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" event={"ID":"fbb12599-731a-426e-9032-80f723072a75","Type":"ContainerStarted","Data":"8aadfd66eab902474472cc15561926d49b8dd558c735ca9f285d543ef97ae396"} Mar 20 14:00:51 crc kubenswrapper[4973]: I0320 14:00:51.421598 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" podStartSLOduration=1.775186838 podStartE2EDuration="2.421569958s" podCreationTimestamp="2026-03-20 14:00:49 +0000 UTC" firstStartedPulling="2026-03-20 14:00:50.36273135 +0000 UTC m=+2371.106401094" lastFinishedPulling="2026-03-20 14:00:51.00911447 +0000 UTC m=+2371.752784214" observedRunningTime="2026-03-20 14:00:51.411831451 +0000 UTC m=+2372.155501205" watchObservedRunningTime="2026-03-20 14:00:51.421569958 +0000 UTC m=+2372.165239702" Mar 20 14:00:58 crc kubenswrapper[4973]: I0320 14:00:58.465372 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m77wh"] Mar 20 14:00:58 crc kubenswrapper[4973]: I0320 14:00:58.469101 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m77wh" Mar 20 14:00:58 crc kubenswrapper[4973]: I0320 14:00:58.482409 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m77wh"] Mar 20 14:00:58 crc kubenswrapper[4973]: I0320 14:00:58.645830 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ea80a7b-776d-452e-a25c-6c3c381a6229-utilities\") pod \"community-operators-m77wh\" (UID: \"0ea80a7b-776d-452e-a25c-6c3c381a6229\") " pod="openshift-marketplace/community-operators-m77wh" Mar 20 14:00:58 crc kubenswrapper[4973]: I0320 14:00:58.645977 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjwbn\" (UniqueName: \"kubernetes.io/projected/0ea80a7b-776d-452e-a25c-6c3c381a6229-kube-api-access-fjwbn\") pod \"community-operators-m77wh\" (UID: \"0ea80a7b-776d-452e-a25c-6c3c381a6229\") " pod="openshift-marketplace/community-operators-m77wh" Mar 20 14:00:58 crc kubenswrapper[4973]: I0320 14:00:58.647037 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ea80a7b-776d-452e-a25c-6c3c381a6229-catalog-content\") pod \"community-operators-m77wh\" (UID: \"0ea80a7b-776d-452e-a25c-6c3c381a6229\") " pod="openshift-marketplace/community-operators-m77wh" Mar 20 14:00:58 crc kubenswrapper[4973]: I0320 14:00:58.751633 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ea80a7b-776d-452e-a25c-6c3c381a6229-utilities\") pod \"community-operators-m77wh\" (UID: \"0ea80a7b-776d-452e-a25c-6c3c381a6229\") " pod="openshift-marketplace/community-operators-m77wh" Mar 20 14:00:58 crc kubenswrapper[4973]: I0320 14:00:58.752178 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fjwbn\" (UniqueName: \"kubernetes.io/projected/0ea80a7b-776d-452e-a25c-6c3c381a6229-kube-api-access-fjwbn\") pod \"community-operators-m77wh\" (UID: \"0ea80a7b-776d-452e-a25c-6c3c381a6229\") " pod="openshift-marketplace/community-operators-m77wh" Mar 20 14:00:58 crc kubenswrapper[4973]: I0320 14:00:58.752256 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ea80a7b-776d-452e-a25c-6c3c381a6229-catalog-content\") pod \"community-operators-m77wh\" (UID: \"0ea80a7b-776d-452e-a25c-6c3c381a6229\") " pod="openshift-marketplace/community-operators-m77wh" Mar 20 14:00:58 crc kubenswrapper[4973]: I0320 14:00:58.752505 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ea80a7b-776d-452e-a25c-6c3c381a6229-utilities\") pod \"community-operators-m77wh\" (UID: \"0ea80a7b-776d-452e-a25c-6c3c381a6229\") " pod="openshift-marketplace/community-operators-m77wh" Mar 20 14:00:58 crc kubenswrapper[4973]: I0320 14:00:58.752872 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ea80a7b-776d-452e-a25c-6c3c381a6229-catalog-content\") pod \"community-operators-m77wh\" (UID: \"0ea80a7b-776d-452e-a25c-6c3c381a6229\") " pod="openshift-marketplace/community-operators-m77wh" Mar 20 14:00:58 crc kubenswrapper[4973]: I0320 14:00:58.784512 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjwbn\" (UniqueName: \"kubernetes.io/projected/0ea80a7b-776d-452e-a25c-6c3c381a6229-kube-api-access-fjwbn\") pod \"community-operators-m77wh\" (UID: \"0ea80a7b-776d-452e-a25c-6c3c381a6229\") " pod="openshift-marketplace/community-operators-m77wh" Mar 20 14:00:58 crc kubenswrapper[4973]: I0320 14:00:58.858506 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m77wh" Mar 20 14:00:59 crc kubenswrapper[4973]: I0320 14:00:59.475063 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m77wh"] Mar 20 14:00:59 crc kubenswrapper[4973]: I0320 14:00:59.961710 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee" Mar 20 14:00:59 crc kubenswrapper[4973]: E0320 14:00:59.962021 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.133095 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29566921-tn9xk"] Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.135511 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29566921-tn9xk" Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.147174 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29566921-tn9xk"] Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.290304 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3defeff9-b2c0-4236-a0f2-c91f57208005-fernet-keys\") pod \"keystone-cron-29566921-tn9xk\" (UID: \"3defeff9-b2c0-4236-a0f2-c91f57208005\") " pod="openstack/keystone-cron-29566921-tn9xk" Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.290414 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3defeff9-b2c0-4236-a0f2-c91f57208005-combined-ca-bundle\") pod \"keystone-cron-29566921-tn9xk\" (UID: \"3defeff9-b2c0-4236-a0f2-c91f57208005\") " pod="openstack/keystone-cron-29566921-tn9xk" Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.290491 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3defeff9-b2c0-4236-a0f2-c91f57208005-config-data\") pod \"keystone-cron-29566921-tn9xk\" (UID: \"3defeff9-b2c0-4236-a0f2-c91f57208005\") " pod="openstack/keystone-cron-29566921-tn9xk" Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.290683 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-979hx\" (UniqueName: \"kubernetes.io/projected/3defeff9-b2c0-4236-a0f2-c91f57208005-kube-api-access-979hx\") pod \"keystone-cron-29566921-tn9xk\" (UID: \"3defeff9-b2c0-4236-a0f2-c91f57208005\") " pod="openstack/keystone-cron-29566921-tn9xk" Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.393184 4973 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3defeff9-b2c0-4236-a0f2-c91f57208005-fernet-keys\") pod \"keystone-cron-29566921-tn9xk\" (UID: \"3defeff9-b2c0-4236-a0f2-c91f57208005\") " pod="openstack/keystone-cron-29566921-tn9xk" Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.393268 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3defeff9-b2c0-4236-a0f2-c91f57208005-combined-ca-bundle\") pod \"keystone-cron-29566921-tn9xk\" (UID: \"3defeff9-b2c0-4236-a0f2-c91f57208005\") " pod="openstack/keystone-cron-29566921-tn9xk" Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.393476 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3defeff9-b2c0-4236-a0f2-c91f57208005-config-data\") pod \"keystone-cron-29566921-tn9xk\" (UID: \"3defeff9-b2c0-4236-a0f2-c91f57208005\") " pod="openstack/keystone-cron-29566921-tn9xk" Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.393632 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-979hx\" (UniqueName: \"kubernetes.io/projected/3defeff9-b2c0-4236-a0f2-c91f57208005-kube-api-access-979hx\") pod \"keystone-cron-29566921-tn9xk\" (UID: \"3defeff9-b2c0-4236-a0f2-c91f57208005\") " pod="openstack/keystone-cron-29566921-tn9xk" Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.401281 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3defeff9-b2c0-4236-a0f2-c91f57208005-combined-ca-bundle\") pod \"keystone-cron-29566921-tn9xk\" (UID: \"3defeff9-b2c0-4236-a0f2-c91f57208005\") " pod="openstack/keystone-cron-29566921-tn9xk" Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.401334 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/3defeff9-b2c0-4236-a0f2-c91f57208005-fernet-keys\") pod \"keystone-cron-29566921-tn9xk\" (UID: \"3defeff9-b2c0-4236-a0f2-c91f57208005\") " pod="openstack/keystone-cron-29566921-tn9xk" Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.401923 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3defeff9-b2c0-4236-a0f2-c91f57208005-config-data\") pod \"keystone-cron-29566921-tn9xk\" (UID: \"3defeff9-b2c0-4236-a0f2-c91f57208005\") " pod="openstack/keystone-cron-29566921-tn9xk" Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.412771 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-979hx\" (UniqueName: \"kubernetes.io/projected/3defeff9-b2c0-4236-a0f2-c91f57208005-kube-api-access-979hx\") pod \"keystone-cron-29566921-tn9xk\" (UID: \"3defeff9-b2c0-4236-a0f2-c91f57208005\") " pod="openstack/keystone-cron-29566921-tn9xk" Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.468552 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29566921-tn9xk" Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.499908 4973 generic.go:334] "Generic (PLEG): container finished" podID="0ea80a7b-776d-452e-a25c-6c3c381a6229" containerID="d9aae1d5a05536520efc51d0cff949ff786e6e6884f384f1d158ff7e1eb62377" exitCode=0 Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.499970 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m77wh" event={"ID":"0ea80a7b-776d-452e-a25c-6c3c381a6229","Type":"ContainerDied","Data":"d9aae1d5a05536520efc51d0cff949ff786e6e6884f384f1d158ff7e1eb62377"} Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.500048 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m77wh" event={"ID":"0ea80a7b-776d-452e-a25c-6c3c381a6229","Type":"ContainerStarted","Data":"8d9be3722b8bfbf87346bb8b294d8c2e74619ef20a1bcc690fa4e2e145e00613"} Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.838945 4973 scope.go:117] "RemoveContainer" containerID="0d23c3baf1062e9d49890ec6be219164ac05264162d0919a1284f9aaa63d4fc8" Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.889101 4973 scope.go:117] "RemoveContainer" containerID="a8cf42a9d59625b7d17e2f95f60f398813c0fe69038ad0c286fec8890d8a13d7" Mar 20 14:01:00 crc kubenswrapper[4973]: I0320 14:01:00.993492 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29566921-tn9xk"] Mar 20 14:01:01 crc kubenswrapper[4973]: W0320 14:01:01.003443 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3defeff9_b2c0_4236_a0f2_c91f57208005.slice/crio-e08450790a4f927eb86f7940a3118d0c643193af179ab2b3a54c4688d1e7c982 WatchSource:0}: Error finding container e08450790a4f927eb86f7940a3118d0c643193af179ab2b3a54c4688d1e7c982: Status 404 returned error can't find the container with id 
e08450790a4f927eb86f7940a3118d0c643193af179ab2b3a54c4688d1e7c982 Mar 20 14:01:01 crc kubenswrapper[4973]: I0320 14:01:01.511954 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566921-tn9xk" event={"ID":"3defeff9-b2c0-4236-a0f2-c91f57208005","Type":"ContainerStarted","Data":"1b40c43d7e4e3844691075f0e1632f5fd7ae11a4bfb9c4752fd5251db88da3fc"} Mar 20 14:01:01 crc kubenswrapper[4973]: I0320 14:01:01.512254 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566921-tn9xk" event={"ID":"3defeff9-b2c0-4236-a0f2-c91f57208005","Type":"ContainerStarted","Data":"e08450790a4f927eb86f7940a3118d0c643193af179ab2b3a54c4688d1e7c982"} Mar 20 14:01:01 crc kubenswrapper[4973]: I0320 14:01:01.514785 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m77wh" event={"ID":"0ea80a7b-776d-452e-a25c-6c3c381a6229","Type":"ContainerStarted","Data":"be42e19e85a943a18c58d151f2d20a4f114de01157ac22dacea725c3acc5cc82"} Mar 20 14:01:01 crc kubenswrapper[4973]: I0320 14:01:01.534001 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29566921-tn9xk" podStartSLOduration=1.5339776010000001 podStartE2EDuration="1.533977601s" podCreationTimestamp="2026-03-20 14:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:01:01.526826025 +0000 UTC m=+2382.270495789" watchObservedRunningTime="2026-03-20 14:01:01.533977601 +0000 UTC m=+2382.277647345" Mar 20 14:01:03 crc kubenswrapper[4973]: I0320 14:01:03.539667 4973 generic.go:334] "Generic (PLEG): container finished" podID="0ea80a7b-776d-452e-a25c-6c3c381a6229" containerID="be42e19e85a943a18c58d151f2d20a4f114de01157ac22dacea725c3acc5cc82" exitCode=0 Mar 20 14:01:03 crc kubenswrapper[4973]: I0320 14:01:03.539794 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-m77wh" event={"ID":"0ea80a7b-776d-452e-a25c-6c3c381a6229","Type":"ContainerDied","Data":"be42e19e85a943a18c58d151f2d20a4f114de01157ac22dacea725c3acc5cc82"} Mar 20 14:01:03 crc kubenswrapper[4973]: I0320 14:01:03.543540 4973 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:01:05 crc kubenswrapper[4973]: I0320 14:01:05.572942 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m77wh" event={"ID":"0ea80a7b-776d-452e-a25c-6c3c381a6229","Type":"ContainerStarted","Data":"c38167ee7a6239b96671a438929f28222eaa9cff32f08f6a2814c5ad8f10c4db"} Mar 20 14:01:05 crc kubenswrapper[4973]: I0320 14:01:05.574783 4973 generic.go:334] "Generic (PLEG): container finished" podID="3defeff9-b2c0-4236-a0f2-c91f57208005" containerID="1b40c43d7e4e3844691075f0e1632f5fd7ae11a4bfb9c4752fd5251db88da3fc" exitCode=0 Mar 20 14:01:05 crc kubenswrapper[4973]: I0320 14:01:05.574817 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566921-tn9xk" event={"ID":"3defeff9-b2c0-4236-a0f2-c91f57208005","Type":"ContainerDied","Data":"1b40c43d7e4e3844691075f0e1632f5fd7ae11a4bfb9c4752fd5251db88da3fc"} Mar 20 14:01:05 crc kubenswrapper[4973]: I0320 14:01:05.602778 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m77wh" podStartSLOduration=3.7766448009999998 podStartE2EDuration="7.602754974s" podCreationTimestamp="2026-03-20 14:00:58 +0000 UTC" firstStartedPulling="2026-03-20 14:01:00.502722789 +0000 UTC m=+2381.246392533" lastFinishedPulling="2026-03-20 14:01:04.328832962 +0000 UTC m=+2385.072502706" observedRunningTime="2026-03-20 14:01:05.592480091 +0000 UTC m=+2386.336149845" watchObservedRunningTime="2026-03-20 14:01:05.602754974 +0000 UTC m=+2386.346424718" Mar 20 14:01:07 crc kubenswrapper[4973]: I0320 14:01:07.025755 4973 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29566921-tn9xk" Mar 20 14:01:07 crc kubenswrapper[4973]: I0320 14:01:07.227961 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3defeff9-b2c0-4236-a0f2-c91f57208005-combined-ca-bundle\") pod \"3defeff9-b2c0-4236-a0f2-c91f57208005\" (UID: \"3defeff9-b2c0-4236-a0f2-c91f57208005\") " Mar 20 14:01:07 crc kubenswrapper[4973]: I0320 14:01:07.228385 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-979hx\" (UniqueName: \"kubernetes.io/projected/3defeff9-b2c0-4236-a0f2-c91f57208005-kube-api-access-979hx\") pod \"3defeff9-b2c0-4236-a0f2-c91f57208005\" (UID: \"3defeff9-b2c0-4236-a0f2-c91f57208005\") " Mar 20 14:01:07 crc kubenswrapper[4973]: I0320 14:01:07.228544 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3defeff9-b2c0-4236-a0f2-c91f57208005-fernet-keys\") pod \"3defeff9-b2c0-4236-a0f2-c91f57208005\" (UID: \"3defeff9-b2c0-4236-a0f2-c91f57208005\") " Mar 20 14:01:07 crc kubenswrapper[4973]: I0320 14:01:07.228581 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3defeff9-b2c0-4236-a0f2-c91f57208005-config-data\") pod \"3defeff9-b2c0-4236-a0f2-c91f57208005\" (UID: \"3defeff9-b2c0-4236-a0f2-c91f57208005\") " Mar 20 14:01:07 crc kubenswrapper[4973]: I0320 14:01:07.233917 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3defeff9-b2c0-4236-a0f2-c91f57208005-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "3defeff9-b2c0-4236-a0f2-c91f57208005" (UID: "3defeff9-b2c0-4236-a0f2-c91f57208005"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:07 crc kubenswrapper[4973]: I0320 14:01:07.234476 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3defeff9-b2c0-4236-a0f2-c91f57208005-kube-api-access-979hx" (OuterVolumeSpecName: "kube-api-access-979hx") pod "3defeff9-b2c0-4236-a0f2-c91f57208005" (UID: "3defeff9-b2c0-4236-a0f2-c91f57208005"). InnerVolumeSpecName "kube-api-access-979hx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:01:07 crc kubenswrapper[4973]: I0320 14:01:07.261307 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3defeff9-b2c0-4236-a0f2-c91f57208005-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3defeff9-b2c0-4236-a0f2-c91f57208005" (UID: "3defeff9-b2c0-4236-a0f2-c91f57208005"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:07 crc kubenswrapper[4973]: I0320 14:01:07.293931 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3defeff9-b2c0-4236-a0f2-c91f57208005-config-data" (OuterVolumeSpecName: "config-data") pod "3defeff9-b2c0-4236-a0f2-c91f57208005" (UID: "3defeff9-b2c0-4236-a0f2-c91f57208005"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:07 crc kubenswrapper[4973]: I0320 14:01:07.331394 4973 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/3defeff9-b2c0-4236-a0f2-c91f57208005-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:07 crc kubenswrapper[4973]: I0320 14:01:07.331425 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3defeff9-b2c0-4236-a0f2-c91f57208005-config-data\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:07 crc kubenswrapper[4973]: I0320 14:01:07.331436 4973 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3defeff9-b2c0-4236-a0f2-c91f57208005-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:07 crc kubenswrapper[4973]: I0320 14:01:07.331447 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-979hx\" (UniqueName: \"kubernetes.io/projected/3defeff9-b2c0-4236-a0f2-c91f57208005-kube-api-access-979hx\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:07 crc kubenswrapper[4973]: I0320 14:01:07.595203 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29566921-tn9xk" event={"ID":"3defeff9-b2c0-4236-a0f2-c91f57208005","Type":"ContainerDied","Data":"e08450790a4f927eb86f7940a3118d0c643193af179ab2b3a54c4688d1e7c982"} Mar 20 14:01:07 crc kubenswrapper[4973]: I0320 14:01:07.595251 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29566921-tn9xk" Mar 20 14:01:07 crc kubenswrapper[4973]: I0320 14:01:07.595252 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e08450790a4f927eb86f7940a3118d0c643193af179ab2b3a54c4688d1e7c982" Mar 20 14:01:08 crc kubenswrapper[4973]: I0320 14:01:08.858853 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m77wh" Mar 20 14:01:08 crc kubenswrapper[4973]: I0320 14:01:08.859235 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m77wh" Mar 20 14:01:08 crc kubenswrapper[4973]: I0320 14:01:08.916160 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m77wh" Mar 20 14:01:09 crc kubenswrapper[4973]: I0320 14:01:09.670219 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m77wh" Mar 20 14:01:09 crc kubenswrapper[4973]: I0320 14:01:09.723003 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m77wh"] Mar 20 14:01:11 crc kubenswrapper[4973]: I0320 14:01:11.633275 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m77wh" podUID="0ea80a7b-776d-452e-a25c-6c3c381a6229" containerName="registry-server" containerID="cri-o://c38167ee7a6239b96671a438929f28222eaa9cff32f08f6a2814c5ad8f10c4db" gracePeriod=2 Mar 20 14:01:11 crc kubenswrapper[4973]: I0320 14:01:11.951603 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee" Mar 20 14:01:11 crc kubenswrapper[4973]: E0320 14:01:11.952192 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.189880 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m77wh" Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.261544 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ea80a7b-776d-452e-a25c-6c3c381a6229-utilities\") pod \"0ea80a7b-776d-452e-a25c-6c3c381a6229\" (UID: \"0ea80a7b-776d-452e-a25c-6c3c381a6229\") " Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.261863 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ea80a7b-776d-452e-a25c-6c3c381a6229-catalog-content\") pod \"0ea80a7b-776d-452e-a25c-6c3c381a6229\" (UID: \"0ea80a7b-776d-452e-a25c-6c3c381a6229\") " Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.261923 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjwbn\" (UniqueName: \"kubernetes.io/projected/0ea80a7b-776d-452e-a25c-6c3c381a6229-kube-api-access-fjwbn\") pod \"0ea80a7b-776d-452e-a25c-6c3c381a6229\" (UID: \"0ea80a7b-776d-452e-a25c-6c3c381a6229\") " Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.265586 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ea80a7b-776d-452e-a25c-6c3c381a6229-utilities" (OuterVolumeSpecName: "utilities") pod "0ea80a7b-776d-452e-a25c-6c3c381a6229" (UID: "0ea80a7b-776d-452e-a25c-6c3c381a6229"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.270607 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0ea80a7b-776d-452e-a25c-6c3c381a6229-kube-api-access-fjwbn" (OuterVolumeSpecName: "kube-api-access-fjwbn") pod "0ea80a7b-776d-452e-a25c-6c3c381a6229" (UID: "0ea80a7b-776d-452e-a25c-6c3c381a6229"). InnerVolumeSpecName "kube-api-access-fjwbn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.325730 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0ea80a7b-776d-452e-a25c-6c3c381a6229-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0ea80a7b-776d-452e-a25c-6c3c381a6229" (UID: "0ea80a7b-776d-452e-a25c-6c3c381a6229"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.366396 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0ea80a7b-776d-452e-a25c-6c3c381a6229-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.366442 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0ea80a7b-776d-452e-a25c-6c3c381a6229-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.366458 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjwbn\" (UniqueName: \"kubernetes.io/projected/0ea80a7b-776d-452e-a25c-6c3c381a6229-kube-api-access-fjwbn\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.646528 4973 generic.go:334] "Generic (PLEG): container finished" podID="0ea80a7b-776d-452e-a25c-6c3c381a6229" 
containerID="c38167ee7a6239b96671a438929f28222eaa9cff32f08f6a2814c5ad8f10c4db" exitCode=0 Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.646579 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m77wh" event={"ID":"0ea80a7b-776d-452e-a25c-6c3c381a6229","Type":"ContainerDied","Data":"c38167ee7a6239b96671a438929f28222eaa9cff32f08f6a2814c5ad8f10c4db"} Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.646620 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m77wh" event={"ID":"0ea80a7b-776d-452e-a25c-6c3c381a6229","Type":"ContainerDied","Data":"8d9be3722b8bfbf87346bb8b294d8c2e74619ef20a1bcc690fa4e2e145e00613"} Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.646638 4973 scope.go:117] "RemoveContainer" containerID="c38167ee7a6239b96671a438929f28222eaa9cff32f08f6a2814c5ad8f10c4db" Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.646653 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m77wh" Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.681941 4973 scope.go:117] "RemoveContainer" containerID="be42e19e85a943a18c58d151f2d20a4f114de01157ac22dacea725c3acc5cc82" Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.687483 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m77wh"] Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.698575 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-m77wh"] Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.714846 4973 scope.go:117] "RemoveContainer" containerID="d9aae1d5a05536520efc51d0cff949ff786e6e6884f384f1d158ff7e1eb62377" Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.777626 4973 scope.go:117] "RemoveContainer" containerID="c38167ee7a6239b96671a438929f28222eaa9cff32f08f6a2814c5ad8f10c4db" Mar 20 14:01:12 crc kubenswrapper[4973]: E0320 14:01:12.778210 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c38167ee7a6239b96671a438929f28222eaa9cff32f08f6a2814c5ad8f10c4db\": container with ID starting with c38167ee7a6239b96671a438929f28222eaa9cff32f08f6a2814c5ad8f10c4db not found: ID does not exist" containerID="c38167ee7a6239b96671a438929f28222eaa9cff32f08f6a2814c5ad8f10c4db" Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.778246 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c38167ee7a6239b96671a438929f28222eaa9cff32f08f6a2814c5ad8f10c4db"} err="failed to get container status \"c38167ee7a6239b96671a438929f28222eaa9cff32f08f6a2814c5ad8f10c4db\": rpc error: code = NotFound desc = could not find container \"c38167ee7a6239b96671a438929f28222eaa9cff32f08f6a2814c5ad8f10c4db\": container with ID starting with c38167ee7a6239b96671a438929f28222eaa9cff32f08f6a2814c5ad8f10c4db not 
found: ID does not exist" Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.778271 4973 scope.go:117] "RemoveContainer" containerID="be42e19e85a943a18c58d151f2d20a4f114de01157ac22dacea725c3acc5cc82" Mar 20 14:01:12 crc kubenswrapper[4973]: E0320 14:01:12.778609 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be42e19e85a943a18c58d151f2d20a4f114de01157ac22dacea725c3acc5cc82\": container with ID starting with be42e19e85a943a18c58d151f2d20a4f114de01157ac22dacea725c3acc5cc82 not found: ID does not exist" containerID="be42e19e85a943a18c58d151f2d20a4f114de01157ac22dacea725c3acc5cc82" Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.778643 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be42e19e85a943a18c58d151f2d20a4f114de01157ac22dacea725c3acc5cc82"} err="failed to get container status \"be42e19e85a943a18c58d151f2d20a4f114de01157ac22dacea725c3acc5cc82\": rpc error: code = NotFound desc = could not find container \"be42e19e85a943a18c58d151f2d20a4f114de01157ac22dacea725c3acc5cc82\": container with ID starting with be42e19e85a943a18c58d151f2d20a4f114de01157ac22dacea725c3acc5cc82 not found: ID does not exist" Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.778663 4973 scope.go:117] "RemoveContainer" containerID="d9aae1d5a05536520efc51d0cff949ff786e6e6884f384f1d158ff7e1eb62377" Mar 20 14:01:12 crc kubenswrapper[4973]: E0320 14:01:12.778877 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9aae1d5a05536520efc51d0cff949ff786e6e6884f384f1d158ff7e1eb62377\": container with ID starting with d9aae1d5a05536520efc51d0cff949ff786e6e6884f384f1d158ff7e1eb62377 not found: ID does not exist" containerID="d9aae1d5a05536520efc51d0cff949ff786e6e6884f384f1d158ff7e1eb62377" Mar 20 14:01:12 crc kubenswrapper[4973]: I0320 14:01:12.778916 4973 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9aae1d5a05536520efc51d0cff949ff786e6e6884f384f1d158ff7e1eb62377"} err="failed to get container status \"d9aae1d5a05536520efc51d0cff949ff786e6e6884f384f1d158ff7e1eb62377\": rpc error: code = NotFound desc = could not find container \"d9aae1d5a05536520efc51d0cff949ff786e6e6884f384f1d158ff7e1eb62377\": container with ID starting with d9aae1d5a05536520efc51d0cff949ff786e6e6884f384f1d158ff7e1eb62377 not found: ID does not exist" Mar 20 14:01:13 crc kubenswrapper[4973]: I0320 14:01:13.963871 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0ea80a7b-776d-452e-a25c-6c3c381a6229" path="/var/lib/kubelet/pods/0ea80a7b-776d-452e-a25c-6c3c381a6229/volumes" Mar 20 14:01:24 crc kubenswrapper[4973]: I0320 14:01:24.950675 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee" Mar 20 14:01:24 crc kubenswrapper[4973]: E0320 14:01:24.951542 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:01:34 crc kubenswrapper[4973]: I0320 14:01:34.874673 4973 generic.go:334] "Generic (PLEG): container finished" podID="fbb12599-731a-426e-9032-80f723072a75" containerID="bdd4f6c1d12102f5464420bb926818a0bb8c53024b8a0983dc92fc872113be7f" exitCode=0 Mar 20 14:01:34 crc kubenswrapper[4973]: I0320 14:01:34.874762 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" 
event={"ID":"fbb12599-731a-426e-9032-80f723072a75","Type":"ContainerDied","Data":"bdd4f6c1d12102f5464420bb926818a0bb8c53024b8a0983dc92fc872113be7f"} Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.401808 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.494188 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-inventory\") pod \"fbb12599-731a-426e-9032-80f723072a75\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.494242 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-telemetry-combined-ca-bundle\") pod \"fbb12599-731a-426e-9032-80f723072a75\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.494343 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") pod \"fbb12599-731a-426e-9032-80f723072a75\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.494496 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-telemetry-power-monitoring-combined-ca-bundle\") pod \"fbb12599-731a-426e-9032-80f723072a75\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 
14:01:36.494530 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-nova-combined-ca-bundle\") pod \"fbb12599-731a-426e-9032-80f723072a75\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.494588 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-repo-setup-combined-ca-bundle\") pod \"fbb12599-731a-426e-9032-80f723072a75\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.494627 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-ovn-default-certs-0\") pod \"fbb12599-731a-426e-9032-80f723072a75\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.494657 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-ssh-key-openstack-edpm-ipam\") pod \"fbb12599-731a-426e-9032-80f723072a75\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.494727 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"fbb12599-731a-426e-9032-80f723072a75\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.494753 4973 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-ovn-combined-ca-bundle\") pod \"fbb12599-731a-426e-9032-80f723072a75\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.494819 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk8hq\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-kube-api-access-hk8hq\") pod \"fbb12599-731a-426e-9032-80f723072a75\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.494911 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-neutron-metadata-combined-ca-bundle\") pod \"fbb12599-731a-426e-9032-80f723072a75\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.494954 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"fbb12599-731a-426e-9032-80f723072a75\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.495003 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-bootstrap-combined-ca-bundle\") pod \"fbb12599-731a-426e-9032-80f723072a75\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.495058 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-libvirt-combined-ca-bundle\") pod \"fbb12599-731a-426e-9032-80f723072a75\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.495122 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"fbb12599-731a-426e-9032-80f723072a75\" (UID: \"fbb12599-731a-426e-9032-80f723072a75\") " Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.501005 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "fbb12599-731a-426e-9032-80f723072a75" (UID: "fbb12599-731a-426e-9032-80f723072a75"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.501252 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "fbb12599-731a-426e-9032-80f723072a75" (UID: "fbb12599-731a-426e-9032-80f723072a75"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.501293 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "fbb12599-731a-426e-9032-80f723072a75" (UID: "fbb12599-731a-426e-9032-80f723072a75"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.501750 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "fbb12599-731a-426e-9032-80f723072a75" (UID: "fbb12599-731a-426e-9032-80f723072a75"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.501894 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0") pod "fbb12599-731a-426e-9032-80f723072a75" (UID: "fbb12599-731a-426e-9032-80f723072a75"). InnerVolumeSpecName "openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.502810 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "fbb12599-731a-426e-9032-80f723072a75" (UID: "fbb12599-731a-426e-9032-80f723072a75"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.504169 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-kube-api-access-hk8hq" (OuterVolumeSpecName: "kube-api-access-hk8hq") pod "fbb12599-731a-426e-9032-80f723072a75" (UID: "fbb12599-731a-426e-9032-80f723072a75"). InnerVolumeSpecName "kube-api-access-hk8hq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.504529 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "fbb12599-731a-426e-9032-80f723072a75" (UID: "fbb12599-731a-426e-9032-80f723072a75"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.504574 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "fbb12599-731a-426e-9032-80f723072a75" (UID: "fbb12599-731a-426e-9032-80f723072a75"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.505560 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "fbb12599-731a-426e-9032-80f723072a75" (UID: "fbb12599-731a-426e-9032-80f723072a75"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.506542 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "fbb12599-731a-426e-9032-80f723072a75" (UID: "fbb12599-731a-426e-9032-80f723072a75"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.506981 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "fbb12599-731a-426e-9032-80f723072a75" (UID: "fbb12599-731a-426e-9032-80f723072a75"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.513730 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "fbb12599-731a-426e-9032-80f723072a75" (UID: "fbb12599-731a-426e-9032-80f723072a75"). 
InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.520116 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "fbb12599-731a-426e-9032-80f723072a75" (UID: "fbb12599-731a-426e-9032-80f723072a75"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.548282 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-inventory" (OuterVolumeSpecName: "inventory") pod "fbb12599-731a-426e-9032-80f723072a75" (UID: "fbb12599-731a-426e-9032-80f723072a75"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.551921 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fbb12599-731a-426e-9032-80f723072a75" (UID: "fbb12599-731a-426e-9032-80f723072a75"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.598571 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk8hq\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-kube-api-access-hk8hq\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.598609 4973 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.598635 4973 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.598647 4973 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.598657 4973 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.598667 4973 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.598677 4973 reconciler_common.go:293] "Volume 
detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.598687 4973 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.598696 4973 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-telemetry-power-monitoring-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.598707 4973 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.598716 4973 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.598724 4973 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.598733 4973 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-ovn-default-certs-0\") 
on node \"crc\" DevicePath \"\"" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.598741 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.598750 4973 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/fbb12599-731a-426e-9032-80f723072a75-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.598759 4973 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fbb12599-731a-426e-9032-80f723072a75-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.901043 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" event={"ID":"fbb12599-731a-426e-9032-80f723072a75","Type":"ContainerDied","Data":"8aadfd66eab902474472cc15561926d49b8dd558c735ca9f285d543ef97ae396"} Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.901093 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8aadfd66eab902474472cc15561926d49b8dd558c735ca9f285d543ef97ae396" Mar 20 14:01:36 crc kubenswrapper[4973]: I0320 14:01:36.901160 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.001042 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b"] Mar 20 14:01:37 crc kubenswrapper[4973]: E0320 14:01:37.002535 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea80a7b-776d-452e-a25c-6c3c381a6229" containerName="registry-server" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.002561 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea80a7b-776d-452e-a25c-6c3c381a6229" containerName="registry-server" Mar 20 14:01:37 crc kubenswrapper[4973]: E0320 14:01:37.002623 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3defeff9-b2c0-4236-a0f2-c91f57208005" containerName="keystone-cron" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.002632 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="3defeff9-b2c0-4236-a0f2-c91f57208005" containerName="keystone-cron" Mar 20 14:01:37 crc kubenswrapper[4973]: E0320 14:01:37.002642 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea80a7b-776d-452e-a25c-6c3c381a6229" containerName="extract-content" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.002649 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea80a7b-776d-452e-a25c-6c3c381a6229" containerName="extract-content" Mar 20 14:01:37 crc kubenswrapper[4973]: E0320 14:01:37.002667 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0ea80a7b-776d-452e-a25c-6c3c381a6229" containerName="extract-utilities" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.002676 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="0ea80a7b-776d-452e-a25c-6c3c381a6229" containerName="extract-utilities" Mar 20 14:01:37 crc kubenswrapper[4973]: E0320 14:01:37.002741 4973 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fbb12599-731a-426e-9032-80f723072a75" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.002751 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="fbb12599-731a-426e-9032-80f723072a75" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.003029 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="3defeff9-b2c0-4236-a0f2-c91f57208005" containerName="keystone-cron" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.003054 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="fbb12599-731a-426e-9032-80f723072a75" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.003077 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="0ea80a7b-776d-452e-a25c-6c3c381a6229" containerName="registry-server" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.004160 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.006505 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.010649 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.010882 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d4wjc" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.010905 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.012850 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.016267 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b"] Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.062719 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-sync-bxbkc"] Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.076056 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-bxbkc"] Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.110789 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eda7094e-499a-4027-9f8d-91360c6d9780-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w4k6b\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.111025 4973 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda7094e-499a-4027-9f8d-91360c6d9780-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w4k6b\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.111161 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75kd2\" (UniqueName: \"kubernetes.io/projected/eda7094e-499a-4027-9f8d-91360c6d9780-kube-api-access-75kd2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w4k6b\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.111220 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eda7094e-499a-4027-9f8d-91360c6d9780-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w4k6b\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.111340 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eda7094e-499a-4027-9f8d-91360c6d9780-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w4k6b\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.212963 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/eda7094e-499a-4027-9f8d-91360c6d9780-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w4k6b\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.213075 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eda7094e-499a-4027-9f8d-91360c6d9780-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w4k6b\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.213138 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eda7094e-499a-4027-9f8d-91360c6d9780-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w4k6b\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.213253 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda7094e-499a-4027-9f8d-91360c6d9780-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w4k6b\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b" Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.213379 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75kd2\" (UniqueName: \"kubernetes.io/projected/eda7094e-499a-4027-9f8d-91360c6d9780-kube-api-access-75kd2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w4k6b\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b" Mar 20 14:01:37 crc 
kubenswrapper[4973]: I0320 14:01:37.214519 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eda7094e-499a-4027-9f8d-91360c6d9780-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w4k6b\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b"
Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.217561 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eda7094e-499a-4027-9f8d-91360c6d9780-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w4k6b\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b"
Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.217728 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda7094e-499a-4027-9f8d-91360c6d9780-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w4k6b\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b"
Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.217875 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eda7094e-499a-4027-9f8d-91360c6d9780-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w4k6b\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b"
Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.234761 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75kd2\" (UniqueName: \"kubernetes.io/projected/eda7094e-499a-4027-9f8d-91360c6d9780-kube-api-access-75kd2\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-w4k6b\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b"
Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.326633 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b"
Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.898637 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b"]
Mar 20 14:01:37 crc kubenswrapper[4973]: W0320 14:01:37.908913 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeda7094e_499a_4027_9f8d_91360c6d9780.slice/crio-3bdb39d8f126c1da7fb5b4af8178dda6ba6abea0c10b13b15d6da6d5032df4bc WatchSource:0}: Error finding container 3bdb39d8f126c1da7fb5b4af8178dda6ba6abea0c10b13b15d6da6d5032df4bc: Status 404 returned error can't find the container with id 3bdb39d8f126c1da7fb5b4af8178dda6ba6abea0c10b13b15d6da6d5032df4bc
Mar 20 14:01:37 crc kubenswrapper[4973]: I0320 14:01:37.963433 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ca1a0fe5-a203-41e8-814e-fec933be3407" path="/var/lib/kubelet/pods/ca1a0fe5-a203-41e8-814e-fec933be3407/volumes"
Mar 20 14:01:38 crc kubenswrapper[4973]: I0320 14:01:38.922952 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b" event={"ID":"eda7094e-499a-4027-9f8d-91360c6d9780","Type":"ContainerStarted","Data":"07f32fcfac01964e1b686251fb9041b6ea2b98230bea0688ffebad0016050852"}
Mar 20 14:01:38 crc kubenswrapper[4973]: I0320 14:01:38.923722 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b" event={"ID":"eda7094e-499a-4027-9f8d-91360c6d9780","Type":"ContainerStarted","Data":"3bdb39d8f126c1da7fb5b4af8178dda6ba6abea0c10b13b15d6da6d5032df4bc"}
Mar 20 14:01:38 crc kubenswrapper[4973]: I0320 14:01:38.951027 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee"
Mar 20 14:01:38 crc kubenswrapper[4973]: E0320 14:01:38.951315 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:01:52 crc kubenswrapper[4973]: I0320 14:01:52.950383 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee"
Mar 20 14:01:52 crc kubenswrapper[4973]: E0320 14:01:52.951272 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:02:00 crc kubenswrapper[4973]: I0320 14:02:00.134569 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b" podStartSLOduration=23.608458321 podStartE2EDuration="24.134550349s" podCreationTimestamp="2026-03-20 14:01:36 +0000 UTC" firstStartedPulling="2026-03-20 14:01:37.91409756 +0000 UTC m=+2418.657767304" lastFinishedPulling="2026-03-20 14:01:38.440189588 +0000 UTC m=+2419.183859332" observedRunningTime="2026-03-20 14:01:38.950095561 +0000 UTC m=+2419.693765305" watchObservedRunningTime="2026-03-20 14:02:00.134550349 +0000 UTC m=+2440.878220093"
Mar 20 14:02:00 crc kubenswrapper[4973]: I0320 14:02:00.140875 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566922-qjbt8"]
Mar 20 14:02:00 crc kubenswrapper[4973]: I0320 14:02:00.143289 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-qjbt8"
Mar 20 14:02:00 crc kubenswrapper[4973]: I0320 14:02:00.146994 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc"
Mar 20 14:02:00 crc kubenswrapper[4973]: I0320 14:02:00.147250 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:02:00 crc kubenswrapper[4973]: I0320 14:02:00.154863 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:02:00 crc kubenswrapper[4973]: I0320 14:02:00.167029 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566922-qjbt8"]
Mar 20 14:02:00 crc kubenswrapper[4973]: I0320 14:02:00.167954 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnbjm\" (UniqueName: \"kubernetes.io/projected/f036cf8b-0120-4173-baea-19e177ab6734-kube-api-access-mnbjm\") pod \"auto-csr-approver-29566922-qjbt8\" (UID: \"f036cf8b-0120-4173-baea-19e177ab6734\") " pod="openshift-infra/auto-csr-approver-29566922-qjbt8"
Mar 20 14:02:00 crc kubenswrapper[4973]: I0320 14:02:00.270994 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnbjm\" (UniqueName: \"kubernetes.io/projected/f036cf8b-0120-4173-baea-19e177ab6734-kube-api-access-mnbjm\") pod \"auto-csr-approver-29566922-qjbt8\" (UID: \"f036cf8b-0120-4173-baea-19e177ab6734\") " pod="openshift-infra/auto-csr-approver-29566922-qjbt8"
Mar 20 14:02:00 crc kubenswrapper[4973]: I0320 14:02:00.295139 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnbjm\" (UniqueName: \"kubernetes.io/projected/f036cf8b-0120-4173-baea-19e177ab6734-kube-api-access-mnbjm\") pod \"auto-csr-approver-29566922-qjbt8\" (UID: \"f036cf8b-0120-4173-baea-19e177ab6734\") " pod="openshift-infra/auto-csr-approver-29566922-qjbt8"
Mar 20 14:02:00 crc kubenswrapper[4973]: I0320 14:02:00.477613 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-qjbt8"
Mar 20 14:02:00 crc kubenswrapper[4973]: I0320 14:02:00.951837 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566922-qjbt8"]
Mar 20 14:02:01 crc kubenswrapper[4973]: I0320 14:02:01.033434 4973 scope.go:117] "RemoveContainer" containerID="35d105e6f6d44075fb976737535843d99d453b8916eb02a9fdc45b90c0818a3a"
Mar 20 14:02:01 crc kubenswrapper[4973]: I0320 14:02:01.142693 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566922-qjbt8" event={"ID":"f036cf8b-0120-4173-baea-19e177ab6734","Type":"ContainerStarted","Data":"ef65811242df18154458b66a0e5af905ae65e0e1aadef64bc15166d348c0e532"}
Mar 20 14:02:04 crc kubenswrapper[4973]: I0320 14:02:04.195522 4973 generic.go:334] "Generic (PLEG): container finished" podID="f036cf8b-0120-4173-baea-19e177ab6734" containerID="7278b8d366e9bcde8f393f319abec9ef6e5ca3584b3bf0375b95042c2a3290b1" exitCode=0
Mar 20 14:02:04 crc kubenswrapper[4973]: I0320 14:02:04.195640 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566922-qjbt8" event={"ID":"f036cf8b-0120-4173-baea-19e177ab6734","Type":"ContainerDied","Data":"7278b8d366e9bcde8f393f319abec9ef6e5ca3584b3bf0375b95042c2a3290b1"}
Mar 20 14:02:05 crc kubenswrapper[4973]: I0320 14:02:05.622500 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-qjbt8"
Mar 20 14:02:05 crc kubenswrapper[4973]: I0320 14:02:05.724314 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnbjm\" (UniqueName: \"kubernetes.io/projected/f036cf8b-0120-4173-baea-19e177ab6734-kube-api-access-mnbjm\") pod \"f036cf8b-0120-4173-baea-19e177ab6734\" (UID: \"f036cf8b-0120-4173-baea-19e177ab6734\") "
Mar 20 14:02:05 crc kubenswrapper[4973]: I0320 14:02:05.742593 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f036cf8b-0120-4173-baea-19e177ab6734-kube-api-access-mnbjm" (OuterVolumeSpecName: "kube-api-access-mnbjm") pod "f036cf8b-0120-4173-baea-19e177ab6734" (UID: "f036cf8b-0120-4173-baea-19e177ab6734"). InnerVolumeSpecName "kube-api-access-mnbjm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:02:05 crc kubenswrapper[4973]: I0320 14:02:05.829717 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnbjm\" (UniqueName: \"kubernetes.io/projected/f036cf8b-0120-4173-baea-19e177ab6734-kube-api-access-mnbjm\") on node \"crc\" DevicePath \"\""
Mar 20 14:02:05 crc kubenswrapper[4973]: I0320 14:02:05.950666 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee"
Mar 20 14:02:05 crc kubenswrapper[4973]: E0320 14:02:05.950939 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:02:06 crc kubenswrapper[4973]: I0320 14:02:06.220599 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566922-qjbt8" event={"ID":"f036cf8b-0120-4173-baea-19e177ab6734","Type":"ContainerDied","Data":"ef65811242df18154458b66a0e5af905ae65e0e1aadef64bc15166d348c0e532"}
Mar 20 14:02:06 crc kubenswrapper[4973]: I0320 14:02:06.220634 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566922-qjbt8"
Mar 20 14:02:06 crc kubenswrapper[4973]: I0320 14:02:06.220640 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef65811242df18154458b66a0e5af905ae65e0e1aadef64bc15166d348c0e532"
Mar 20 14:02:06 crc kubenswrapper[4973]: I0320 14:02:06.696288 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-xjq6l"]
Mar 20 14:02:06 crc kubenswrapper[4973]: I0320 14:02:06.709145 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566916-xjq6l"]
Mar 20 14:02:07 crc kubenswrapper[4973]: I0320 14:02:07.962848 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="615a8158-d265-4082-b8c2-f342c4b9640e" path="/var/lib/kubelet/pods/615a8158-d265-4082-b8c2-f342c4b9640e/volumes"
Mar 20 14:02:18 crc kubenswrapper[4973]: I0320 14:02:18.951617 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee"
Mar 20 14:02:18 crc kubenswrapper[4973]: E0320 14:02:18.952447 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:02:21 crc kubenswrapper[4973]: I0320 14:02:21.037543 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-jzhs8"]
Mar 20 14:02:21 crc kubenswrapper[4973]: I0320 14:02:21.052722 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-jzhs8"]
Mar 20 14:02:21 crc kubenswrapper[4973]: I0320 14:02:21.965312 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0b8e77b-1b7f-47ab-817f-54a700c8d2d9" path="/var/lib/kubelet/pods/f0b8e77b-1b7f-47ab-817f-54a700c8d2d9/volumes"
Mar 20 14:02:30 crc kubenswrapper[4973]: I0320 14:02:30.960110 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee"
Mar 20 14:02:30 crc kubenswrapper[4973]: E0320 14:02:30.961174 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:02:38 crc kubenswrapper[4973]: I0320 14:02:38.558326 4973 generic.go:334] "Generic (PLEG): container finished" podID="eda7094e-499a-4027-9f8d-91360c6d9780" containerID="07f32fcfac01964e1b686251fb9041b6ea2b98230bea0688ffebad0016050852" exitCode=0
Mar 20 14:02:38 crc kubenswrapper[4973]: I0320 14:02:38.558406 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b" event={"ID":"eda7094e-499a-4027-9f8d-91360c6d9780","Type":"ContainerDied","Data":"07f32fcfac01964e1b686251fb9041b6ea2b98230bea0688ffebad0016050852"}
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.137756 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.327646 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda7094e-499a-4027-9f8d-91360c6d9780-ovn-combined-ca-bundle\") pod \"eda7094e-499a-4027-9f8d-91360c6d9780\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") "
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.327781 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eda7094e-499a-4027-9f8d-91360c6d9780-inventory\") pod \"eda7094e-499a-4027-9f8d-91360c6d9780\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") "
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.328018 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eda7094e-499a-4027-9f8d-91360c6d9780-ovncontroller-config-0\") pod \"eda7094e-499a-4027-9f8d-91360c6d9780\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") "
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.328096 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eda7094e-499a-4027-9f8d-91360c6d9780-ssh-key-openstack-edpm-ipam\") pod \"eda7094e-499a-4027-9f8d-91360c6d9780\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") "
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.328201 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75kd2\" (UniqueName: \"kubernetes.io/projected/eda7094e-499a-4027-9f8d-91360c6d9780-kube-api-access-75kd2\") pod \"eda7094e-499a-4027-9f8d-91360c6d9780\" (UID: \"eda7094e-499a-4027-9f8d-91360c6d9780\") "
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.334442 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eda7094e-499a-4027-9f8d-91360c6d9780-kube-api-access-75kd2" (OuterVolumeSpecName: "kube-api-access-75kd2") pod "eda7094e-499a-4027-9f8d-91360c6d9780" (UID: "eda7094e-499a-4027-9f8d-91360c6d9780"). InnerVolumeSpecName "kube-api-access-75kd2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.336556 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eda7094e-499a-4027-9f8d-91360c6d9780-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "eda7094e-499a-4027-9f8d-91360c6d9780" (UID: "eda7094e-499a-4027-9f8d-91360c6d9780"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.362500 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eda7094e-499a-4027-9f8d-91360c6d9780-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "eda7094e-499a-4027-9f8d-91360c6d9780" (UID: "eda7094e-499a-4027-9f8d-91360c6d9780"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.366255 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eda7094e-499a-4027-9f8d-91360c6d9780-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "eda7094e-499a-4027-9f8d-91360c6d9780" (UID: "eda7094e-499a-4027-9f8d-91360c6d9780"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.372071 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eda7094e-499a-4027-9f8d-91360c6d9780-inventory" (OuterVolumeSpecName: "inventory") pod "eda7094e-499a-4027-9f8d-91360c6d9780" (UID: "eda7094e-499a-4027-9f8d-91360c6d9780"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.431995 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75kd2\" (UniqueName: \"kubernetes.io/projected/eda7094e-499a-4027-9f8d-91360c6d9780-kube-api-access-75kd2\") on node \"crc\" DevicePath \"\""
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.432248 4973 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eda7094e-499a-4027-9f8d-91360c6d9780-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.432355 4973 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eda7094e-499a-4027-9f8d-91360c6d9780-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.432471 4973 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/eda7094e-499a-4027-9f8d-91360c6d9780-ovncontroller-config-0\") on node \"crc\" DevicePath \"\""
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.432570 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/eda7094e-499a-4027-9f8d-91360c6d9780-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.578230 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b" event={"ID":"eda7094e-499a-4027-9f8d-91360c6d9780","Type":"ContainerDied","Data":"3bdb39d8f126c1da7fb5b4af8178dda6ba6abea0c10b13b15d6da6d5032df4bc"}
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.578522 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bdb39d8f126c1da7fb5b4af8178dda6ba6abea0c10b13b15d6da6d5032df4bc"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.578282 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-w4k6b"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.737225 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"]
Mar 20 14:02:40 crc kubenswrapper[4973]: E0320 14:02:40.737943 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eda7094e-499a-4027-9f8d-91360c6d9780" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.737974 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="eda7094e-499a-4027-9f8d-91360c6d9780" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 20 14:02:40 crc kubenswrapper[4973]: E0320 14:02:40.738016 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f036cf8b-0120-4173-baea-19e177ab6734" containerName="oc"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.738025 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f036cf8b-0120-4173-baea-19e177ab6734" containerName="oc"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.738316 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="eda7094e-499a-4027-9f8d-91360c6d9780" containerName="ovn-edpm-deployment-openstack-edpm-ipam"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.738381 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="f036cf8b-0120-4173-baea-19e177ab6734" containerName="oc"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.742437 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.745238 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.745261 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d4wjc"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.745272 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.745242 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.745567 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.746794 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.755530 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"]
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.841218 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.841289 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.841399 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcgbm\" (UniqueName: \"kubernetes.io/projected/ad27159b-a222-439e-985c-d0164bb4eb21-kube-api-access-bcgbm\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.841476 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.841614 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.841650 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.945326 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.945563 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.946319 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.946586 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.946714 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.946818 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcgbm\" (UniqueName: \"kubernetes.io/projected/ad27159b-a222-439e-985c-d0164bb4eb21-kube-api-access-bcgbm\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.950456 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.950549 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.952389 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.963387 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.964005 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:40 crc kubenswrapper[4973]: I0320 14:02:40.968405 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcgbm\" (UniqueName: \"kubernetes.io/projected/ad27159b-a222-439e-985c-d0164bb4eb21-kube-api-access-bcgbm\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:41 crc kubenswrapper[4973]: I0320 14:02:41.059225 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:02:41 crc kubenswrapper[4973]: I0320 14:02:41.585732 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"]
Mar 20 14:02:42 crc kubenswrapper[4973]: I0320 14:02:42.607218 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8" event={"ID":"ad27159b-a222-439e-985c-d0164bb4eb21","Type":"ContainerStarted","Data":"9214c7f05a10ce010cd089eff9aed20c06ea7d6a57f95c72540f951e861ab50e"}
Mar 20 14:02:43 crc kubenswrapper[4973]: I0320 14:02:43.628134 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8" event={"ID":"ad27159b-a222-439e-985c-d0164bb4eb21","Type":"ContainerStarted","Data":"9d2ef815beb04154b1047b985505dcee72771f7c711983ff4209544196f20826"}
Mar 20 14:02:43 crc kubenswrapper[4973]: I0320 14:02:43.673711 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8" podStartSLOduration=2.853797391 podStartE2EDuration="3.673682883s" podCreationTimestamp="2026-03-20 14:02:40 +0000 UTC" firstStartedPulling="2026-03-20 14:02:41.593866098 +0000 UTC m=+2482.337535842" lastFinishedPulling="2026-03-20 14:02:42.41375159 +0000 UTC m=+2483.157421334" observedRunningTime="2026-03-20 14:02:43.661640553 +0000 UTC m=+2484.405310297" watchObservedRunningTime="2026-03-20 14:02:43.673682883 +0000 UTC m=+2484.417352627"
Mar 20 14:02:44 crc kubenswrapper[4973]: I0320 14:02:44.950355 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee"
Mar 20 14:02:44 crc kubenswrapper[4973]: E0320 14:02:44.951069 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:02:57 crc kubenswrapper[4973]: I0320 14:02:57.950843 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee"
Mar 20 14:02:57 crc kubenswrapper[4973]: E0320 14:02:57.951707 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:03:01 crc kubenswrapper[4973]: I0320 14:03:01.139086 4973 scope.go:117] "RemoveContainer" containerID="af88b9782db03543db3db15178949f15507b71f0fc9b6eb4b591811a54e9f89b"
Mar 20 14:03:01 crc kubenswrapper[4973]: I0320 14:03:01.168093 4973 scope.go:117] "RemoveContainer" containerID="55139a2b61c755778065606c6b40bb6810f06b83d0eb1937845018b6d353eced"
Mar 20 14:03:12 crc kubenswrapper[4973]: I0320 14:03:12.951293 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee"
Mar 20 14:03:12 crc kubenswrapper[4973]: E0320 14:03:12.952382 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:03:27 crc kubenswrapper[4973]: I0320 14:03:27.951853 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee"
Mar 20 14:03:27 crc kubenswrapper[4973]: E0320 14:03:27.952804 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:03:29 crc kubenswrapper[4973]: I0320 14:03:29.178513 4973 generic.go:334] "Generic (PLEG): container finished" podID="ad27159b-a222-439e-985c-d0164bb4eb21" containerID="9d2ef815beb04154b1047b985505dcee72771f7c711983ff4209544196f20826" exitCode=0
Mar 20 14:03:29 crc kubenswrapper[4973]: I0320 14:03:29.178599 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8" event={"ID":"ad27159b-a222-439e-985c-d0164bb4eb21","Type":"ContainerDied","Data":"9d2ef815beb04154b1047b985505dcee72771f7c711983ff4209544196f20826"}
Mar 20 14:03:30 crc kubenswrapper[4973]: I0320 14:03:30.745303 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8"
Mar 20 14:03:30 crc kubenswrapper[4973]: I0320 14:03:30.874577 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcgbm\" (UniqueName: \"kubernetes.io/projected/ad27159b-a222-439e-985c-d0164bb4eb21-kube-api-access-bcgbm\") pod \"ad27159b-a222-439e-985c-d0164bb4eb21\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") "
Mar 20 14:03:30 crc kubenswrapper[4973]: I0320 14:03:30.874653 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-inventory\") pod \"ad27159b-a222-439e-985c-d0164bb4eb21\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") "
Mar 20 14:03:30 crc kubenswrapper[4973]: I0320 14:03:30.874913 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-neutron-ovn-metadata-agent-neutron-config-0\") pod \"ad27159b-a222-439e-985c-d0164bb4eb21\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") "
Mar 20 14:03:30 crc kubenswrapper[4973]: I0320 14:03:30.874953 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-neutron-metadata-combined-ca-bundle\") pod \"ad27159b-a222-439e-985c-d0164bb4eb21\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") "
Mar 20 14:03:30 crc kubenswrapper[4973]: I0320 14:03:30.875001 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-nova-metadata-neutron-config-0\") pod \"ad27159b-a222-439e-985c-d0164bb4eb21\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") "
Mar
20 14:03:30 crc kubenswrapper[4973]: I0320 14:03:30.875028 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-ssh-key-openstack-edpm-ipam\") pod \"ad27159b-a222-439e-985c-d0164bb4eb21\" (UID: \"ad27159b-a222-439e-985c-d0164bb4eb21\") " Mar 20 14:03:30 crc kubenswrapper[4973]: I0320 14:03:30.881145 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad27159b-a222-439e-985c-d0164bb4eb21-kube-api-access-bcgbm" (OuterVolumeSpecName: "kube-api-access-bcgbm") pod "ad27159b-a222-439e-985c-d0164bb4eb21" (UID: "ad27159b-a222-439e-985c-d0164bb4eb21"). InnerVolumeSpecName "kube-api-access-bcgbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:03:30 crc kubenswrapper[4973]: I0320 14:03:30.882451 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ad27159b-a222-439e-985c-d0164bb4eb21" (UID: "ad27159b-a222-439e-985c-d0164bb4eb21"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:03:30 crc kubenswrapper[4973]: I0320 14:03:30.910404 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "ad27159b-a222-439e-985c-d0164bb4eb21" (UID: "ad27159b-a222-439e-985c-d0164bb4eb21"). InnerVolumeSpecName "nova-metadata-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:03:30 crc kubenswrapper[4973]: I0320 14:03:30.910538 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-inventory" (OuterVolumeSpecName: "inventory") pod "ad27159b-a222-439e-985c-d0164bb4eb21" (UID: "ad27159b-a222-439e-985c-d0164bb4eb21"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:03:30 crc kubenswrapper[4973]: I0320 14:03:30.913735 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ad27159b-a222-439e-985c-d0164bb4eb21" (UID: "ad27159b-a222-439e-985c-d0164bb4eb21"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:03:30 crc kubenswrapper[4973]: I0320 14:03:30.915544 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "ad27159b-a222-439e-985c-d0164bb4eb21" (UID: "ad27159b-a222-439e-985c-d0164bb4eb21"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:03:30 crc kubenswrapper[4973]: I0320 14:03:30.988419 4973 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:30 crc kubenswrapper[4973]: I0320 14:03:30.990140 4973 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:30 crc kubenswrapper[4973]: I0320 14:03:30.990250 4973 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:30 crc kubenswrapper[4973]: I0320 14:03:30.990322 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:30 crc kubenswrapper[4973]: I0320 14:03:30.990421 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcgbm\" (UniqueName: \"kubernetes.io/projected/ad27159b-a222-439e-985c-d0164bb4eb21-kube-api-access-bcgbm\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:30 crc kubenswrapper[4973]: I0320 14:03:30.990493 4973 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ad27159b-a222-439e-985c-d0164bb4eb21-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.205932 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.205842 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8" event={"ID":"ad27159b-a222-439e-985c-d0164bb4eb21","Type":"ContainerDied","Data":"9214c7f05a10ce010cd089eff9aed20c06ea7d6a57f95c72540f951e861ab50e"} Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.219520 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9214c7f05a10ce010cd089eff9aed20c06ea7d6a57f95c72540f951e861ab50e" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.293461 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc"] Mar 20 14:03:31 crc kubenswrapper[4973]: E0320 14:03:31.293997 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad27159b-a222-439e-985c-d0164bb4eb21" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.294016 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad27159b-a222-439e-985c-d0164bb4eb21" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.294263 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad27159b-a222-439e-985c-d0164bb4eb21" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.295105 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.297141 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.298077 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.298155 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.298215 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.298082 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d4wjc" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.307312 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc"] Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.399206 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc\" (UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.399329 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc\" (UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.399533 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc\" (UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.399611 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc\" (UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.399718 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pz46m\" (UniqueName: \"kubernetes.io/projected/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-kube-api-access-pz46m\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc\" (UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.502230 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc\" (UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.502302 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc\" (UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.502369 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc\" (UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.502391 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc\" (UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.502427 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pz46m\" (UniqueName: \"kubernetes.io/projected/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-kube-api-access-pz46m\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc\" (UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.506175 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc\" 
(UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.506205 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc\" (UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.506420 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc\" (UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.507382 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc\" (UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.520886 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pz46m\" (UniqueName: \"kubernetes.io/projected/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-kube-api-access-pz46m\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc\" (UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" Mar 20 14:03:31 crc kubenswrapper[4973]: I0320 14:03:31.625605 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" Mar 20 14:03:32 crc kubenswrapper[4973]: I0320 14:03:32.176476 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc"] Mar 20 14:03:32 crc kubenswrapper[4973]: I0320 14:03:32.223009 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" event={"ID":"3fa3f71f-ea87-4f03-9499-d0c4bea49c04","Type":"ContainerStarted","Data":"610db4148222cb6b7af1fc445b334aa32adaf0ca50e99aff5d4d5512cdf71e15"} Mar 20 14:03:33 crc kubenswrapper[4973]: I0320 14:03:33.234071 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" event={"ID":"3fa3f71f-ea87-4f03-9499-d0c4bea49c04","Type":"ContainerStarted","Data":"753966ef6a8d9767f3f9a3bda8818be8a8b517fadadc14b4a753f94739261abb"} Mar 20 14:03:33 crc kubenswrapper[4973]: I0320 14:03:33.262714 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" podStartSLOduration=1.787946012 podStartE2EDuration="2.262691167s" podCreationTimestamp="2026-03-20 14:03:31 +0000 UTC" firstStartedPulling="2026-03-20 14:03:32.179688834 +0000 UTC m=+2532.923358578" lastFinishedPulling="2026-03-20 14:03:32.654433979 +0000 UTC m=+2533.398103733" observedRunningTime="2026-03-20 14:03:33.248010544 +0000 UTC m=+2533.991680308" watchObservedRunningTime="2026-03-20 14:03:33.262691167 +0000 UTC m=+2534.006360911" Mar 20 14:03:38 crc kubenswrapper[4973]: I0320 14:03:38.951621 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee" Mar 20 14:03:38 crc kubenswrapper[4973]: E0320 14:03:38.952836 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:03:52 crc kubenswrapper[4973]: I0320 14:03:52.950611 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee" Mar 20 14:03:52 crc kubenswrapper[4973]: E0320 14:03:52.951398 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:04:00 crc kubenswrapper[4973]: I0320 14:04:00.187887 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566924-c94kj"] Mar 20 14:04:00 crc kubenswrapper[4973]: I0320 14:04:00.190115 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566924-c94kj" Mar 20 14:04:00 crc kubenswrapper[4973]: I0320 14:04:00.193076 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:04:00 crc kubenswrapper[4973]: I0320 14:04:00.193963 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:04:00 crc kubenswrapper[4973]: I0320 14:04:00.198474 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:04:00 crc kubenswrapper[4973]: I0320 14:04:00.200387 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566924-c94kj"] Mar 20 14:04:00 crc kubenswrapper[4973]: I0320 14:04:00.327172 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cfzk2\" (UniqueName: \"kubernetes.io/projected/3be1f4bf-b422-46df-a681-53453caf47bb-kube-api-access-cfzk2\") pod \"auto-csr-approver-29566924-c94kj\" (UID: \"3be1f4bf-b422-46df-a681-53453caf47bb\") " pod="openshift-infra/auto-csr-approver-29566924-c94kj" Mar 20 14:04:00 crc kubenswrapper[4973]: I0320 14:04:00.429396 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cfzk2\" (UniqueName: \"kubernetes.io/projected/3be1f4bf-b422-46df-a681-53453caf47bb-kube-api-access-cfzk2\") pod \"auto-csr-approver-29566924-c94kj\" (UID: \"3be1f4bf-b422-46df-a681-53453caf47bb\") " pod="openshift-infra/auto-csr-approver-29566924-c94kj" Mar 20 14:04:00 crc kubenswrapper[4973]: I0320 14:04:00.448013 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cfzk2\" (UniqueName: \"kubernetes.io/projected/3be1f4bf-b422-46df-a681-53453caf47bb-kube-api-access-cfzk2\") pod \"auto-csr-approver-29566924-c94kj\" (UID: \"3be1f4bf-b422-46df-a681-53453caf47bb\") " 
pod="openshift-infra/auto-csr-approver-29566924-c94kj" Mar 20 14:04:00 crc kubenswrapper[4973]: I0320 14:04:00.513642 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566924-c94kj" Mar 20 14:04:00 crc kubenswrapper[4973]: I0320 14:04:00.996639 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566924-c94kj"] Mar 20 14:04:01 crc kubenswrapper[4973]: I0320 14:04:01.529257 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566924-c94kj" event={"ID":"3be1f4bf-b422-46df-a681-53453caf47bb","Type":"ContainerStarted","Data":"33e2b56264d26b695ebafdf34b2eda3753d57c1832d405ead57acfa7bcd4e490"} Mar 20 14:04:02 crc kubenswrapper[4973]: I0320 14:04:02.539867 4973 generic.go:334] "Generic (PLEG): container finished" podID="3be1f4bf-b422-46df-a681-53453caf47bb" containerID="81c24dc12fc06cc4d163d004e4cac1e0738a8d150ad590fbe9ad5ee130c08875" exitCode=0 Mar 20 14:04:02 crc kubenswrapper[4973]: I0320 14:04:02.539912 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566924-c94kj" event={"ID":"3be1f4bf-b422-46df-a681-53453caf47bb","Type":"ContainerDied","Data":"81c24dc12fc06cc4d163d004e4cac1e0738a8d150ad590fbe9ad5ee130c08875"} Mar 20 14:04:03 crc kubenswrapper[4973]: I0320 14:04:03.953812 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee" Mar 20 14:04:03 crc kubenswrapper[4973]: E0320 14:04:03.954354 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" 
Mar 20 14:04:03 crc kubenswrapper[4973]: I0320 14:04:03.983605 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566924-c94kj" Mar 20 14:04:04 crc kubenswrapper[4973]: I0320 14:04:04.115793 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfzk2\" (UniqueName: \"kubernetes.io/projected/3be1f4bf-b422-46df-a681-53453caf47bb-kube-api-access-cfzk2\") pod \"3be1f4bf-b422-46df-a681-53453caf47bb\" (UID: \"3be1f4bf-b422-46df-a681-53453caf47bb\") " Mar 20 14:04:04 crc kubenswrapper[4973]: I0320 14:04:04.131841 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3be1f4bf-b422-46df-a681-53453caf47bb-kube-api-access-cfzk2" (OuterVolumeSpecName: "kube-api-access-cfzk2") pod "3be1f4bf-b422-46df-a681-53453caf47bb" (UID: "3be1f4bf-b422-46df-a681-53453caf47bb"). InnerVolumeSpecName "kube-api-access-cfzk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:04:04 crc kubenswrapper[4973]: I0320 14:04:04.220000 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfzk2\" (UniqueName: \"kubernetes.io/projected/3be1f4bf-b422-46df-a681-53453caf47bb-kube-api-access-cfzk2\") on node \"crc\" DevicePath \"\"" Mar 20 14:04:04 crc kubenswrapper[4973]: I0320 14:04:04.565247 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566924-c94kj" event={"ID":"3be1f4bf-b422-46df-a681-53453caf47bb","Type":"ContainerDied","Data":"33e2b56264d26b695ebafdf34b2eda3753d57c1832d405ead57acfa7bcd4e490"} Mar 20 14:04:04 crc kubenswrapper[4973]: I0320 14:04:04.565290 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="33e2b56264d26b695ebafdf34b2eda3753d57c1832d405ead57acfa7bcd4e490" Mar 20 14:04:04 crc kubenswrapper[4973]: I0320 14:04:04.565300 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566924-c94kj" Mar 20 14:04:05 crc kubenswrapper[4973]: I0320 14:04:05.066242 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566918-hj4mt"] Mar 20 14:04:05 crc kubenswrapper[4973]: I0320 14:04:05.078560 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566918-hj4mt"] Mar 20 14:04:05 crc kubenswrapper[4973]: I0320 14:04:05.964562 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f20225fb-dbbe-455f-8125-634ba343eae7" path="/var/lib/kubelet/pods/f20225fb-dbbe-455f-8125-634ba343eae7/volumes" Mar 20 14:04:15 crc kubenswrapper[4973]: I0320 14:04:15.263124 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee" Mar 20 14:04:15 crc kubenswrapper[4973]: E0320 14:04:15.264285 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:04:28 crc kubenswrapper[4973]: I0320 14:04:28.951550 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee" Mar 20 14:04:28 crc kubenswrapper[4973]: E0320 14:04:28.952333 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" 
podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:04:43 crc kubenswrapper[4973]: I0320 14:04:43.951087 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee" Mar 20 14:04:44 crc kubenswrapper[4973]: I0320 14:04:44.709256 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"b01dc0fd273ee1427bf99791c14487bb944cb9cd6409ef35b5ca274d472b4fca"} Mar 20 14:05:01 crc kubenswrapper[4973]: I0320 14:05:01.315452 4973 scope.go:117] "RemoveContainer" containerID="8864e77ac616f05259cc17aefd544daa37850026e094a5d998a441eebfcc8ed9" Mar 20 14:06:00 crc kubenswrapper[4973]: I0320 14:06:00.147886 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566926-g6rlw"] Mar 20 14:06:00 crc kubenswrapper[4973]: E0320 14:06:00.149023 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3be1f4bf-b422-46df-a681-53453caf47bb" containerName="oc" Mar 20 14:06:00 crc kubenswrapper[4973]: I0320 14:06:00.149042 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="3be1f4bf-b422-46df-a681-53453caf47bb" containerName="oc" Mar 20 14:06:00 crc kubenswrapper[4973]: I0320 14:06:00.151622 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="3be1f4bf-b422-46df-a681-53453caf47bb" containerName="oc" Mar 20 14:06:00 crc kubenswrapper[4973]: I0320 14:06:00.152806 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566926-g6rlw" Mar 20 14:06:00 crc kubenswrapper[4973]: I0320 14:06:00.154984 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:06:00 crc kubenswrapper[4973]: I0320 14:06:00.155270 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:06:00 crc kubenswrapper[4973]: I0320 14:06:00.155818 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:06:00 crc kubenswrapper[4973]: I0320 14:06:00.168556 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566926-g6rlw"] Mar 20 14:06:00 crc kubenswrapper[4973]: I0320 14:06:00.328843 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlvz8\" (UniqueName: \"kubernetes.io/projected/da4eb1a8-7a68-4ad6-bc1d-0a316f90845e-kube-api-access-wlvz8\") pod \"auto-csr-approver-29566926-g6rlw\" (UID: \"da4eb1a8-7a68-4ad6-bc1d-0a316f90845e\") " pod="openshift-infra/auto-csr-approver-29566926-g6rlw" Mar 20 14:06:00 crc kubenswrapper[4973]: I0320 14:06:00.431330 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlvz8\" (UniqueName: \"kubernetes.io/projected/da4eb1a8-7a68-4ad6-bc1d-0a316f90845e-kube-api-access-wlvz8\") pod \"auto-csr-approver-29566926-g6rlw\" (UID: \"da4eb1a8-7a68-4ad6-bc1d-0a316f90845e\") " pod="openshift-infra/auto-csr-approver-29566926-g6rlw" Mar 20 14:06:00 crc kubenswrapper[4973]: I0320 14:06:00.453014 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlvz8\" (UniqueName: \"kubernetes.io/projected/da4eb1a8-7a68-4ad6-bc1d-0a316f90845e-kube-api-access-wlvz8\") pod \"auto-csr-approver-29566926-g6rlw\" (UID: \"da4eb1a8-7a68-4ad6-bc1d-0a316f90845e\") " 
pod="openshift-infra/auto-csr-approver-29566926-g6rlw" Mar 20 14:06:00 crc kubenswrapper[4973]: I0320 14:06:00.473066 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566926-g6rlw" Mar 20 14:06:00 crc kubenswrapper[4973]: I0320 14:06:00.960411 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566926-g6rlw"] Mar 20 14:06:01 crc kubenswrapper[4973]: I0320 14:06:01.400155 4973 scope.go:117] "RemoveContainer" containerID="a20440c8066a1e971a8ad2df2402f2070e5af9a4c1ff9abba1b4ee7e5d505fab" Mar 20 14:06:01 crc kubenswrapper[4973]: I0320 14:06:01.426975 4973 scope.go:117] "RemoveContainer" containerID="2f47e887535ec8f4317e6663c17568cc516fefb18728842fcbc21f99bc91e26e" Mar 20 14:06:01 crc kubenswrapper[4973]: I0320 14:06:01.493474 4973 scope.go:117] "RemoveContainer" containerID="ec11a5c492c85996eee861fc6685326cc4e265f58f9e1452853b3f2197e4afb5" Mar 20 14:06:01 crc kubenswrapper[4973]: I0320 14:06:01.535225 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566926-g6rlw" event={"ID":"da4eb1a8-7a68-4ad6-bc1d-0a316f90845e","Type":"ContainerStarted","Data":"32e74b3e3728bfc8603da0e4028a81b27a20a9bbc96f3bd8953c4994ece8e37b"} Mar 20 14:06:02 crc kubenswrapper[4973]: I0320 14:06:02.562768 4973 generic.go:334] "Generic (PLEG): container finished" podID="da4eb1a8-7a68-4ad6-bc1d-0a316f90845e" containerID="9ccefe0a323465cd3ccc7c71536d48863bf442ce88722931b58b6f715192ce18" exitCode=0 Mar 20 14:06:02 crc kubenswrapper[4973]: I0320 14:06:02.563132 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566926-g6rlw" event={"ID":"da4eb1a8-7a68-4ad6-bc1d-0a316f90845e","Type":"ContainerDied","Data":"9ccefe0a323465cd3ccc7c71536d48863bf442ce88722931b58b6f715192ce18"} Mar 20 14:06:03 crc kubenswrapper[4973]: I0320 14:06:03.963326 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566926-g6rlw" Mar 20 14:06:04 crc kubenswrapper[4973]: I0320 14:06:04.120748 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wlvz8\" (UniqueName: \"kubernetes.io/projected/da4eb1a8-7a68-4ad6-bc1d-0a316f90845e-kube-api-access-wlvz8\") pod \"da4eb1a8-7a68-4ad6-bc1d-0a316f90845e\" (UID: \"da4eb1a8-7a68-4ad6-bc1d-0a316f90845e\") " Mar 20 14:06:04 crc kubenswrapper[4973]: I0320 14:06:04.125844 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da4eb1a8-7a68-4ad6-bc1d-0a316f90845e-kube-api-access-wlvz8" (OuterVolumeSpecName: "kube-api-access-wlvz8") pod "da4eb1a8-7a68-4ad6-bc1d-0a316f90845e" (UID: "da4eb1a8-7a68-4ad6-bc1d-0a316f90845e"). InnerVolumeSpecName "kube-api-access-wlvz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:06:04 crc kubenswrapper[4973]: I0320 14:06:04.224566 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wlvz8\" (UniqueName: \"kubernetes.io/projected/da4eb1a8-7a68-4ad6-bc1d-0a316f90845e-kube-api-access-wlvz8\") on node \"crc\" DevicePath \"\"" Mar 20 14:06:04 crc kubenswrapper[4973]: I0320 14:06:04.583737 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566926-g6rlw" event={"ID":"da4eb1a8-7a68-4ad6-bc1d-0a316f90845e","Type":"ContainerDied","Data":"32e74b3e3728bfc8603da0e4028a81b27a20a9bbc96f3bd8953c4994ece8e37b"} Mar 20 14:06:04 crc kubenswrapper[4973]: I0320 14:06:04.583779 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32e74b3e3728bfc8603da0e4028a81b27a20a9bbc96f3bd8953c4994ece8e37b" Mar 20 14:06:04 crc kubenswrapper[4973]: I0320 14:06:04.583803 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566926-g6rlw" Mar 20 14:06:05 crc kubenswrapper[4973]: I0320 14:06:05.038260 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566920-dtwt4"] Mar 20 14:06:05 crc kubenswrapper[4973]: I0320 14:06:05.047680 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566920-dtwt4"] Mar 20 14:06:05 crc kubenswrapper[4973]: I0320 14:06:05.967253 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b" path="/var/lib/kubelet/pods/6a9c0f80-a3b9-45c7-bd8b-de8af58b2a1b/volumes" Mar 20 14:07:01 crc kubenswrapper[4973]: I0320 14:07:01.552915 4973 scope.go:117] "RemoveContainer" containerID="a713851d3065a685db42e5e476ea526be4fccefa9e1391fffa39c5435b796fd5" Mar 20 14:07:13 crc kubenswrapper[4973]: I0320 14:07:13.321124 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:07:13 crc kubenswrapper[4973]: I0320 14:07:13.323058 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:07:31 crc kubenswrapper[4973]: I0320 14:07:31.664980 4973 generic.go:334] "Generic (PLEG): container finished" podID="3fa3f71f-ea87-4f03-9499-d0c4bea49c04" containerID="753966ef6a8d9767f3f9a3bda8818be8a8b517fadadc14b4a753f94739261abb" exitCode=0 Mar 20 14:07:31 crc kubenswrapper[4973]: I0320 14:07:31.665070 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" event={"ID":"3fa3f71f-ea87-4f03-9499-d0c4bea49c04","Type":"ContainerDied","Data":"753966ef6a8d9767f3f9a3bda8818be8a8b517fadadc14b4a753f94739261abb"} Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.207256 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.286236 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-libvirt-secret-0\") pod \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\" (UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.286389 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-ssh-key-openstack-edpm-ipam\") pod \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\" (UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.286666 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-inventory\") pod \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\" (UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.286699 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pz46m\" (UniqueName: \"kubernetes.io/projected/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-kube-api-access-pz46m\") pod \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\" (UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.286877 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-libvirt-combined-ca-bundle\") pod \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\" (UID: \"3fa3f71f-ea87-4f03-9499-d0c4bea49c04\") " Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.294880 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-kube-api-access-pz46m" (OuterVolumeSpecName: "kube-api-access-pz46m") pod "3fa3f71f-ea87-4f03-9499-d0c4bea49c04" (UID: "3fa3f71f-ea87-4f03-9499-d0c4bea49c04"). InnerVolumeSpecName "kube-api-access-pz46m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.296829 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "3fa3f71f-ea87-4f03-9499-d0c4bea49c04" (UID: "3fa3f71f-ea87-4f03-9499-d0c4bea49c04"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.322679 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-inventory" (OuterVolumeSpecName: "inventory") pod "3fa3f71f-ea87-4f03-9499-d0c4bea49c04" (UID: "3fa3f71f-ea87-4f03-9499-d0c4bea49c04"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.334078 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3fa3f71f-ea87-4f03-9499-d0c4bea49c04" (UID: "3fa3f71f-ea87-4f03-9499-d0c4bea49c04"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.345956 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "3fa3f71f-ea87-4f03-9499-d0c4bea49c04" (UID: "3fa3f71f-ea87-4f03-9499-d0c4bea49c04"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.391737 4973 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.392638 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.392660 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pz46m\" (UniqueName: \"kubernetes.io/projected/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-kube-api-access-pz46m\") on node \"crc\" DevicePath \"\"" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.392670 4973 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.392679 4973 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3fa3f71f-ea87-4f03-9499-d0c4bea49c04-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.696006 4973 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" event={"ID":"3fa3f71f-ea87-4f03-9499-d0c4bea49c04","Type":"ContainerDied","Data":"610db4148222cb6b7af1fc445b334aa32adaf0ca50e99aff5d4d5512cdf71e15"} Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.696049 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="610db4148222cb6b7af1fc445b334aa32adaf0ca50e99aff5d4d5512cdf71e15" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.696146 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.789725 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv"] Mar 20 14:07:33 crc kubenswrapper[4973]: E0320 14:07:33.792803 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fa3f71f-ea87-4f03-9499-d0c4bea49c04" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.792837 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fa3f71f-ea87-4f03-9499-d0c4bea49c04" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 14:07:33 crc kubenswrapper[4973]: E0320 14:07:33.792872 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="da4eb1a8-7a68-4ad6-bc1d-0a316f90845e" containerName="oc" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.792881 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="da4eb1a8-7a68-4ad6-bc1d-0a316f90845e" containerName="oc" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.793162 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="3fa3f71f-ea87-4f03-9499-d0c4bea49c04" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.793213 4973 
memory_manager.go:354] "RemoveStaleState removing state" podUID="da4eb1a8-7a68-4ad6-bc1d-0a316f90845e" containerName="oc" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.794234 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.796852 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.797083 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.797305 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.797387 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.797574 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d4wjc" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.797744 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.799827 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.817042 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv"] Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.904361 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.904446 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.904470 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.904527 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.904584 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-0\") pod 
\"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.904604 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.904689 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.905081 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.905502 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:33 
crc kubenswrapper[4973]: I0320 14:07:33.905660 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:33 crc kubenswrapper[4973]: I0320 14:07:33.905690 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhggl\" (UniqueName: \"kubernetes.io/projected/f2cb943f-f811-4eea-b860-1c19a6137dbb-kube-api-access-mhggl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.007538 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.007584 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.007604 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: 
\"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.007717 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.007768 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.007804 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.007824 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhggl\" (UniqueName: \"kubernetes.io/projected/f2cb943f-f811-4eea-b860-1c19a6137dbb-kube-api-access-mhggl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.007872 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.007927 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.007956 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.008105 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.009042 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.013275 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.013863 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.015494 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.015837 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.015913 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.016061 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.016914 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.017218 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.019018 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv"
Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.025084 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhggl\" (UniqueName: \"kubernetes.io/projected/f2cb943f-f811-4eea-b860-1c19a6137dbb-kube-api-access-mhggl\") pod \"nova-edpm-deployment-openstack-edpm-ipam-zrpbv\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv"
Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.124871 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv"
Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.737460 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv"]
Mar 20 14:07:34 crc kubenswrapper[4973]: I0320 14:07:34.748172 4973 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 14:07:35 crc kubenswrapper[4973]: I0320 14:07:35.719684 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" event={"ID":"f2cb943f-f811-4eea-b860-1c19a6137dbb","Type":"ContainerStarted","Data":"9a963f0d17603e82c9076879e437a45a930265c82058514a36fd7edd5886fba7"}
Mar 20 14:07:36 crc kubenswrapper[4973]: I0320 14:07:36.739560 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" event={"ID":"f2cb943f-f811-4eea-b860-1c19a6137dbb","Type":"ContainerStarted","Data":"43f23358c3636e303074d8f658b2311591da53a4d914e31c5dcb43c401c13edf"}
Mar 20 14:07:36 crc kubenswrapper[4973]: I0320 14:07:36.770196 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" podStartSLOduration=2.950335306 podStartE2EDuration="3.770177198s" podCreationTimestamp="2026-03-20 14:07:33 +0000 UTC" firstStartedPulling="2026-03-20 14:07:34.747872655 +0000 UTC m=+2775.491542399" lastFinishedPulling="2026-03-20 14:07:35.567714547 +0000 UTC m=+2776.311384291" observedRunningTime="2026-03-20 14:07:36.767064563 +0000 UTC m=+2777.510734307" watchObservedRunningTime="2026-03-20 14:07:36.770177198 +0000 UTC m=+2777.513846942"
Mar 20 14:07:43 crc kubenswrapper[4973]: I0320 14:07:43.320794 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:07:43 crc kubenswrapper[4973]: I0320 14:07:43.321443 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:08:00 crc kubenswrapper[4973]: I0320 14:08:00.133779 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566928-98vqm"]
Mar 20 14:08:00 crc kubenswrapper[4973]: I0320 14:08:00.136146 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566928-98vqm"
Mar 20 14:08:00 crc kubenswrapper[4973]: I0320 14:08:00.140254 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc"
Mar 20 14:08:00 crc kubenswrapper[4973]: I0320 14:08:00.140670 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:08:00 crc kubenswrapper[4973]: I0320 14:08:00.140867 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:08:00 crc kubenswrapper[4973]: I0320 14:08:00.164894 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566928-98vqm"]
Mar 20 14:08:00 crc kubenswrapper[4973]: I0320 14:08:00.304178 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6fch\" (UniqueName: \"kubernetes.io/projected/53b986db-231a-4811-b651-7f7c242f0b44-kube-api-access-v6fch\") pod \"auto-csr-approver-29566928-98vqm\" (UID: \"53b986db-231a-4811-b651-7f7c242f0b44\") " pod="openshift-infra/auto-csr-approver-29566928-98vqm"
Mar 20 14:08:00 crc kubenswrapper[4973]: I0320 14:08:00.405857 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6fch\" (UniqueName: \"kubernetes.io/projected/53b986db-231a-4811-b651-7f7c242f0b44-kube-api-access-v6fch\") pod \"auto-csr-approver-29566928-98vqm\" (UID: \"53b986db-231a-4811-b651-7f7c242f0b44\") " pod="openshift-infra/auto-csr-approver-29566928-98vqm"
Mar 20 14:08:00 crc kubenswrapper[4973]: I0320 14:08:00.424192 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6fch\" (UniqueName: \"kubernetes.io/projected/53b986db-231a-4811-b651-7f7c242f0b44-kube-api-access-v6fch\") pod \"auto-csr-approver-29566928-98vqm\" (UID: \"53b986db-231a-4811-b651-7f7c242f0b44\") " pod="openshift-infra/auto-csr-approver-29566928-98vqm"
Mar 20 14:08:00 crc kubenswrapper[4973]: I0320 14:08:00.460253 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566928-98vqm"
Mar 20 14:08:00 crc kubenswrapper[4973]: I0320 14:08:00.925776 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566928-98vqm"]
Mar 20 14:08:00 crc kubenswrapper[4973]: I0320 14:08:00.977750 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566928-98vqm" event={"ID":"53b986db-231a-4811-b651-7f7c242f0b44","Type":"ContainerStarted","Data":"1e0f6a2523e0add3ba10694ba7ff239b6653c989ba7a4ba50218260f89fa917e"}
Mar 20 14:08:03 crc kubenswrapper[4973]: I0320 14:08:03.005643 4973 generic.go:334] "Generic (PLEG): container finished" podID="53b986db-231a-4811-b651-7f7c242f0b44" containerID="5e5c94cebc85c407bb033bcb272b3d87e43ce6f2747036ba81692afbb5fc42cb" exitCode=0
Mar 20 14:08:03 crc kubenswrapper[4973]: I0320 14:08:03.005712 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566928-98vqm" event={"ID":"53b986db-231a-4811-b651-7f7c242f0b44","Type":"ContainerDied","Data":"5e5c94cebc85c407bb033bcb272b3d87e43ce6f2747036ba81692afbb5fc42cb"}
Mar 20 14:08:04 crc kubenswrapper[4973]: I0320 14:08:04.433140 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566928-98vqm"
Mar 20 14:08:04 crc kubenswrapper[4973]: I0320 14:08:04.612806 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6fch\" (UniqueName: \"kubernetes.io/projected/53b986db-231a-4811-b651-7f7c242f0b44-kube-api-access-v6fch\") pod \"53b986db-231a-4811-b651-7f7c242f0b44\" (UID: \"53b986db-231a-4811-b651-7f7c242f0b44\") "
Mar 20 14:08:04 crc kubenswrapper[4973]: I0320 14:08:04.622266 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53b986db-231a-4811-b651-7f7c242f0b44-kube-api-access-v6fch" (OuterVolumeSpecName: "kube-api-access-v6fch") pod "53b986db-231a-4811-b651-7f7c242f0b44" (UID: "53b986db-231a-4811-b651-7f7c242f0b44"). InnerVolumeSpecName "kube-api-access-v6fch". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:08:04 crc kubenswrapper[4973]: I0320 14:08:04.715860 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6fch\" (UniqueName: \"kubernetes.io/projected/53b986db-231a-4811-b651-7f7c242f0b44-kube-api-access-v6fch\") on node \"crc\" DevicePath \"\""
Mar 20 14:08:05 crc kubenswrapper[4973]: I0320 14:08:05.033738 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566928-98vqm" event={"ID":"53b986db-231a-4811-b651-7f7c242f0b44","Type":"ContainerDied","Data":"1e0f6a2523e0add3ba10694ba7ff239b6653c989ba7a4ba50218260f89fa917e"}
Mar 20 14:08:05 crc kubenswrapper[4973]: I0320 14:08:05.033794 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1e0f6a2523e0add3ba10694ba7ff239b6653c989ba7a4ba50218260f89fa917e"
Mar 20 14:08:05 crc kubenswrapper[4973]: I0320 14:08:05.033810 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566928-98vqm"
Mar 20 14:08:05 crc kubenswrapper[4973]: I0320 14:08:05.501943 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566922-qjbt8"]
Mar 20 14:08:05 crc kubenswrapper[4973]: I0320 14:08:05.513818 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566922-qjbt8"]
Mar 20 14:08:05 crc kubenswrapper[4973]: I0320 14:08:05.963846 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f036cf8b-0120-4173-baea-19e177ab6734" path="/var/lib/kubelet/pods/f036cf8b-0120-4173-baea-19e177ab6734/volumes"
Mar 20 14:08:11 crc kubenswrapper[4973]: I0320 14:08:11.765059 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ttxz7"]
Mar 20 14:08:11 crc kubenswrapper[4973]: E0320 14:08:11.766303 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53b986db-231a-4811-b651-7f7c242f0b44" containerName="oc"
Mar 20 14:08:11 crc kubenswrapper[4973]: I0320 14:08:11.766346 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="53b986db-231a-4811-b651-7f7c242f0b44" containerName="oc"
Mar 20 14:08:11 crc kubenswrapper[4973]: I0320 14:08:11.766592 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="53b986db-231a-4811-b651-7f7c242f0b44" containerName="oc"
Mar 20 14:08:11 crc kubenswrapper[4973]: I0320 14:08:11.768436 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttxz7"
Mar 20 14:08:11 crc kubenswrapper[4973]: I0320 14:08:11.813804 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ttxz7"]
Mar 20 14:08:11 crc kubenswrapper[4973]: I0320 14:08:11.896155 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a9caed-83eb-46b2-a11d-536eb18eb232-catalog-content\") pod \"redhat-operators-ttxz7\" (UID: \"b7a9caed-83eb-46b2-a11d-536eb18eb232\") " pod="openshift-marketplace/redhat-operators-ttxz7"
Mar 20 14:08:11 crc kubenswrapper[4973]: I0320 14:08:11.896253 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a9caed-83eb-46b2-a11d-536eb18eb232-utilities\") pod \"redhat-operators-ttxz7\" (UID: \"b7a9caed-83eb-46b2-a11d-536eb18eb232\") " pod="openshift-marketplace/redhat-operators-ttxz7"
Mar 20 14:08:11 crc kubenswrapper[4973]: I0320 14:08:11.896787 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smjf4\" (UniqueName: \"kubernetes.io/projected/b7a9caed-83eb-46b2-a11d-536eb18eb232-kube-api-access-smjf4\") pod \"redhat-operators-ttxz7\" (UID: \"b7a9caed-83eb-46b2-a11d-536eb18eb232\") " pod="openshift-marketplace/redhat-operators-ttxz7"
Mar 20 14:08:11 crc kubenswrapper[4973]: I0320 14:08:11.999747 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a9caed-83eb-46b2-a11d-536eb18eb232-catalog-content\") pod \"redhat-operators-ttxz7\" (UID: \"b7a9caed-83eb-46b2-a11d-536eb18eb232\") " pod="openshift-marketplace/redhat-operators-ttxz7"
Mar 20 14:08:12 crc kubenswrapper[4973]: I0320 14:08:11.999866 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a9caed-83eb-46b2-a11d-536eb18eb232-utilities\") pod \"redhat-operators-ttxz7\" (UID: \"b7a9caed-83eb-46b2-a11d-536eb18eb232\") " pod="openshift-marketplace/redhat-operators-ttxz7"
Mar 20 14:08:12 crc kubenswrapper[4973]: I0320 14:08:11.999979 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smjf4\" (UniqueName: \"kubernetes.io/projected/b7a9caed-83eb-46b2-a11d-536eb18eb232-kube-api-access-smjf4\") pod \"redhat-operators-ttxz7\" (UID: \"b7a9caed-83eb-46b2-a11d-536eb18eb232\") " pod="openshift-marketplace/redhat-operators-ttxz7"
Mar 20 14:08:12 crc kubenswrapper[4973]: I0320 14:08:12.000322 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a9caed-83eb-46b2-a11d-536eb18eb232-catalog-content\") pod \"redhat-operators-ttxz7\" (UID: \"b7a9caed-83eb-46b2-a11d-536eb18eb232\") " pod="openshift-marketplace/redhat-operators-ttxz7"
Mar 20 14:08:12 crc kubenswrapper[4973]: I0320 14:08:12.000413 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a9caed-83eb-46b2-a11d-536eb18eb232-utilities\") pod \"redhat-operators-ttxz7\" (UID: \"b7a9caed-83eb-46b2-a11d-536eb18eb232\") " pod="openshift-marketplace/redhat-operators-ttxz7"
Mar 20 14:08:12 crc kubenswrapper[4973]: I0320 14:08:12.034297 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smjf4\" (UniqueName: \"kubernetes.io/projected/b7a9caed-83eb-46b2-a11d-536eb18eb232-kube-api-access-smjf4\") pod \"redhat-operators-ttxz7\" (UID: \"b7a9caed-83eb-46b2-a11d-536eb18eb232\") " pod="openshift-marketplace/redhat-operators-ttxz7"
Mar 20 14:08:12 crc kubenswrapper[4973]: I0320 14:08:12.095841 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttxz7"
Mar 20 14:08:12 crc kubenswrapper[4973]: I0320 14:08:12.574957 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ttxz7"]
Mar 20 14:08:13 crc kubenswrapper[4973]: I0320 14:08:13.121663 4973 generic.go:334] "Generic (PLEG): container finished" podID="b7a9caed-83eb-46b2-a11d-536eb18eb232" containerID="440a173e00c2580f694b75e71245cbdc7b876b1e7cb7989c9d15cc571d07cc5c" exitCode=0
Mar 20 14:08:13 crc kubenswrapper[4973]: I0320 14:08:13.121744 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttxz7" event={"ID":"b7a9caed-83eb-46b2-a11d-536eb18eb232","Type":"ContainerDied","Data":"440a173e00c2580f694b75e71245cbdc7b876b1e7cb7989c9d15cc571d07cc5c"}
Mar 20 14:08:13 crc kubenswrapper[4973]: I0320 14:08:13.121964 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttxz7" event={"ID":"b7a9caed-83eb-46b2-a11d-536eb18eb232","Type":"ContainerStarted","Data":"d2c6ad79757c7ade9c9a039b88dd53793b60f0e3139650899d1bc7a4221ba69e"}
Mar 20 14:08:13 crc kubenswrapper[4973]: I0320 14:08:13.320939 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:08:13 crc kubenswrapper[4973]: I0320 14:08:13.321315 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:08:13 crc kubenswrapper[4973]: I0320 14:08:13.321385 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx"
Mar 20 14:08:13 crc kubenswrapper[4973]: I0320 14:08:13.322447 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b01dc0fd273ee1427bf99791c14487bb944cb9cd6409ef35b5ca274d472b4fca"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 14:08:13 crc kubenswrapper[4973]: I0320 14:08:13.322513 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" containerID="cri-o://b01dc0fd273ee1427bf99791c14487bb944cb9cd6409ef35b5ca274d472b4fca" gracePeriod=600
Mar 20 14:08:14 crc kubenswrapper[4973]: I0320 14:08:14.136154 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttxz7" event={"ID":"b7a9caed-83eb-46b2-a11d-536eb18eb232","Type":"ContainerStarted","Data":"2faa98c29afae29c0c7772a1318a177a61ab5bf8df42f1185bd104d6d8b3f9eb"}
Mar 20 14:08:14 crc kubenswrapper[4973]: I0320 14:08:14.141563 4973 generic.go:334] "Generic (PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="b01dc0fd273ee1427bf99791c14487bb944cb9cd6409ef35b5ca274d472b4fca" exitCode=0
Mar 20 14:08:14 crc kubenswrapper[4973]: I0320 14:08:14.141616 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"b01dc0fd273ee1427bf99791c14487bb944cb9cd6409ef35b5ca274d472b4fca"}
Mar 20 14:08:14 crc kubenswrapper[4973]: I0320 14:08:14.141646 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c"}
Mar 20 14:08:14 crc kubenswrapper[4973]: I0320 14:08:14.141662 4973 scope.go:117] "RemoveContainer" containerID="8621baf726e08c2ed687221b60bfad1f4fa992e7126eeb0e0f24894206d0a1ee"
Mar 20 14:08:20 crc kubenswrapper[4973]: I0320 14:08:20.213536 4973 generic.go:334] "Generic (PLEG): container finished" podID="b7a9caed-83eb-46b2-a11d-536eb18eb232" containerID="2faa98c29afae29c0c7772a1318a177a61ab5bf8df42f1185bd104d6d8b3f9eb" exitCode=0
Mar 20 14:08:20 crc kubenswrapper[4973]: I0320 14:08:20.213638 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttxz7" event={"ID":"b7a9caed-83eb-46b2-a11d-536eb18eb232","Type":"ContainerDied","Data":"2faa98c29afae29c0c7772a1318a177a61ab5bf8df42f1185bd104d6d8b3f9eb"}
Mar 20 14:08:21 crc kubenswrapper[4973]: I0320 14:08:21.229540 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttxz7" event={"ID":"b7a9caed-83eb-46b2-a11d-536eb18eb232","Type":"ContainerStarted","Data":"4e3d3b27b06eb501de684907f02171e3b067ab250a67056a0c9c9916ec68abe3"}
Mar 20 14:08:21 crc kubenswrapper[4973]: I0320 14:08:21.253170 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ttxz7" podStartSLOduration=2.464689762 podStartE2EDuration="10.253152418s" podCreationTimestamp="2026-03-20 14:08:11 +0000 UTC" firstStartedPulling="2026-03-20 14:08:13.124220272 +0000 UTC m=+2813.867890016" lastFinishedPulling="2026-03-20 14:08:20.912682928 +0000 UTC m=+2821.656352672" observedRunningTime="2026-03-20 14:08:21.245506118 +0000 UTC m=+2821.989175862" watchObservedRunningTime="2026-03-20 14:08:21.253152418 +0000 UTC m=+2821.996822162"
Mar 20 14:08:22 crc kubenswrapper[4973]: I0320 14:08:22.096273 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ttxz7"
Mar 20 14:08:22 crc kubenswrapper[4973]: I0320 14:08:22.096787 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ttxz7"
Mar 20 14:08:23 crc kubenswrapper[4973]: I0320 14:08:23.149115 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ttxz7" podUID="b7a9caed-83eb-46b2-a11d-536eb18eb232" containerName="registry-server" probeResult="failure" output=<
Mar 20 14:08:23 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s
Mar 20 14:08:23 crc kubenswrapper[4973]: >
Mar 20 14:08:33 crc kubenswrapper[4973]: I0320 14:08:33.143231 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ttxz7" podUID="b7a9caed-83eb-46b2-a11d-536eb18eb232" containerName="registry-server" probeResult="failure" output=<
Mar 20 14:08:33 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s
Mar 20 14:08:33 crc kubenswrapper[4973]: >
Mar 20 14:08:43 crc kubenswrapper[4973]: I0320 14:08:43.162849 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ttxz7" podUID="b7a9caed-83eb-46b2-a11d-536eb18eb232" containerName="registry-server" probeResult="failure" output=<
Mar 20 14:08:43 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s
Mar 20 14:08:43 crc kubenswrapper[4973]: >
Mar 20 14:08:52 crc kubenswrapper[4973]: I0320 14:08:52.147232 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ttxz7"
Mar 20 14:08:52 crc kubenswrapper[4973]: I0320 14:08:52.220015 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ttxz7"
Mar 20 14:08:52 crc kubenswrapper[4973]: I0320 14:08:52.400899 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ttxz7"]
Mar 20 14:08:53 crc kubenswrapper[4973]: I0320 14:08:53.579948 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ttxz7" podUID="b7a9caed-83eb-46b2-a11d-536eb18eb232" containerName="registry-server" containerID="cri-o://4e3d3b27b06eb501de684907f02171e3b067ab250a67056a0c9c9916ec68abe3" gracePeriod=2
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.150320 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttxz7"
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.268271 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a9caed-83eb-46b2-a11d-536eb18eb232-catalog-content\") pod \"b7a9caed-83eb-46b2-a11d-536eb18eb232\" (UID: \"b7a9caed-83eb-46b2-a11d-536eb18eb232\") "
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.268443 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a9caed-83eb-46b2-a11d-536eb18eb232-utilities\") pod \"b7a9caed-83eb-46b2-a11d-536eb18eb232\" (UID: \"b7a9caed-83eb-46b2-a11d-536eb18eb232\") "
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.268615 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smjf4\" (UniqueName: \"kubernetes.io/projected/b7a9caed-83eb-46b2-a11d-536eb18eb232-kube-api-access-smjf4\") pod \"b7a9caed-83eb-46b2-a11d-536eb18eb232\" (UID: \"b7a9caed-83eb-46b2-a11d-536eb18eb232\") "
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.269566 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a9caed-83eb-46b2-a11d-536eb18eb232-utilities" (OuterVolumeSpecName: "utilities") pod "b7a9caed-83eb-46b2-a11d-536eb18eb232" (UID: "b7a9caed-83eb-46b2-a11d-536eb18eb232"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.269973 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b7a9caed-83eb-46b2-a11d-536eb18eb232-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.277636 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7a9caed-83eb-46b2-a11d-536eb18eb232-kube-api-access-smjf4" (OuterVolumeSpecName: "kube-api-access-smjf4") pod "b7a9caed-83eb-46b2-a11d-536eb18eb232" (UID: "b7a9caed-83eb-46b2-a11d-536eb18eb232"). InnerVolumeSpecName "kube-api-access-smjf4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.376246 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smjf4\" (UniqueName: \"kubernetes.io/projected/b7a9caed-83eb-46b2-a11d-536eb18eb232-kube-api-access-smjf4\") on node \"crc\" DevicePath \"\""
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.421473 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a9caed-83eb-46b2-a11d-536eb18eb232-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b7a9caed-83eb-46b2-a11d-536eb18eb232" (UID: "b7a9caed-83eb-46b2-a11d-536eb18eb232"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.477794 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b7a9caed-83eb-46b2-a11d-536eb18eb232-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.592749 4973 generic.go:334] "Generic (PLEG): container finished" podID="b7a9caed-83eb-46b2-a11d-536eb18eb232" containerID="4e3d3b27b06eb501de684907f02171e3b067ab250a67056a0c9c9916ec68abe3" exitCode=0
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.592799 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttxz7" event={"ID":"b7a9caed-83eb-46b2-a11d-536eb18eb232","Type":"ContainerDied","Data":"4e3d3b27b06eb501de684907f02171e3b067ab250a67056a0c9c9916ec68abe3"}
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.592862 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ttxz7" event={"ID":"b7a9caed-83eb-46b2-a11d-536eb18eb232","Type":"ContainerDied","Data":"d2c6ad79757c7ade9c9a039b88dd53793b60f0e3139650899d1bc7a4221ba69e"}
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.592886 4973 scope.go:117] "RemoveContainer" containerID="4e3d3b27b06eb501de684907f02171e3b067ab250a67056a0c9c9916ec68abe3"
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.592897 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ttxz7"
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.634095 4973 scope.go:117] "RemoveContainer" containerID="2faa98c29afae29c0c7772a1318a177a61ab5bf8df42f1185bd104d6d8b3f9eb"
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.638010 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ttxz7"]
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.649250 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ttxz7"]
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.665613 4973 scope.go:117] "RemoveContainer" containerID="440a173e00c2580f694b75e71245cbdc7b876b1e7cb7989c9d15cc571d07cc5c"
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.714012 4973 scope.go:117] "RemoveContainer" containerID="4e3d3b27b06eb501de684907f02171e3b067ab250a67056a0c9c9916ec68abe3"
Mar 20 14:08:54 crc kubenswrapper[4973]: E0320 14:08:54.714517 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e3d3b27b06eb501de684907f02171e3b067ab250a67056a0c9c9916ec68abe3\": container with ID starting with 4e3d3b27b06eb501de684907f02171e3b067ab250a67056a0c9c9916ec68abe3 not found: ID does not exist" containerID="4e3d3b27b06eb501de684907f02171e3b067ab250a67056a0c9c9916ec68abe3"
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.714551 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e3d3b27b06eb501de684907f02171e3b067ab250a67056a0c9c9916ec68abe3"} err="failed to get container status \"4e3d3b27b06eb501de684907f02171e3b067ab250a67056a0c9c9916ec68abe3\": rpc error: code = NotFound desc = could not find container \"4e3d3b27b06eb501de684907f02171e3b067ab250a67056a0c9c9916ec68abe3\": container with ID starting with 4e3d3b27b06eb501de684907f02171e3b067ab250a67056a0c9c9916ec68abe3 not found: ID does not exist"
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.714571 4973 scope.go:117] "RemoveContainer" containerID="2faa98c29afae29c0c7772a1318a177a61ab5bf8df42f1185bd104d6d8b3f9eb"
Mar 20 14:08:54 crc kubenswrapper[4973]: E0320 14:08:54.714886 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2faa98c29afae29c0c7772a1318a177a61ab5bf8df42f1185bd104d6d8b3f9eb\": container with ID starting with 2faa98c29afae29c0c7772a1318a177a61ab5bf8df42f1185bd104d6d8b3f9eb not found: ID does not exist" containerID="2faa98c29afae29c0c7772a1318a177a61ab5bf8df42f1185bd104d6d8b3f9eb"
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.714903 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2faa98c29afae29c0c7772a1318a177a61ab5bf8df42f1185bd104d6d8b3f9eb"} err="failed to get container status \"2faa98c29afae29c0c7772a1318a177a61ab5bf8df42f1185bd104d6d8b3f9eb\": rpc error: code = NotFound desc = could not find container \"2faa98c29afae29c0c7772a1318a177a61ab5bf8df42f1185bd104d6d8b3f9eb\": container with ID starting with 2faa98c29afae29c0c7772a1318a177a61ab5bf8df42f1185bd104d6d8b3f9eb not found: ID does not exist"
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.714915 4973 scope.go:117] "RemoveContainer" containerID="440a173e00c2580f694b75e71245cbdc7b876b1e7cb7989c9d15cc571d07cc5c"
Mar 20 14:08:54 crc kubenswrapper[4973]: E0320 14:08:54.715154 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"440a173e00c2580f694b75e71245cbdc7b876b1e7cb7989c9d15cc571d07cc5c\": container with ID starting with 440a173e00c2580f694b75e71245cbdc7b876b1e7cb7989c9d15cc571d07cc5c not found: ID does not exist" containerID="440a173e00c2580f694b75e71245cbdc7b876b1e7cb7989c9d15cc571d07cc5c"
Mar 20 14:08:54 crc kubenswrapper[4973]: I0320 14:08:54.715199 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"440a173e00c2580f694b75e71245cbdc7b876b1e7cb7989c9d15cc571d07cc5c"} err="failed to get container status \"440a173e00c2580f694b75e71245cbdc7b876b1e7cb7989c9d15cc571d07cc5c\": rpc error: code = NotFound desc = could not find container \"440a173e00c2580f694b75e71245cbdc7b876b1e7cb7989c9d15cc571d07cc5c\": container with ID starting with 440a173e00c2580f694b75e71245cbdc7b876b1e7cb7989c9d15cc571d07cc5c not found: ID does not exist"
Mar 20 14:08:55 crc kubenswrapper[4973]: I0320 14:08:55.964316 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7a9caed-83eb-46b2-a11d-536eb18eb232" path="/var/lib/kubelet/pods/b7a9caed-83eb-46b2-a11d-536eb18eb232/volumes"
Mar 20 14:09:01 crc kubenswrapper[4973]: I0320 14:09:01.678811 4973 scope.go:117] "RemoveContainer" containerID="7278b8d366e9bcde8f393f319abec9ef6e5ca3584b3bf0375b95042c2a3290b1"
Mar 20 14:09:39 crc kubenswrapper[4973]: I0320 14:09:39.853377 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vm7fr"]
Mar 20 14:09:39 crc kubenswrapper[4973]: E0320 14:09:39.854686 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a9caed-83eb-46b2-a11d-536eb18eb232" containerName="extract-content"
Mar 20 14:09:39 crc kubenswrapper[4973]: I0320 14:09:39.854704 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a9caed-83eb-46b2-a11d-536eb18eb232" containerName="extract-content"
Mar 20 14:09:39 crc kubenswrapper[4973]: E0320 14:09:39.854740 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a9caed-83eb-46b2-a11d-536eb18eb232" containerName="registry-server"
Mar 20 14:09:39 crc kubenswrapper[4973]: I0320 14:09:39.854749 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a9caed-83eb-46b2-a11d-536eb18eb232" containerName="registry-server"
Mar 20 14:09:39 crc kubenswrapper[4973]: E0320 14:09:39.854760 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a9caed-83eb-46b2-a11d-536eb18eb232" containerName="extract-utilities"
Mar 20 14:09:39 crc kubenswrapper[4973]: I0320 14:09:39.854768 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a9caed-83eb-46b2-a11d-536eb18eb232" containerName="extract-utilities"
Mar 20 14:09:39 crc kubenswrapper[4973]: I0320 14:09:39.855078 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7a9caed-83eb-46b2-a11d-536eb18eb232" containerName="registry-server"
Mar 20 14:09:39 crc kubenswrapper[4973]: I0320 14:09:39.856870 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vm7fr"
Mar 20 14:09:39 crc kubenswrapper[4973]: I0320 14:09:39.879004 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vm7fr"]
Mar 20 14:09:39 crc kubenswrapper[4973]: I0320 14:09:39.888708 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s86gl\" (UniqueName: \"kubernetes.io/projected/1bf4e3dc-1619-4352-a116-cee4edacbc88-kube-api-access-s86gl\") pod \"redhat-marketplace-vm7fr\" (UID: \"1bf4e3dc-1619-4352-a116-cee4edacbc88\") " pod="openshift-marketplace/redhat-marketplace-vm7fr"
Mar 20 14:09:39 crc kubenswrapper[4973]: I0320 14:09:39.888784 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bf4e3dc-1619-4352-a116-cee4edacbc88-catalog-content\") pod \"redhat-marketplace-vm7fr\" (UID: \"1bf4e3dc-1619-4352-a116-cee4edacbc88\") " pod="openshift-marketplace/redhat-marketplace-vm7fr"
Mar 20 14:09:39 crc kubenswrapper[4973]: I0320 14:09:39.889018 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bf4e3dc-1619-4352-a116-cee4edacbc88-utilities\") pod \"redhat-marketplace-vm7fr\" (UID: \"1bf4e3dc-1619-4352-a116-cee4edacbc88\") " pod="openshift-marketplace/redhat-marketplace-vm7fr"
Mar 20 14:09:39 crc kubenswrapper[4973]: I0320 14:09:39.998928 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bf4e3dc-1619-4352-a116-cee4edacbc88-utilities\") pod \"redhat-marketplace-vm7fr\" (UID: \"1bf4e3dc-1619-4352-a116-cee4edacbc88\") " pod="openshift-marketplace/redhat-marketplace-vm7fr"
Mar 20 14:09:39 crc kubenswrapper[4973]: I0320 14:09:39.999163 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s86gl\" (UniqueName: \"kubernetes.io/projected/1bf4e3dc-1619-4352-a116-cee4edacbc88-kube-api-access-s86gl\") pod \"redhat-marketplace-vm7fr\" (UID: \"1bf4e3dc-1619-4352-a116-cee4edacbc88\") " pod="openshift-marketplace/redhat-marketplace-vm7fr"
Mar 20 14:09:39 crc kubenswrapper[4973]: I0320 14:09:39.999208 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bf4e3dc-1619-4352-a116-cee4edacbc88-catalog-content\") pod \"redhat-marketplace-vm7fr\" (UID: \"1bf4e3dc-1619-4352-a116-cee4edacbc88\") " pod="openshift-marketplace/redhat-marketplace-vm7fr"
Mar 20 14:09:40 crc kubenswrapper[4973]: I0320 14:09:40.000238 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bf4e3dc-1619-4352-a116-cee4edacbc88-catalog-content\") pod \"redhat-marketplace-vm7fr\" (UID: \"1bf4e3dc-1619-4352-a116-cee4edacbc88\") " pod="openshift-marketplace/redhat-marketplace-vm7fr"
Mar 20 14:09:40 crc kubenswrapper[4973]: I0320 14:09:40.000830 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bf4e3dc-1619-4352-a116-cee4edacbc88-utilities\") pod \"redhat-marketplace-vm7fr\" (UID: \"1bf4e3dc-1619-4352-a116-cee4edacbc88\") " pod="openshift-marketplace/redhat-marketplace-vm7fr"
Mar 20 14:09:40 crc kubenswrapper[4973]: I0320 14:09:40.022933 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s86gl\" (UniqueName: \"kubernetes.io/projected/1bf4e3dc-1619-4352-a116-cee4edacbc88-kube-api-access-s86gl\") pod \"redhat-marketplace-vm7fr\" (UID: \"1bf4e3dc-1619-4352-a116-cee4edacbc88\") " pod="openshift-marketplace/redhat-marketplace-vm7fr"
Mar 20 14:09:40 crc kubenswrapper[4973]: I0320 14:09:40.194823 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vm7fr"
Mar 20 14:09:40 crc kubenswrapper[4973]: I0320 14:09:40.570288 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vm7fr"]
Mar 20 14:09:41 crc kubenswrapper[4973]: I0320 14:09:41.313212 4973 generic.go:334] "Generic (PLEG): container finished" podID="1bf4e3dc-1619-4352-a116-cee4edacbc88" containerID="fafb92d5eea3e30c92f99fdc5b5f9ee1015d625c4fd413538e5037041c7a6b86" exitCode=0
Mar 20 14:09:41 crc kubenswrapper[4973]: I0320 14:09:41.313411 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vm7fr" event={"ID":"1bf4e3dc-1619-4352-a116-cee4edacbc88","Type":"ContainerDied","Data":"fafb92d5eea3e30c92f99fdc5b5f9ee1015d625c4fd413538e5037041c7a6b86"}
Mar 20 14:09:41 crc kubenswrapper[4973]: I0320 14:09:41.313564 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vm7fr" event={"ID":"1bf4e3dc-1619-4352-a116-cee4edacbc88","Type":"ContainerStarted","Data":"75a560014f0b09964cd1d2a1de024e41b7951fd37f686c79b3d71ad96e0e5c21"}
Mar 20 14:09:42 crc kubenswrapper[4973]: I0320 14:09:42.326143 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vm7fr" event={"ID":"1bf4e3dc-1619-4352-a116-cee4edacbc88","Type":"ContainerStarted","Data":"e3481bcc66bfa95b406444005b72f475af949b15b8b5655bf911392ddb73a55a"}
Mar 20 14:09:43 crc kubenswrapper[4973]: I0320 14:09:43.340072 4973 generic.go:334] "Generic (PLEG): container finished" podID="1bf4e3dc-1619-4352-a116-cee4edacbc88" containerID="e3481bcc66bfa95b406444005b72f475af949b15b8b5655bf911392ddb73a55a" exitCode=0
Mar 20 14:09:43 crc kubenswrapper[4973]: I0320 14:09:43.340147 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vm7fr" event={"ID":"1bf4e3dc-1619-4352-a116-cee4edacbc88","Type":"ContainerDied","Data":"e3481bcc66bfa95b406444005b72f475af949b15b8b5655bf911392ddb73a55a"}
Mar 20 14:09:44 crc kubenswrapper[4973]: I0320 14:09:44.352521 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vm7fr" event={"ID":"1bf4e3dc-1619-4352-a116-cee4edacbc88","Type":"ContainerStarted","Data":"ed47c28c170f4e88aae6bae804fe50334e5bf1f61f04d1ccc9048eb61632c1fb"}
Mar 20 14:09:44 crc kubenswrapper[4973]: I0320 14:09:44.378768 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vm7fr" podStartSLOduration=2.655170838 podStartE2EDuration="5.378731859s" podCreationTimestamp="2026-03-20 14:09:39 +0000 UTC" firstStartedPulling="2026-03-20 14:09:41.317193946 +0000 UTC m=+2902.060863690" lastFinishedPulling="2026-03-20 14:09:44.040754967 +0000 UTC m=+2904.784424711" observedRunningTime="2026-03-20 14:09:44.373796232 +0000 UTC m=+2905.117465976" watchObservedRunningTime="2026-03-20 14:09:44.378731859 +0000 UTC m=+2905.122401603"
Mar 20 14:09:46 crc kubenswrapper[4973]: I0320 14:09:46.227841 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8k2cf"]
Mar 20 14:09:46 crc kubenswrapper[4973]: I0320 14:09:46.232719 4973 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-marketplace/certified-operators-8k2cf" Mar 20 14:09:46 crc kubenswrapper[4973]: I0320 14:09:46.241413 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8k2cf"] Mar 20 14:09:46 crc kubenswrapper[4973]: I0320 14:09:46.357753 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6087bd2-bcf1-475f-9aab-7774295e730e-utilities\") pod \"certified-operators-8k2cf\" (UID: \"c6087bd2-bcf1-475f-9aab-7774295e730e\") " pod="openshift-marketplace/certified-operators-8k2cf" Mar 20 14:09:46 crc kubenswrapper[4973]: I0320 14:09:46.357845 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6087bd2-bcf1-475f-9aab-7774295e730e-catalog-content\") pod \"certified-operators-8k2cf\" (UID: \"c6087bd2-bcf1-475f-9aab-7774295e730e\") " pod="openshift-marketplace/certified-operators-8k2cf" Mar 20 14:09:46 crc kubenswrapper[4973]: I0320 14:09:46.357927 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvhtf\" (UniqueName: \"kubernetes.io/projected/c6087bd2-bcf1-475f-9aab-7774295e730e-kube-api-access-rvhtf\") pod \"certified-operators-8k2cf\" (UID: \"c6087bd2-bcf1-475f-9aab-7774295e730e\") " pod="openshift-marketplace/certified-operators-8k2cf" Mar 20 14:09:46 crc kubenswrapper[4973]: I0320 14:09:46.459705 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6087bd2-bcf1-475f-9aab-7774295e730e-utilities\") pod \"certified-operators-8k2cf\" (UID: \"c6087bd2-bcf1-475f-9aab-7774295e730e\") " pod="openshift-marketplace/certified-operators-8k2cf" Mar 20 14:09:46 crc kubenswrapper[4973]: I0320 14:09:46.459812 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6087bd2-bcf1-475f-9aab-7774295e730e-catalog-content\") pod \"certified-operators-8k2cf\" (UID: \"c6087bd2-bcf1-475f-9aab-7774295e730e\") " pod="openshift-marketplace/certified-operators-8k2cf" Mar 20 14:09:46 crc kubenswrapper[4973]: I0320 14:09:46.459904 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvhtf\" (UniqueName: \"kubernetes.io/projected/c6087bd2-bcf1-475f-9aab-7774295e730e-kube-api-access-rvhtf\") pod \"certified-operators-8k2cf\" (UID: \"c6087bd2-bcf1-475f-9aab-7774295e730e\") " pod="openshift-marketplace/certified-operators-8k2cf" Mar 20 14:09:46 crc kubenswrapper[4973]: I0320 14:09:46.460222 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6087bd2-bcf1-475f-9aab-7774295e730e-utilities\") pod \"certified-operators-8k2cf\" (UID: \"c6087bd2-bcf1-475f-9aab-7774295e730e\") " pod="openshift-marketplace/certified-operators-8k2cf" Mar 20 14:09:46 crc kubenswrapper[4973]: I0320 14:09:46.460279 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6087bd2-bcf1-475f-9aab-7774295e730e-catalog-content\") pod \"certified-operators-8k2cf\" (UID: \"c6087bd2-bcf1-475f-9aab-7774295e730e\") " pod="openshift-marketplace/certified-operators-8k2cf" Mar 20 14:09:46 crc kubenswrapper[4973]: I0320 14:09:46.489398 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvhtf\" (UniqueName: \"kubernetes.io/projected/c6087bd2-bcf1-475f-9aab-7774295e730e-kube-api-access-rvhtf\") pod \"certified-operators-8k2cf\" (UID: \"c6087bd2-bcf1-475f-9aab-7774295e730e\") " pod="openshift-marketplace/certified-operators-8k2cf" Mar 20 14:09:46 crc kubenswrapper[4973]: I0320 14:09:46.554546 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8k2cf" Mar 20 14:09:47 crc kubenswrapper[4973]: W0320 14:09:47.107395 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6087bd2_bcf1_475f_9aab_7774295e730e.slice/crio-c79b73398394b808a38747a27ee60dd95fb22925cdba22e6f68ddc19b16b1d7d WatchSource:0}: Error finding container c79b73398394b808a38747a27ee60dd95fb22925cdba22e6f68ddc19b16b1d7d: Status 404 returned error can't find the container with id c79b73398394b808a38747a27ee60dd95fb22925cdba22e6f68ddc19b16b1d7d Mar 20 14:09:47 crc kubenswrapper[4973]: I0320 14:09:47.109558 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8k2cf"] Mar 20 14:09:47 crc kubenswrapper[4973]: I0320 14:09:47.385865 4973 generic.go:334] "Generic (PLEG): container finished" podID="c6087bd2-bcf1-475f-9aab-7774295e730e" containerID="11383f19239a5ca94e419f61328ed4f4c4c7cd7f04fb2853d4a9ea60fdc375e4" exitCode=0 Mar 20 14:09:47 crc kubenswrapper[4973]: I0320 14:09:47.385936 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8k2cf" event={"ID":"c6087bd2-bcf1-475f-9aab-7774295e730e","Type":"ContainerDied","Data":"11383f19239a5ca94e419f61328ed4f4c4c7cd7f04fb2853d4a9ea60fdc375e4"} Mar 20 14:09:47 crc kubenswrapper[4973]: I0320 14:09:47.386198 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8k2cf" event={"ID":"c6087bd2-bcf1-475f-9aab-7774295e730e","Type":"ContainerStarted","Data":"c79b73398394b808a38747a27ee60dd95fb22925cdba22e6f68ddc19b16b1d7d"} Mar 20 14:09:49 crc kubenswrapper[4973]: I0320 14:09:49.408391 4973 generic.go:334] "Generic (PLEG): container finished" podID="c6087bd2-bcf1-475f-9aab-7774295e730e" containerID="c1216c769fa34b0823a5dc713f80042b0acabda932bfe871bc1797eb613d9a99" exitCode=0 Mar 20 14:09:49 crc kubenswrapper[4973]: I0320 
14:09:49.408597 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8k2cf" event={"ID":"c6087bd2-bcf1-475f-9aab-7774295e730e","Type":"ContainerDied","Data":"c1216c769fa34b0823a5dc713f80042b0acabda932bfe871bc1797eb613d9a99"} Mar 20 14:09:50 crc kubenswrapper[4973]: I0320 14:09:50.195318 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vm7fr" Mar 20 14:09:50 crc kubenswrapper[4973]: I0320 14:09:50.196059 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vm7fr" Mar 20 14:09:50 crc kubenswrapper[4973]: I0320 14:09:50.259033 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vm7fr" Mar 20 14:09:50 crc kubenswrapper[4973]: I0320 14:09:50.420989 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8k2cf" event={"ID":"c6087bd2-bcf1-475f-9aab-7774295e730e","Type":"ContainerStarted","Data":"37d21ebbf527e63bfa57333791324ea23e08cde3596f6adac06d7dac971677a5"} Mar 20 14:09:50 crc kubenswrapper[4973]: I0320 14:09:50.454520 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8k2cf" podStartSLOduration=1.99330604 podStartE2EDuration="4.454500558s" podCreationTimestamp="2026-03-20 14:09:46 +0000 UTC" firstStartedPulling="2026-03-20 14:09:47.387199914 +0000 UTC m=+2908.130869658" lastFinishedPulling="2026-03-20 14:09:49.848394422 +0000 UTC m=+2910.592064176" observedRunningTime="2026-03-20 14:09:50.447190399 +0000 UTC m=+2911.190860153" watchObservedRunningTime="2026-03-20 14:09:50.454500558 +0000 UTC m=+2911.198170302" Mar 20 14:09:50 crc kubenswrapper[4973]: I0320 14:09:50.485854 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vm7fr" Mar 20 
14:09:52 crc kubenswrapper[4973]: I0320 14:09:52.216719 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vm7fr"] Mar 20 14:09:52 crc kubenswrapper[4973]: I0320 14:09:52.437926 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vm7fr" podUID="1bf4e3dc-1619-4352-a116-cee4edacbc88" containerName="registry-server" containerID="cri-o://ed47c28c170f4e88aae6bae804fe50334e5bf1f61f04d1ccc9048eb61632c1fb" gracePeriod=2 Mar 20 14:09:52 crc kubenswrapper[4973]: I0320 14:09:52.946206 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vm7fr" Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.047377 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bf4e3dc-1619-4352-a116-cee4edacbc88-utilities\") pod \"1bf4e3dc-1619-4352-a116-cee4edacbc88\" (UID: \"1bf4e3dc-1619-4352-a116-cee4edacbc88\") " Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.048380 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bf4e3dc-1619-4352-a116-cee4edacbc88-catalog-content\") pod \"1bf4e3dc-1619-4352-a116-cee4edacbc88\" (UID: \"1bf4e3dc-1619-4352-a116-cee4edacbc88\") " Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.048600 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s86gl\" (UniqueName: \"kubernetes.io/projected/1bf4e3dc-1619-4352-a116-cee4edacbc88-kube-api-access-s86gl\") pod \"1bf4e3dc-1619-4352-a116-cee4edacbc88\" (UID: \"1bf4e3dc-1619-4352-a116-cee4edacbc88\") " Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.048671 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1bf4e3dc-1619-4352-a116-cee4edacbc88-utilities" (OuterVolumeSpecName: "utilities") pod "1bf4e3dc-1619-4352-a116-cee4edacbc88" (UID: "1bf4e3dc-1619-4352-a116-cee4edacbc88"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.054225 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1bf4e3dc-1619-4352-a116-cee4edacbc88-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.072174 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf4e3dc-1619-4352-a116-cee4edacbc88-kube-api-access-s86gl" (OuterVolumeSpecName: "kube-api-access-s86gl") pod "1bf4e3dc-1619-4352-a116-cee4edacbc88" (UID: "1bf4e3dc-1619-4352-a116-cee4edacbc88"). InnerVolumeSpecName "kube-api-access-s86gl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.078201 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1bf4e3dc-1619-4352-a116-cee4edacbc88-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1bf4e3dc-1619-4352-a116-cee4edacbc88" (UID: "1bf4e3dc-1619-4352-a116-cee4edacbc88"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.155614 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1bf4e3dc-1619-4352-a116-cee4edacbc88-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.155651 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s86gl\" (UniqueName: \"kubernetes.io/projected/1bf4e3dc-1619-4352-a116-cee4edacbc88-kube-api-access-s86gl\") on node \"crc\" DevicePath \"\"" Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.450443 4973 generic.go:334] "Generic (PLEG): container finished" podID="1bf4e3dc-1619-4352-a116-cee4edacbc88" containerID="ed47c28c170f4e88aae6bae804fe50334e5bf1f61f04d1ccc9048eb61632c1fb" exitCode=0 Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.450486 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vm7fr" event={"ID":"1bf4e3dc-1619-4352-a116-cee4edacbc88","Type":"ContainerDied","Data":"ed47c28c170f4e88aae6bae804fe50334e5bf1f61f04d1ccc9048eb61632c1fb"} Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.450521 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vm7fr" event={"ID":"1bf4e3dc-1619-4352-a116-cee4edacbc88","Type":"ContainerDied","Data":"75a560014f0b09964cd1d2a1de024e41b7951fd37f686c79b3d71ad96e0e5c21"} Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.450539 4973 scope.go:117] "RemoveContainer" containerID="ed47c28c170f4e88aae6bae804fe50334e5bf1f61f04d1ccc9048eb61632c1fb" Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.450565 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vm7fr" Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.473403 4973 scope.go:117] "RemoveContainer" containerID="e3481bcc66bfa95b406444005b72f475af949b15b8b5655bf911392ddb73a55a" Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.491962 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vm7fr"] Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.501417 4973 scope.go:117] "RemoveContainer" containerID="fafb92d5eea3e30c92f99fdc5b5f9ee1015d625c4fd413538e5037041c7a6b86" Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.506260 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vm7fr"] Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.556128 4973 scope.go:117] "RemoveContainer" containerID="ed47c28c170f4e88aae6bae804fe50334e5bf1f61f04d1ccc9048eb61632c1fb" Mar 20 14:09:53 crc kubenswrapper[4973]: E0320 14:09:53.558894 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed47c28c170f4e88aae6bae804fe50334e5bf1f61f04d1ccc9048eb61632c1fb\": container with ID starting with ed47c28c170f4e88aae6bae804fe50334e5bf1f61f04d1ccc9048eb61632c1fb not found: ID does not exist" containerID="ed47c28c170f4e88aae6bae804fe50334e5bf1f61f04d1ccc9048eb61632c1fb" Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.558926 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed47c28c170f4e88aae6bae804fe50334e5bf1f61f04d1ccc9048eb61632c1fb"} err="failed to get container status \"ed47c28c170f4e88aae6bae804fe50334e5bf1f61f04d1ccc9048eb61632c1fb\": rpc error: code = NotFound desc = could not find container \"ed47c28c170f4e88aae6bae804fe50334e5bf1f61f04d1ccc9048eb61632c1fb\": container with ID starting with ed47c28c170f4e88aae6bae804fe50334e5bf1f61f04d1ccc9048eb61632c1fb not found: 
ID does not exist" Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.558959 4973 scope.go:117] "RemoveContainer" containerID="e3481bcc66bfa95b406444005b72f475af949b15b8b5655bf911392ddb73a55a" Mar 20 14:09:53 crc kubenswrapper[4973]: E0320 14:09:53.559245 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3481bcc66bfa95b406444005b72f475af949b15b8b5655bf911392ddb73a55a\": container with ID starting with e3481bcc66bfa95b406444005b72f475af949b15b8b5655bf911392ddb73a55a not found: ID does not exist" containerID="e3481bcc66bfa95b406444005b72f475af949b15b8b5655bf911392ddb73a55a" Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.559271 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3481bcc66bfa95b406444005b72f475af949b15b8b5655bf911392ddb73a55a"} err="failed to get container status \"e3481bcc66bfa95b406444005b72f475af949b15b8b5655bf911392ddb73a55a\": rpc error: code = NotFound desc = could not find container \"e3481bcc66bfa95b406444005b72f475af949b15b8b5655bf911392ddb73a55a\": container with ID starting with e3481bcc66bfa95b406444005b72f475af949b15b8b5655bf911392ddb73a55a not found: ID does not exist" Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.559287 4973 scope.go:117] "RemoveContainer" containerID="fafb92d5eea3e30c92f99fdc5b5f9ee1015d625c4fd413538e5037041c7a6b86" Mar 20 14:09:53 crc kubenswrapper[4973]: E0320 14:09:53.559626 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fafb92d5eea3e30c92f99fdc5b5f9ee1015d625c4fd413538e5037041c7a6b86\": container with ID starting with fafb92d5eea3e30c92f99fdc5b5f9ee1015d625c4fd413538e5037041c7a6b86 not found: ID does not exist" containerID="fafb92d5eea3e30c92f99fdc5b5f9ee1015d625c4fd413538e5037041c7a6b86" Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.559656 4973 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fafb92d5eea3e30c92f99fdc5b5f9ee1015d625c4fd413538e5037041c7a6b86"} err="failed to get container status \"fafb92d5eea3e30c92f99fdc5b5f9ee1015d625c4fd413538e5037041c7a6b86\": rpc error: code = NotFound desc = could not find container \"fafb92d5eea3e30c92f99fdc5b5f9ee1015d625c4fd413538e5037041c7a6b86\": container with ID starting with fafb92d5eea3e30c92f99fdc5b5f9ee1015d625c4fd413538e5037041c7a6b86 not found: ID does not exist" Mar 20 14:09:53 crc kubenswrapper[4973]: I0320 14:09:53.963192 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf4e3dc-1619-4352-a116-cee4edacbc88" path="/var/lib/kubelet/pods/1bf4e3dc-1619-4352-a116-cee4edacbc88/volumes" Mar 20 14:09:56 crc kubenswrapper[4973]: I0320 14:09:56.555008 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8k2cf" Mar 20 14:09:56 crc kubenswrapper[4973]: I0320 14:09:56.556363 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8k2cf" Mar 20 14:09:56 crc kubenswrapper[4973]: I0320 14:09:56.622363 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8k2cf" Mar 20 14:09:57 crc kubenswrapper[4973]: I0320 14:09:57.580411 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8k2cf" Mar 20 14:09:57 crc kubenswrapper[4973]: I0320 14:09:57.633583 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8k2cf"] Mar 20 14:09:59 crc kubenswrapper[4973]: I0320 14:09:59.554241 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8k2cf" podUID="c6087bd2-bcf1-475f-9aab-7774295e730e" containerName="registry-server" 
containerID="cri-o://37d21ebbf527e63bfa57333791324ea23e08cde3596f6adac06d7dac971677a5" gracePeriod=2 Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.103022 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8k2cf" Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.154685 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566930-fsj2r"] Mar 20 14:10:00 crc kubenswrapper[4973]: E0320 14:10:00.155233 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6087bd2-bcf1-475f-9aab-7774295e730e" containerName="extract-content" Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.155254 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6087bd2-bcf1-475f-9aab-7774295e730e" containerName="extract-content" Mar 20 14:10:00 crc kubenswrapper[4973]: E0320 14:10:00.155270 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bf4e3dc-1619-4352-a116-cee4edacbc88" containerName="extract-content" Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.155278 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bf4e3dc-1619-4352-a116-cee4edacbc88" containerName="extract-content" Mar 20 14:10:00 crc kubenswrapper[4973]: E0320 14:10:00.155296 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6087bd2-bcf1-475f-9aab-7774295e730e" containerName="extract-utilities" Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.155311 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6087bd2-bcf1-475f-9aab-7774295e730e" containerName="extract-utilities" Mar 20 14:10:00 crc kubenswrapper[4973]: E0320 14:10:00.155335 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6087bd2-bcf1-475f-9aab-7774295e730e" containerName="registry-server" Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.155355 4973 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c6087bd2-bcf1-475f-9aab-7774295e730e" containerName="registry-server" Mar 20 14:10:00 crc kubenswrapper[4973]: E0320 14:10:00.157863 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bf4e3dc-1619-4352-a116-cee4edacbc88" containerName="extract-utilities" Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.157891 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bf4e3dc-1619-4352-a116-cee4edacbc88" containerName="extract-utilities" Mar 20 14:10:00 crc kubenswrapper[4973]: E0320 14:10:00.157942 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bf4e3dc-1619-4352-a116-cee4edacbc88" containerName="registry-server" Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.157951 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bf4e3dc-1619-4352-a116-cee4edacbc88" containerName="registry-server" Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.158312 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bf4e3dc-1619-4352-a116-cee4edacbc88" containerName="registry-server" Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.158361 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6087bd2-bcf1-475f-9aab-7774295e730e" containerName="registry-server" Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.159254 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566930-fsj2r" Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.173947 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.174192 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.174331 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.184759 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566930-fsj2r"] Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.238225 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6087bd2-bcf1-475f-9aab-7774295e730e-catalog-content\") pod \"c6087bd2-bcf1-475f-9aab-7774295e730e\" (UID: \"c6087bd2-bcf1-475f-9aab-7774295e730e\") " Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.238589 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6087bd2-bcf1-475f-9aab-7774295e730e-utilities\") pod \"c6087bd2-bcf1-475f-9aab-7774295e730e\" (UID: \"c6087bd2-bcf1-475f-9aab-7774295e730e\") " Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.238958 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvhtf\" (UniqueName: \"kubernetes.io/projected/c6087bd2-bcf1-475f-9aab-7774295e730e-kube-api-access-rvhtf\") pod \"c6087bd2-bcf1-475f-9aab-7774295e730e\" (UID: \"c6087bd2-bcf1-475f-9aab-7774295e730e\") " Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.239715 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-2jrkh\" (UniqueName: \"kubernetes.io/projected/3b962ddb-93ff-417b-812f-1ae7022e90db-kube-api-access-2jrkh\") pod \"auto-csr-approver-29566930-fsj2r\" (UID: \"3b962ddb-93ff-417b-812f-1ae7022e90db\") " pod="openshift-infra/auto-csr-approver-29566930-fsj2r" Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.240190 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6087bd2-bcf1-475f-9aab-7774295e730e-utilities" (OuterVolumeSpecName: "utilities") pod "c6087bd2-bcf1-475f-9aab-7774295e730e" (UID: "c6087bd2-bcf1-475f-9aab-7774295e730e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.246675 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6087bd2-bcf1-475f-9aab-7774295e730e-kube-api-access-rvhtf" (OuterVolumeSpecName: "kube-api-access-rvhtf") pod "c6087bd2-bcf1-475f-9aab-7774295e730e" (UID: "c6087bd2-bcf1-475f-9aab-7774295e730e"). InnerVolumeSpecName "kube-api-access-rvhtf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.294255 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6087bd2-bcf1-475f-9aab-7774295e730e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c6087bd2-bcf1-475f-9aab-7774295e730e" (UID: "c6087bd2-bcf1-475f-9aab-7774295e730e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.342389 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jrkh\" (UniqueName: \"kubernetes.io/projected/3b962ddb-93ff-417b-812f-1ae7022e90db-kube-api-access-2jrkh\") pod \"auto-csr-approver-29566930-fsj2r\" (UID: \"3b962ddb-93ff-417b-812f-1ae7022e90db\") " pod="openshift-infra/auto-csr-approver-29566930-fsj2r"
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.342601 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvhtf\" (UniqueName: \"kubernetes.io/projected/c6087bd2-bcf1-475f-9aab-7774295e730e-kube-api-access-rvhtf\") on node \"crc\" DevicePath \"\""
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.342615 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c6087bd2-bcf1-475f-9aab-7774295e730e-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.342624 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c6087bd2-bcf1-475f-9aab-7774295e730e-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.371201 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jrkh\" (UniqueName: \"kubernetes.io/projected/3b962ddb-93ff-417b-812f-1ae7022e90db-kube-api-access-2jrkh\") pod \"auto-csr-approver-29566930-fsj2r\" (UID: \"3b962ddb-93ff-417b-812f-1ae7022e90db\") " pod="openshift-infra/auto-csr-approver-29566930-fsj2r"
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.500361 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566930-fsj2r"
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.587332 4973 generic.go:334] "Generic (PLEG): container finished" podID="c6087bd2-bcf1-475f-9aab-7774295e730e" containerID="37d21ebbf527e63bfa57333791324ea23e08cde3596f6adac06d7dac971677a5" exitCode=0
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.587481 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8k2cf"
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.587524 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8k2cf" event={"ID":"c6087bd2-bcf1-475f-9aab-7774295e730e","Type":"ContainerDied","Data":"37d21ebbf527e63bfa57333791324ea23e08cde3596f6adac06d7dac971677a5"}
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.587896 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8k2cf" event={"ID":"c6087bd2-bcf1-475f-9aab-7774295e730e","Type":"ContainerDied","Data":"c79b73398394b808a38747a27ee60dd95fb22925cdba22e6f68ddc19b16b1d7d"}
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.587923 4973 scope.go:117] "RemoveContainer" containerID="37d21ebbf527e63bfa57333791324ea23e08cde3596f6adac06d7dac971677a5"
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.622577 4973 scope.go:117] "RemoveContainer" containerID="c1216c769fa34b0823a5dc713f80042b0acabda932bfe871bc1797eb613d9a99"
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.638805 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8k2cf"]
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.657562 4973 scope.go:117] "RemoveContainer" containerID="11383f19239a5ca94e419f61328ed4f4c4c7cd7f04fb2853d4a9ea60fdc375e4"
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.662456 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8k2cf"]
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.745379 4973 scope.go:117] "RemoveContainer" containerID="37d21ebbf527e63bfa57333791324ea23e08cde3596f6adac06d7dac971677a5"
Mar 20 14:10:00 crc kubenswrapper[4973]: E0320 14:10:00.747204 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37d21ebbf527e63bfa57333791324ea23e08cde3596f6adac06d7dac971677a5\": container with ID starting with 37d21ebbf527e63bfa57333791324ea23e08cde3596f6adac06d7dac971677a5 not found: ID does not exist" containerID="37d21ebbf527e63bfa57333791324ea23e08cde3596f6adac06d7dac971677a5"
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.747239 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37d21ebbf527e63bfa57333791324ea23e08cde3596f6adac06d7dac971677a5"} err="failed to get container status \"37d21ebbf527e63bfa57333791324ea23e08cde3596f6adac06d7dac971677a5\": rpc error: code = NotFound desc = could not find container \"37d21ebbf527e63bfa57333791324ea23e08cde3596f6adac06d7dac971677a5\": container with ID starting with 37d21ebbf527e63bfa57333791324ea23e08cde3596f6adac06d7dac971677a5 not found: ID does not exist"
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.747275 4973 scope.go:117] "RemoveContainer" containerID="c1216c769fa34b0823a5dc713f80042b0acabda932bfe871bc1797eb613d9a99"
Mar 20 14:10:00 crc kubenswrapper[4973]: E0320 14:10:00.747972 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1216c769fa34b0823a5dc713f80042b0acabda932bfe871bc1797eb613d9a99\": container with ID starting with c1216c769fa34b0823a5dc713f80042b0acabda932bfe871bc1797eb613d9a99 not found: ID does not exist" containerID="c1216c769fa34b0823a5dc713f80042b0acabda932bfe871bc1797eb613d9a99"
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.748001 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1216c769fa34b0823a5dc713f80042b0acabda932bfe871bc1797eb613d9a99"} err="failed to get container status \"c1216c769fa34b0823a5dc713f80042b0acabda932bfe871bc1797eb613d9a99\": rpc error: code = NotFound desc = could not find container \"c1216c769fa34b0823a5dc713f80042b0acabda932bfe871bc1797eb613d9a99\": container with ID starting with c1216c769fa34b0823a5dc713f80042b0acabda932bfe871bc1797eb613d9a99 not found: ID does not exist"
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.748017 4973 scope.go:117] "RemoveContainer" containerID="11383f19239a5ca94e419f61328ed4f4c4c7cd7f04fb2853d4a9ea60fdc375e4"
Mar 20 14:10:00 crc kubenswrapper[4973]: E0320 14:10:00.749158 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11383f19239a5ca94e419f61328ed4f4c4c7cd7f04fb2853d4a9ea60fdc375e4\": container with ID starting with 11383f19239a5ca94e419f61328ed4f4c4c7cd7f04fb2853d4a9ea60fdc375e4 not found: ID does not exist" containerID="11383f19239a5ca94e419f61328ed4f4c4c7cd7f04fb2853d4a9ea60fdc375e4"
Mar 20 14:10:00 crc kubenswrapper[4973]: I0320 14:10:00.749183 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11383f19239a5ca94e419f61328ed4f4c4c7cd7f04fb2853d4a9ea60fdc375e4"} err="failed to get container status \"11383f19239a5ca94e419f61328ed4f4c4c7cd7f04fb2853d4a9ea60fdc375e4\": rpc error: code = NotFound desc = could not find container \"11383f19239a5ca94e419f61328ed4f4c4c7cd7f04fb2853d4a9ea60fdc375e4\": container with ID starting with 11383f19239a5ca94e419f61328ed4f4c4c7cd7f04fb2853d4a9ea60fdc375e4 not found: ID does not exist"
Mar 20 14:10:01 crc kubenswrapper[4973]: I0320 14:10:01.086414 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566930-fsj2r"]
Mar 20 14:10:01 crc kubenswrapper[4973]: I0320 14:10:01.603066 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566930-fsj2r" event={"ID":"3b962ddb-93ff-417b-812f-1ae7022e90db","Type":"ContainerStarted","Data":"72f626bb50273e423e0ba8c79490bbdb83a500cfa81536e8e72c12bae948ce3c"}
Mar 20 14:10:01 crc kubenswrapper[4973]: I0320 14:10:01.964319 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6087bd2-bcf1-475f-9aab-7774295e730e" path="/var/lib/kubelet/pods/c6087bd2-bcf1-475f-9aab-7774295e730e/volumes"
Mar 20 14:10:03 crc kubenswrapper[4973]: I0320 14:10:03.635333 4973 generic.go:334] "Generic (PLEG): container finished" podID="3b962ddb-93ff-417b-812f-1ae7022e90db" containerID="58d057e5a7b6cafbc68a0ecb8cd8418eab3ec9f68ac8942c094282cded3f5563" exitCode=0
Mar 20 14:10:03 crc kubenswrapper[4973]: I0320 14:10:03.636050 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566930-fsj2r" event={"ID":"3b962ddb-93ff-417b-812f-1ae7022e90db","Type":"ContainerDied","Data":"58d057e5a7b6cafbc68a0ecb8cd8418eab3ec9f68ac8942c094282cded3f5563"}
Mar 20 14:10:05 crc kubenswrapper[4973]: I0320 14:10:05.108664 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566930-fsj2r"
Mar 20 14:10:05 crc kubenswrapper[4973]: I0320 14:10:05.277522 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jrkh\" (UniqueName: \"kubernetes.io/projected/3b962ddb-93ff-417b-812f-1ae7022e90db-kube-api-access-2jrkh\") pod \"3b962ddb-93ff-417b-812f-1ae7022e90db\" (UID: \"3b962ddb-93ff-417b-812f-1ae7022e90db\") "
Mar 20 14:10:05 crc kubenswrapper[4973]: I0320 14:10:05.283148 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b962ddb-93ff-417b-812f-1ae7022e90db-kube-api-access-2jrkh" (OuterVolumeSpecName: "kube-api-access-2jrkh") pod "3b962ddb-93ff-417b-812f-1ae7022e90db" (UID: "3b962ddb-93ff-417b-812f-1ae7022e90db"). InnerVolumeSpecName "kube-api-access-2jrkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:10:05 crc kubenswrapper[4973]: I0320 14:10:05.380929 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jrkh\" (UniqueName: \"kubernetes.io/projected/3b962ddb-93ff-417b-812f-1ae7022e90db-kube-api-access-2jrkh\") on node \"crc\" DevicePath \"\""
Mar 20 14:10:05 crc kubenswrapper[4973]: I0320 14:10:05.662168 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566930-fsj2r" event={"ID":"3b962ddb-93ff-417b-812f-1ae7022e90db","Type":"ContainerDied","Data":"72f626bb50273e423e0ba8c79490bbdb83a500cfa81536e8e72c12bae948ce3c"}
Mar 20 14:10:05 crc kubenswrapper[4973]: I0320 14:10:05.662566 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72f626bb50273e423e0ba8c79490bbdb83a500cfa81536e8e72c12bae948ce3c"
Mar 20 14:10:05 crc kubenswrapper[4973]: I0320 14:10:05.662209 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566930-fsj2r"
Mar 20 14:10:06 crc kubenswrapper[4973]: I0320 14:10:06.190387 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566924-c94kj"]
Mar 20 14:10:06 crc kubenswrapper[4973]: I0320 14:10:06.201384 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566924-c94kj"]
Mar 20 14:10:07 crc kubenswrapper[4973]: I0320 14:10:07.965110 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3be1f4bf-b422-46df-a681-53453caf47bb" path="/var/lib/kubelet/pods/3be1f4bf-b422-46df-a681-53453caf47bb/volumes"
Mar 20 14:10:13 crc kubenswrapper[4973]: I0320 14:10:13.320671 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:10:13 crc kubenswrapper[4973]: I0320 14:10:13.321247 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:10:14 crc kubenswrapper[4973]: I0320 14:10:14.766787 4973 generic.go:334] "Generic (PLEG): container finished" podID="f2cb943f-f811-4eea-b860-1c19a6137dbb" containerID="43f23358c3636e303074d8f658b2311591da53a4d914e31c5dcb43c401c13edf" exitCode=0
Mar 20 14:10:14 crc kubenswrapper[4973]: I0320 14:10:14.766829 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" event={"ID":"f2cb943f-f811-4eea-b860-1c19a6137dbb","Type":"ContainerDied","Data":"43f23358c3636e303074d8f658b2311591da53a4d914e31c5dcb43c401c13edf"}
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.341219 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.469274 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-0\") pod \"f2cb943f-f811-4eea-b860-1c19a6137dbb\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") "
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.469712 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-migration-ssh-key-0\") pod \"f2cb943f-f811-4eea-b860-1c19a6137dbb\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") "
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.469823 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-ssh-key-openstack-edpm-ipam\") pod \"f2cb943f-f811-4eea-b860-1c19a6137dbb\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") "
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.469929 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-1\") pod \"f2cb943f-f811-4eea-b860-1c19a6137dbb\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") "
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.469989 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-2\") pod \"f2cb943f-f811-4eea-b860-1c19a6137dbb\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") "
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.470010 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-migration-ssh-key-1\") pod \"f2cb943f-f811-4eea-b860-1c19a6137dbb\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") "
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.470154 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhggl\" (UniqueName: \"kubernetes.io/projected/f2cb943f-f811-4eea-b860-1c19a6137dbb-kube-api-access-mhggl\") pod \"f2cb943f-f811-4eea-b860-1c19a6137dbb\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") "
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.470264 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-combined-ca-bundle\") pod \"f2cb943f-f811-4eea-b860-1c19a6137dbb\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") "
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.470292 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-inventory\") pod \"f2cb943f-f811-4eea-b860-1c19a6137dbb\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") "
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.470330 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-3\") pod \"f2cb943f-f811-4eea-b860-1c19a6137dbb\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") "
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.470398 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-extra-config-0\") pod \"f2cb943f-f811-4eea-b860-1c19a6137dbb\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") "
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.532492 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2cb943f-f811-4eea-b860-1c19a6137dbb-kube-api-access-mhggl" (OuterVolumeSpecName: "kube-api-access-mhggl") pod "f2cb943f-f811-4eea-b860-1c19a6137dbb" (UID: "f2cb943f-f811-4eea-b860-1c19a6137dbb"). InnerVolumeSpecName "kube-api-access-mhggl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.534217 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "f2cb943f-f811-4eea-b860-1c19a6137dbb" (UID: "f2cb943f-f811-4eea-b860-1c19a6137dbb"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.546573 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "f2cb943f-f811-4eea-b860-1c19a6137dbb" (UID: "f2cb943f-f811-4eea-b860-1c19a6137dbb"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.560767 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "f2cb943f-f811-4eea-b860-1c19a6137dbb" (UID: "f2cb943f-f811-4eea-b860-1c19a6137dbb"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.563111 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "f2cb943f-f811-4eea-b860-1c19a6137dbb" (UID: "f2cb943f-f811-4eea-b860-1c19a6137dbb"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.567096 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "f2cb943f-f811-4eea-b860-1c19a6137dbb" (UID: "f2cb943f-f811-4eea-b860-1c19a6137dbb"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.573433 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "f2cb943f-f811-4eea-b860-1c19a6137dbb" (UID: "f2cb943f-f811-4eea-b860-1c19a6137dbb"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.573575 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-1\") pod \"f2cb943f-f811-4eea-b860-1c19a6137dbb\" (UID: \"f2cb943f-f811-4eea-b860-1c19a6137dbb\") "
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.574505 4973 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.574526 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhggl\" (UniqueName: \"kubernetes.io/projected/f2cb943f-f811-4eea-b860-1c19a6137dbb-kube-api-access-mhggl\") on node \"crc\" DevicePath \"\""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.574536 4973 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.574546 4973 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.574558 4973 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-extra-config-0\") on node \"crc\" DevicePath \"\""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.574572 4973 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\""
Mar 20 14:10:16 crc kubenswrapper[4973]: W0320 14:10:16.574663 4973 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f2cb943f-f811-4eea-b860-1c19a6137dbb/volumes/kubernetes.io~secret/nova-cell1-compute-config-1
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.574675 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "f2cb943f-f811-4eea-b860-1c19a6137dbb" (UID: "f2cb943f-f811-4eea-b860-1c19a6137dbb"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.584724 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f2cb943f-f811-4eea-b860-1c19a6137dbb" (UID: "f2cb943f-f811-4eea-b860-1c19a6137dbb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.601901 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "f2cb943f-f811-4eea-b860-1c19a6137dbb" (UID: "f2cb943f-f811-4eea-b860-1c19a6137dbb"). InnerVolumeSpecName "nova-cell1-compute-config-2". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.603328 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-inventory" (OuterVolumeSpecName: "inventory") pod "f2cb943f-f811-4eea-b860-1c19a6137dbb" (UID: "f2cb943f-f811-4eea-b860-1c19a6137dbb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.605386 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "f2cb943f-f811-4eea-b860-1c19a6137dbb" (UID: "f2cb943f-f811-4eea-b860-1c19a6137dbb"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.677108 4973 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-inventory\") on node \"crc\" DevicePath \"\""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.677138 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.677148 4973 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.677157 4973 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.677165 4973 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f2cb943f-f811-4eea-b860-1c19a6137dbb-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\""
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.789348 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv" event={"ID":"f2cb943f-f811-4eea-b860-1c19a6137dbb","Type":"ContainerDied","Data":"9a963f0d17603e82c9076879e437a45a930265c82058514a36fd7edd5886fba7"}
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.789390 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a963f0d17603e82c9076879e437a45a930265c82058514a36fd7edd5886fba7"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.789426 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-zrpbv"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.889639 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"]
Mar 20 14:10:16 crc kubenswrapper[4973]: E0320 14:10:16.890292 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b962ddb-93ff-417b-812f-1ae7022e90db" containerName="oc"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.890319 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b962ddb-93ff-417b-812f-1ae7022e90db" containerName="oc"
Mar 20 14:10:16 crc kubenswrapper[4973]: E0320 14:10:16.890387 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2cb943f-f811-4eea-b860-1c19a6137dbb" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.890398 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2cb943f-f811-4eea-b860-1c19a6137dbb" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.890679 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2cb943f-f811-4eea-b860-1c19a6137dbb" containerName="nova-edpm-deployment-openstack-edpm-ipam"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.890718 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b962ddb-93ff-417b-812f-1ae7022e90db" containerName="oc"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.892258 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.894314 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.895047 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.895273 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d4wjc"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.897816 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.899598 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.912670 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"]
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.986572 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.986736 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86lqs\" (UniqueName: \"kubernetes.io/projected/7038f01c-aeff-4322-bfd1-715445d5d1cb-kube-api-access-86lqs\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.986777 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.986813 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.987103 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.987401 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:16 crc kubenswrapper[4973]: I0320 14:10:16.987434 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:17 crc kubenswrapper[4973]: I0320 14:10:17.090188 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:17 crc kubenswrapper[4973]: I0320 14:10:17.090638 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:17 crc kubenswrapper[4973]: I0320 14:10:17.090705 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:17 crc kubenswrapper[4973]: I0320 14:10:17.090800 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86lqs\" (UniqueName: \"kubernetes.io/projected/7038f01c-aeff-4322-bfd1-715445d5d1cb-kube-api-access-86lqs\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:17 crc kubenswrapper[4973]: I0320 14:10:17.090832 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:17 crc kubenswrapper[4973]: I0320 14:10:17.090862 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:17 crc kubenswrapper[4973]: I0320 14:10:17.091018 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:17 crc kubenswrapper[4973]: I0320 14:10:17.096180 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:17 crc kubenswrapper[4973]: I0320 14:10:17.096377 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:17 crc kubenswrapper[4973]: I0320 14:10:17.096661 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:17 crc kubenswrapper[4973]: I0320 14:10:17.096744 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:17 crc kubenswrapper[4973]: I0320 14:10:17.097071 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:17 crc kubenswrapper[4973]: I0320 14:10:17.097217 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:17 crc kubenswrapper[4973]: I0320 14:10:17.111507 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86lqs\" (UniqueName: \"kubernetes.io/projected/7038f01c-aeff-4322-bfd1-715445d5d1cb-kube-api-access-86lqs\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"
Mar 20 14:10:17 crc kubenswrapper[4973]: I0320 14:10:17.261280 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q" Mar 20 14:10:17 crc kubenswrapper[4973]: I0320 14:10:17.837851 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q"] Mar 20 14:10:18 crc kubenswrapper[4973]: I0320 14:10:18.813058 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q" event={"ID":"7038f01c-aeff-4322-bfd1-715445d5d1cb","Type":"ContainerStarted","Data":"79832105df4a0f6a5f8cdc3d4fdc3cf7e22c5fdd85ef64a73af61e8c0d0b3dd7"} Mar 20 14:10:19 crc kubenswrapper[4973]: I0320 14:10:19.824857 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q" event={"ID":"7038f01c-aeff-4322-bfd1-715445d5d1cb","Type":"ContainerStarted","Data":"ed08d18b1140c1c4489c3a75a91ec79fdd0aa6a6aa436f59d50e811e79ed22bf"} Mar 20 14:10:19 crc kubenswrapper[4973]: I0320 14:10:19.841148 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q" podStartSLOduration=2.824833403 podStartE2EDuration="3.841125877s" podCreationTimestamp="2026-03-20 14:10:16 +0000 UTC" firstStartedPulling="2026-03-20 14:10:17.840528993 +0000 UTC m=+2938.584198737" lastFinishedPulling="2026-03-20 14:10:18.856821467 +0000 UTC m=+2939.600491211" observedRunningTime="2026-03-20 14:10:19.839531974 +0000 UTC m=+2940.583201718" watchObservedRunningTime="2026-03-20 14:10:19.841125877 +0000 UTC m=+2940.584795621" Mar 20 14:10:43 crc kubenswrapper[4973]: I0320 14:10:43.321410 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:10:43 crc kubenswrapper[4973]: 
I0320 14:10:43.321845 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:11:01 crc kubenswrapper[4973]: I0320 14:11:01.844555 4973 scope.go:117] "RemoveContainer" containerID="81c24dc12fc06cc4d163d004e4cac1e0738a8d150ad590fbe9ad5ee130c08875" Mar 20 14:11:13 crc kubenswrapper[4973]: I0320 14:11:13.322156 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:11:13 crc kubenswrapper[4973]: I0320 14:11:13.322742 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:11:13 crc kubenswrapper[4973]: I0320 14:11:13.322789 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 14:11:13 crc kubenswrapper[4973]: I0320 14:11:13.323738 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:11:13 crc kubenswrapper[4973]: I0320 14:11:13.323788 4973 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" containerID="cri-o://eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" gracePeriod=600 Mar 20 14:11:13 crc kubenswrapper[4973]: E0320 14:11:13.455837 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:11:14 crc kubenswrapper[4973]: I0320 14:11:14.405649 4973 generic.go:334] "Generic (PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" exitCode=0 Mar 20 14:11:14 crc kubenswrapper[4973]: I0320 14:11:14.405752 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c"} Mar 20 14:11:14 crc kubenswrapper[4973]: I0320 14:11:14.405999 4973 scope.go:117] "RemoveContainer" containerID="b01dc0fd273ee1427bf99791c14487bb944cb9cd6409ef35b5ca274d472b4fca" Mar 20 14:11:14 crc kubenswrapper[4973]: I0320 14:11:14.406489 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:11:14 crc kubenswrapper[4973]: E0320 14:11:14.406795 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting 
failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:11:24 crc kubenswrapper[4973]: I0320 14:11:24.951989 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:11:24 crc kubenswrapper[4973]: E0320 14:11:24.953205 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:11:26 crc kubenswrapper[4973]: I0320 14:11:26.980878 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-5q9c4"] Mar 20 14:11:26 crc kubenswrapper[4973]: I0320 14:11:26.983956 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5q9c4" Mar 20 14:11:26 crc kubenswrapper[4973]: I0320 14:11:26.997241 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5q9c4"] Mar 20 14:11:27 crc kubenswrapper[4973]: I0320 14:11:27.150937 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e057a3ae-d02f-42b8-ab09-8366b7c85c61-catalog-content\") pod \"community-operators-5q9c4\" (UID: \"e057a3ae-d02f-42b8-ab09-8366b7c85c61\") " pod="openshift-marketplace/community-operators-5q9c4" Mar 20 14:11:27 crc kubenswrapper[4973]: I0320 14:11:27.151110 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8km5\" (UniqueName: \"kubernetes.io/projected/e057a3ae-d02f-42b8-ab09-8366b7c85c61-kube-api-access-s8km5\") pod \"community-operators-5q9c4\" (UID: \"e057a3ae-d02f-42b8-ab09-8366b7c85c61\") " pod="openshift-marketplace/community-operators-5q9c4" Mar 20 14:11:27 crc kubenswrapper[4973]: I0320 14:11:27.151161 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e057a3ae-d02f-42b8-ab09-8366b7c85c61-utilities\") pod \"community-operators-5q9c4\" (UID: \"e057a3ae-d02f-42b8-ab09-8366b7c85c61\") " pod="openshift-marketplace/community-operators-5q9c4" Mar 20 14:11:27 crc kubenswrapper[4973]: I0320 14:11:27.253156 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8km5\" (UniqueName: \"kubernetes.io/projected/e057a3ae-d02f-42b8-ab09-8366b7c85c61-kube-api-access-s8km5\") pod \"community-operators-5q9c4\" (UID: \"e057a3ae-d02f-42b8-ab09-8366b7c85c61\") " pod="openshift-marketplace/community-operators-5q9c4" Mar 20 14:11:27 crc kubenswrapper[4973]: I0320 14:11:27.253236 4973 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e057a3ae-d02f-42b8-ab09-8366b7c85c61-utilities\") pod \"community-operators-5q9c4\" (UID: \"e057a3ae-d02f-42b8-ab09-8366b7c85c61\") " pod="openshift-marketplace/community-operators-5q9c4" Mar 20 14:11:27 crc kubenswrapper[4973]: I0320 14:11:27.253413 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e057a3ae-d02f-42b8-ab09-8366b7c85c61-catalog-content\") pod \"community-operators-5q9c4\" (UID: \"e057a3ae-d02f-42b8-ab09-8366b7c85c61\") " pod="openshift-marketplace/community-operators-5q9c4" Mar 20 14:11:27 crc kubenswrapper[4973]: I0320 14:11:27.254013 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e057a3ae-d02f-42b8-ab09-8366b7c85c61-catalog-content\") pod \"community-operators-5q9c4\" (UID: \"e057a3ae-d02f-42b8-ab09-8366b7c85c61\") " pod="openshift-marketplace/community-operators-5q9c4" Mar 20 14:11:27 crc kubenswrapper[4973]: I0320 14:11:27.254032 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e057a3ae-d02f-42b8-ab09-8366b7c85c61-utilities\") pod \"community-operators-5q9c4\" (UID: \"e057a3ae-d02f-42b8-ab09-8366b7c85c61\") " pod="openshift-marketplace/community-operators-5q9c4" Mar 20 14:11:27 crc kubenswrapper[4973]: I0320 14:11:27.274048 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8km5\" (UniqueName: \"kubernetes.io/projected/e057a3ae-d02f-42b8-ab09-8366b7c85c61-kube-api-access-s8km5\") pod \"community-operators-5q9c4\" (UID: \"e057a3ae-d02f-42b8-ab09-8366b7c85c61\") " pod="openshift-marketplace/community-operators-5q9c4" Mar 20 14:11:27 crc kubenswrapper[4973]: I0320 14:11:27.310601 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5q9c4" Mar 20 14:11:27 crc kubenswrapper[4973]: I0320 14:11:27.879493 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-5q9c4"] Mar 20 14:11:28 crc kubenswrapper[4973]: I0320 14:11:28.585390 4973 generic.go:334] "Generic (PLEG): container finished" podID="e057a3ae-d02f-42b8-ab09-8366b7c85c61" containerID="b5a81d246d54e7b3415f52e8987d6fe59dff161bebd7eeaf24a7545b66bf0ee3" exitCode=0 Mar 20 14:11:28 crc kubenswrapper[4973]: I0320 14:11:28.585487 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q9c4" event={"ID":"e057a3ae-d02f-42b8-ab09-8366b7c85c61","Type":"ContainerDied","Data":"b5a81d246d54e7b3415f52e8987d6fe59dff161bebd7eeaf24a7545b66bf0ee3"} Mar 20 14:11:28 crc kubenswrapper[4973]: I0320 14:11:28.585763 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q9c4" event={"ID":"e057a3ae-d02f-42b8-ab09-8366b7c85c61","Type":"ContainerStarted","Data":"53465866a5e628788d8a74fefae25ed50f87cd5f1aca32552f09ba231ebcd8ab"} Mar 20 14:11:29 crc kubenswrapper[4973]: I0320 14:11:29.598989 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q9c4" event={"ID":"e057a3ae-d02f-42b8-ab09-8366b7c85c61","Type":"ContainerStarted","Data":"41697004ce71f913beb7bd6f03e7c544a0b850484807e037237d34d811e1f81e"} Mar 20 14:11:31 crc kubenswrapper[4973]: I0320 14:11:31.622812 4973 generic.go:334] "Generic (PLEG): container finished" podID="e057a3ae-d02f-42b8-ab09-8366b7c85c61" containerID="41697004ce71f913beb7bd6f03e7c544a0b850484807e037237d34d811e1f81e" exitCode=0 Mar 20 14:11:31 crc kubenswrapper[4973]: I0320 14:11:31.622960 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q9c4" 
event={"ID":"e057a3ae-d02f-42b8-ab09-8366b7c85c61","Type":"ContainerDied","Data":"41697004ce71f913beb7bd6f03e7c544a0b850484807e037237d34d811e1f81e"} Mar 20 14:11:32 crc kubenswrapper[4973]: I0320 14:11:32.635600 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q9c4" event={"ID":"e057a3ae-d02f-42b8-ab09-8366b7c85c61","Type":"ContainerStarted","Data":"bc3065844d18256943c9e28466881f614fc70c6f827b651070b6453328b021fd"} Mar 20 14:11:32 crc kubenswrapper[4973]: I0320 14:11:32.661666 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-5q9c4" podStartSLOduration=2.893427291 podStartE2EDuration="6.66164312s" podCreationTimestamp="2026-03-20 14:11:26 +0000 UTC" firstStartedPulling="2026-03-20 14:11:28.586964495 +0000 UTC m=+3009.330634239" lastFinishedPulling="2026-03-20 14:11:32.355180324 +0000 UTC m=+3013.098850068" observedRunningTime="2026-03-20 14:11:32.653552649 +0000 UTC m=+3013.397222393" watchObservedRunningTime="2026-03-20 14:11:32.66164312 +0000 UTC m=+3013.405312864" Mar 20 14:11:37 crc kubenswrapper[4973]: I0320 14:11:37.311498 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-5q9c4" Mar 20 14:11:37 crc kubenswrapper[4973]: I0320 14:11:37.312571 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-5q9c4" Mar 20 14:11:37 crc kubenswrapper[4973]: I0320 14:11:37.364100 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-5q9c4" Mar 20 14:11:37 crc kubenswrapper[4973]: I0320 14:11:37.731333 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-5q9c4" Mar 20 14:11:37 crc kubenswrapper[4973]: I0320 14:11:37.951397 4973 scope.go:117] "RemoveContainer" 
containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:11:37 crc kubenswrapper[4973]: E0320 14:11:37.951756 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:11:40 crc kubenswrapper[4973]: I0320 14:11:40.972294 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5q9c4"] Mar 20 14:11:40 crc kubenswrapper[4973]: I0320 14:11:40.972951 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-5q9c4" podUID="e057a3ae-d02f-42b8-ab09-8366b7c85c61" containerName="registry-server" containerID="cri-o://bc3065844d18256943c9e28466881f614fc70c6f827b651070b6453328b021fd" gracePeriod=2 Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.535398 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-5q9c4" Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.624430 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8km5\" (UniqueName: \"kubernetes.io/projected/e057a3ae-d02f-42b8-ab09-8366b7c85c61-kube-api-access-s8km5\") pod \"e057a3ae-d02f-42b8-ab09-8366b7c85c61\" (UID: \"e057a3ae-d02f-42b8-ab09-8366b7c85c61\") " Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.624531 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e057a3ae-d02f-42b8-ab09-8366b7c85c61-catalog-content\") pod \"e057a3ae-d02f-42b8-ab09-8366b7c85c61\" (UID: \"e057a3ae-d02f-42b8-ab09-8366b7c85c61\") " Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.624565 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e057a3ae-d02f-42b8-ab09-8366b7c85c61-utilities\") pod \"e057a3ae-d02f-42b8-ab09-8366b7c85c61\" (UID: \"e057a3ae-d02f-42b8-ab09-8366b7c85c61\") " Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.625639 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e057a3ae-d02f-42b8-ab09-8366b7c85c61-utilities" (OuterVolumeSpecName: "utilities") pod "e057a3ae-d02f-42b8-ab09-8366b7c85c61" (UID: "e057a3ae-d02f-42b8-ab09-8366b7c85c61"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.639625 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e057a3ae-d02f-42b8-ab09-8366b7c85c61-kube-api-access-s8km5" (OuterVolumeSpecName: "kube-api-access-s8km5") pod "e057a3ae-d02f-42b8-ab09-8366b7c85c61" (UID: "e057a3ae-d02f-42b8-ab09-8366b7c85c61"). InnerVolumeSpecName "kube-api-access-s8km5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.718256 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e057a3ae-d02f-42b8-ab09-8366b7c85c61-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e057a3ae-d02f-42b8-ab09-8366b7c85c61" (UID: "e057a3ae-d02f-42b8-ab09-8366b7c85c61"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.727486 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8km5\" (UniqueName: \"kubernetes.io/projected/e057a3ae-d02f-42b8-ab09-8366b7c85c61-kube-api-access-s8km5\") on node \"crc\" DevicePath \"\"" Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.727524 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e057a3ae-d02f-42b8-ab09-8366b7c85c61-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.727537 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e057a3ae-d02f-42b8-ab09-8366b7c85c61-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.731828 4973 generic.go:334] "Generic (PLEG): container finished" podID="e057a3ae-d02f-42b8-ab09-8366b7c85c61" containerID="bc3065844d18256943c9e28466881f614fc70c6f827b651070b6453328b021fd" exitCode=0 Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.731870 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-5q9c4" event={"ID":"e057a3ae-d02f-42b8-ab09-8366b7c85c61","Type":"ContainerDied","Data":"bc3065844d18256943c9e28466881f614fc70c6f827b651070b6453328b021fd"} Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.731902 4973 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-5q9c4" event={"ID":"e057a3ae-d02f-42b8-ab09-8366b7c85c61","Type":"ContainerDied","Data":"53465866a5e628788d8a74fefae25ed50f87cd5f1aca32552f09ba231ebcd8ab"} Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.731922 4973 scope.go:117] "RemoveContainer" containerID="bc3065844d18256943c9e28466881f614fc70c6f827b651070b6453328b021fd" Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.731945 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-5q9c4" Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.775650 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-5q9c4"] Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.785204 4973 scope.go:117] "RemoveContainer" containerID="41697004ce71f913beb7bd6f03e7c544a0b850484807e037237d34d811e1f81e" Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.791015 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-5q9c4"] Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.812920 4973 scope.go:117] "RemoveContainer" containerID="b5a81d246d54e7b3415f52e8987d6fe59dff161bebd7eeaf24a7545b66bf0ee3" Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.874324 4973 scope.go:117] "RemoveContainer" containerID="bc3065844d18256943c9e28466881f614fc70c6f827b651070b6453328b021fd" Mar 20 14:11:41 crc kubenswrapper[4973]: E0320 14:11:41.874783 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc3065844d18256943c9e28466881f614fc70c6f827b651070b6453328b021fd\": container with ID starting with bc3065844d18256943c9e28466881f614fc70c6f827b651070b6453328b021fd not found: ID does not exist" containerID="bc3065844d18256943c9e28466881f614fc70c6f827b651070b6453328b021fd" Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 
14:11:41.874826 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc3065844d18256943c9e28466881f614fc70c6f827b651070b6453328b021fd"} err="failed to get container status \"bc3065844d18256943c9e28466881f614fc70c6f827b651070b6453328b021fd\": rpc error: code = NotFound desc = could not find container \"bc3065844d18256943c9e28466881f614fc70c6f827b651070b6453328b021fd\": container with ID starting with bc3065844d18256943c9e28466881f614fc70c6f827b651070b6453328b021fd not found: ID does not exist" Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.874879 4973 scope.go:117] "RemoveContainer" containerID="41697004ce71f913beb7bd6f03e7c544a0b850484807e037237d34d811e1f81e" Mar 20 14:11:41 crc kubenswrapper[4973]: E0320 14:11:41.875209 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41697004ce71f913beb7bd6f03e7c544a0b850484807e037237d34d811e1f81e\": container with ID starting with 41697004ce71f913beb7bd6f03e7c544a0b850484807e037237d34d811e1f81e not found: ID does not exist" containerID="41697004ce71f913beb7bd6f03e7c544a0b850484807e037237d34d811e1f81e" Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.875281 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41697004ce71f913beb7bd6f03e7c544a0b850484807e037237d34d811e1f81e"} err="failed to get container status \"41697004ce71f913beb7bd6f03e7c544a0b850484807e037237d34d811e1f81e\": rpc error: code = NotFound desc = could not find container \"41697004ce71f913beb7bd6f03e7c544a0b850484807e037237d34d811e1f81e\": container with ID starting with 41697004ce71f913beb7bd6f03e7c544a0b850484807e037237d34d811e1f81e not found: ID does not exist" Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.875301 4973 scope.go:117] "RemoveContainer" containerID="b5a81d246d54e7b3415f52e8987d6fe59dff161bebd7eeaf24a7545b66bf0ee3" Mar 20 14:11:41 crc 
kubenswrapper[4973]: E0320 14:11:41.875598 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5a81d246d54e7b3415f52e8987d6fe59dff161bebd7eeaf24a7545b66bf0ee3\": container with ID starting with b5a81d246d54e7b3415f52e8987d6fe59dff161bebd7eeaf24a7545b66bf0ee3 not found: ID does not exist" containerID="b5a81d246d54e7b3415f52e8987d6fe59dff161bebd7eeaf24a7545b66bf0ee3" Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.875635 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5a81d246d54e7b3415f52e8987d6fe59dff161bebd7eeaf24a7545b66bf0ee3"} err="failed to get container status \"b5a81d246d54e7b3415f52e8987d6fe59dff161bebd7eeaf24a7545b66bf0ee3\": rpc error: code = NotFound desc = could not find container \"b5a81d246d54e7b3415f52e8987d6fe59dff161bebd7eeaf24a7545b66bf0ee3\": container with ID starting with b5a81d246d54e7b3415f52e8987d6fe59dff161bebd7eeaf24a7545b66bf0ee3 not found: ID does not exist" Mar 20 14:11:41 crc kubenswrapper[4973]: I0320 14:11:41.966333 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e057a3ae-d02f-42b8-ab09-8366b7c85c61" path="/var/lib/kubelet/pods/e057a3ae-d02f-42b8-ab09-8366b7c85c61/volumes" Mar 20 14:11:50 crc kubenswrapper[4973]: I0320 14:11:50.951683 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:11:50 crc kubenswrapper[4973]: E0320 14:11:50.953067 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:12:00 crc 
kubenswrapper[4973]: I0320 14:12:00.147813 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566932-xpxbx"] Mar 20 14:12:00 crc kubenswrapper[4973]: E0320 14:12:00.149981 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e057a3ae-d02f-42b8-ab09-8366b7c85c61" containerName="extract-utilities" Mar 20 14:12:00 crc kubenswrapper[4973]: I0320 14:12:00.150092 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e057a3ae-d02f-42b8-ab09-8366b7c85c61" containerName="extract-utilities" Mar 20 14:12:00 crc kubenswrapper[4973]: E0320 14:12:00.150195 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e057a3ae-d02f-42b8-ab09-8366b7c85c61" containerName="registry-server" Mar 20 14:12:00 crc kubenswrapper[4973]: I0320 14:12:00.150277 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e057a3ae-d02f-42b8-ab09-8366b7c85c61" containerName="registry-server" Mar 20 14:12:00 crc kubenswrapper[4973]: E0320 14:12:00.150411 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e057a3ae-d02f-42b8-ab09-8366b7c85c61" containerName="extract-content" Mar 20 14:12:00 crc kubenswrapper[4973]: I0320 14:12:00.150530 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e057a3ae-d02f-42b8-ab09-8366b7c85c61" containerName="extract-content" Mar 20 14:12:00 crc kubenswrapper[4973]: I0320 14:12:00.150898 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="e057a3ae-d02f-42b8-ab09-8366b7c85c61" containerName="registry-server" Mar 20 14:12:00 crc kubenswrapper[4973]: I0320 14:12:00.152133 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566932-xpxbx" Mar 20 14:12:00 crc kubenswrapper[4973]: I0320 14:12:00.154771 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:12:00 crc kubenswrapper[4973]: I0320 14:12:00.154901 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:12:00 crc kubenswrapper[4973]: I0320 14:12:00.155110 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:12:00 crc kubenswrapper[4973]: I0320 14:12:00.159540 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566932-xpxbx"] Mar 20 14:12:00 crc kubenswrapper[4973]: I0320 14:12:00.180430 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqb2h\" (UniqueName: \"kubernetes.io/projected/3bd7505e-95cc-46e7-bff1-53138b47efca-kube-api-access-mqb2h\") pod \"auto-csr-approver-29566932-xpxbx\" (UID: \"3bd7505e-95cc-46e7-bff1-53138b47efca\") " pod="openshift-infra/auto-csr-approver-29566932-xpxbx" Mar 20 14:12:00 crc kubenswrapper[4973]: I0320 14:12:00.283184 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqb2h\" (UniqueName: \"kubernetes.io/projected/3bd7505e-95cc-46e7-bff1-53138b47efca-kube-api-access-mqb2h\") pod \"auto-csr-approver-29566932-xpxbx\" (UID: \"3bd7505e-95cc-46e7-bff1-53138b47efca\") " pod="openshift-infra/auto-csr-approver-29566932-xpxbx" Mar 20 14:12:00 crc kubenswrapper[4973]: I0320 14:12:00.303321 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqb2h\" (UniqueName: \"kubernetes.io/projected/3bd7505e-95cc-46e7-bff1-53138b47efca-kube-api-access-mqb2h\") pod \"auto-csr-approver-29566932-xpxbx\" (UID: \"3bd7505e-95cc-46e7-bff1-53138b47efca\") " 
pod="openshift-infra/auto-csr-approver-29566932-xpxbx" Mar 20 14:12:00 crc kubenswrapper[4973]: I0320 14:12:00.484701 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566932-xpxbx" Mar 20 14:12:00 crc kubenswrapper[4973]: I0320 14:12:00.948058 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566932-xpxbx"] Mar 20 14:12:01 crc kubenswrapper[4973]: I0320 14:12:01.966053 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566932-xpxbx" event={"ID":"3bd7505e-95cc-46e7-bff1-53138b47efca","Type":"ContainerStarted","Data":"09f53dc981997ba29af78f03bc935346c3374f62cd349d04fa5a5146b31a6669"} Mar 20 14:12:05 crc kubenswrapper[4973]: I0320 14:12:05.006274 4973 generic.go:334] "Generic (PLEG): container finished" podID="3bd7505e-95cc-46e7-bff1-53138b47efca" containerID="ee41fd4dbeb100a91a3a97ff3a73a074fafe660ffa7721c71ae928a4376f7a31" exitCode=0 Mar 20 14:12:05 crc kubenswrapper[4973]: I0320 14:12:05.006413 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566932-xpxbx" event={"ID":"3bd7505e-95cc-46e7-bff1-53138b47efca","Type":"ContainerDied","Data":"ee41fd4dbeb100a91a3a97ff3a73a074fafe660ffa7721c71ae928a4376f7a31"} Mar 20 14:12:05 crc kubenswrapper[4973]: I0320 14:12:05.950878 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:12:05 crc kubenswrapper[4973]: E0320 14:12:05.953119 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" 
Mar 20 14:12:06 crc kubenswrapper[4973]: I0320 14:12:06.503900 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566932-xpxbx" Mar 20 14:12:06 crc kubenswrapper[4973]: I0320 14:12:06.563734 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqb2h\" (UniqueName: \"kubernetes.io/projected/3bd7505e-95cc-46e7-bff1-53138b47efca-kube-api-access-mqb2h\") pod \"3bd7505e-95cc-46e7-bff1-53138b47efca\" (UID: \"3bd7505e-95cc-46e7-bff1-53138b47efca\") " Mar 20 14:12:06 crc kubenswrapper[4973]: I0320 14:12:06.574581 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd7505e-95cc-46e7-bff1-53138b47efca-kube-api-access-mqb2h" (OuterVolumeSpecName: "kube-api-access-mqb2h") pod "3bd7505e-95cc-46e7-bff1-53138b47efca" (UID: "3bd7505e-95cc-46e7-bff1-53138b47efca"). InnerVolumeSpecName "kube-api-access-mqb2h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:12:06 crc kubenswrapper[4973]: I0320 14:12:06.667625 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqb2h\" (UniqueName: \"kubernetes.io/projected/3bd7505e-95cc-46e7-bff1-53138b47efca-kube-api-access-mqb2h\") on node \"crc\" DevicePath \"\"" Mar 20 14:12:07 crc kubenswrapper[4973]: I0320 14:12:07.026801 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566932-xpxbx" event={"ID":"3bd7505e-95cc-46e7-bff1-53138b47efca","Type":"ContainerDied","Data":"09f53dc981997ba29af78f03bc935346c3374f62cd349d04fa5a5146b31a6669"} Mar 20 14:12:07 crc kubenswrapper[4973]: I0320 14:12:07.026852 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566932-xpxbx" Mar 20 14:12:07 crc kubenswrapper[4973]: I0320 14:12:07.026859 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="09f53dc981997ba29af78f03bc935346c3374f62cd349d04fa5a5146b31a6669" Mar 20 14:12:07 crc kubenswrapper[4973]: I0320 14:12:07.575427 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566926-g6rlw"] Mar 20 14:12:07 crc kubenswrapper[4973]: I0320 14:12:07.585303 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566926-g6rlw"] Mar 20 14:12:07 crc kubenswrapper[4973]: I0320 14:12:07.962861 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da4eb1a8-7a68-4ad6-bc1d-0a316f90845e" path="/var/lib/kubelet/pods/da4eb1a8-7a68-4ad6-bc1d-0a316f90845e/volumes" Mar 20 14:12:20 crc kubenswrapper[4973]: I0320 14:12:20.951212 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:12:20 crc kubenswrapper[4973]: E0320 14:12:20.952159 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:12:35 crc kubenswrapper[4973]: I0320 14:12:35.950587 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:12:35 crc kubenswrapper[4973]: E0320 14:12:35.951813 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:12:44 crc kubenswrapper[4973]: I0320 14:12:44.424774 4973 generic.go:334] "Generic (PLEG): container finished" podID="7038f01c-aeff-4322-bfd1-715445d5d1cb" containerID="ed08d18b1140c1c4489c3a75a91ec79fdd0aa6a6aa436f59d50e811e79ed22bf" exitCode=0 Mar 20 14:12:44 crc kubenswrapper[4973]: I0320 14:12:44.424842 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q" event={"ID":"7038f01c-aeff-4322-bfd1-715445d5d1cb","Type":"ContainerDied","Data":"ed08d18b1140c1c4489c3a75a91ec79fdd0aa6a6aa436f59d50e811e79ed22bf"} Mar 20 14:12:45 crc kubenswrapper[4973]: I0320 14:12:45.961032 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.122676 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-telemetry-combined-ca-bundle\") pod \"7038f01c-aeff-4322-bfd1-715445d5d1cb\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.122967 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-86lqs\" (UniqueName: \"kubernetes.io/projected/7038f01c-aeff-4322-bfd1-715445d5d1cb-kube-api-access-86lqs\") pod \"7038f01c-aeff-4322-bfd1-715445d5d1cb\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.123210 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-inventory\") pod \"7038f01c-aeff-4322-bfd1-715445d5d1cb\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.123347 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ceilometer-compute-config-data-0\") pod \"7038f01c-aeff-4322-bfd1-715445d5d1cb\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.123442 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ceilometer-compute-config-data-1\") pod \"7038f01c-aeff-4322-bfd1-715445d5d1cb\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.123608 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ssh-key-openstack-edpm-ipam\") pod \"7038f01c-aeff-4322-bfd1-715445d5d1cb\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.123686 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ceilometer-compute-config-data-2\") pod \"7038f01c-aeff-4322-bfd1-715445d5d1cb\" (UID: \"7038f01c-aeff-4322-bfd1-715445d5d1cb\") " Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.139226 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod 
"7038f01c-aeff-4322-bfd1-715445d5d1cb" (UID: "7038f01c-aeff-4322-bfd1-715445d5d1cb"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.139464 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7038f01c-aeff-4322-bfd1-715445d5d1cb-kube-api-access-86lqs" (OuterVolumeSpecName: "kube-api-access-86lqs") pod "7038f01c-aeff-4322-bfd1-715445d5d1cb" (UID: "7038f01c-aeff-4322-bfd1-715445d5d1cb"). InnerVolumeSpecName "kube-api-access-86lqs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.160533 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-inventory" (OuterVolumeSpecName: "inventory") pod "7038f01c-aeff-4322-bfd1-715445d5d1cb" (UID: "7038f01c-aeff-4322-bfd1-715445d5d1cb"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.160988 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "7038f01c-aeff-4322-bfd1-715445d5d1cb" (UID: "7038f01c-aeff-4322-bfd1-715445d5d1cb"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.162175 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "7038f01c-aeff-4322-bfd1-715445d5d1cb" (UID: "7038f01c-aeff-4322-bfd1-715445d5d1cb"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.163409 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "7038f01c-aeff-4322-bfd1-715445d5d1cb" (UID: "7038f01c-aeff-4322-bfd1-715445d5d1cb"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.171315 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "7038f01c-aeff-4322-bfd1-715445d5d1cb" (UID: "7038f01c-aeff-4322-bfd1-715445d5d1cb"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.227163 4973 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.227197 4973 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.227207 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.227216 4973 
reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.227227 4973 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.227235 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-86lqs\" (UniqueName: \"kubernetes.io/projected/7038f01c-aeff-4322-bfd1-715445d5d1cb-kube-api-access-86lqs\") on node \"crc\" DevicePath \"\"" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.227244 4973 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7038f01c-aeff-4322-bfd1-715445d5d1cb-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.447390 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q" event={"ID":"7038f01c-aeff-4322-bfd1-715445d5d1cb","Type":"ContainerDied","Data":"79832105df4a0f6a5f8cdc3d4fdc3cf7e22c5fdd85ef64a73af61e8c0d0b3dd7"} Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.447438 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="79832105df4a0f6a5f8cdc3d4fdc3cf7e22c5fdd85ef64a73af61e8c0d0b3dd7" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.447461 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.544805 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g"] Mar 20 14:12:46 crc kubenswrapper[4973]: E0320 14:12:46.545642 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd7505e-95cc-46e7-bff1-53138b47efca" containerName="oc" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.545669 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd7505e-95cc-46e7-bff1-53138b47efca" containerName="oc" Mar 20 14:12:46 crc kubenswrapper[4973]: E0320 14:12:46.545714 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7038f01c-aeff-4322-bfd1-715445d5d1cb" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.545723 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="7038f01c-aeff-4322-bfd1-715445d5d1cb" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.545957 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="7038f01c-aeff-4322-bfd1-715445d5d1cb" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.545987 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd7505e-95cc-46e7-bff1-53138b47efca" containerName="oc" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.546934 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.549683 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-ipmi-config-data" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.549893 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.550156 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d4wjc" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.550240 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.550156 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.579562 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g"] Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.636828 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq8q4\" (UniqueName: \"kubernetes.io/projected/5972691e-cb05-4ded-b36a-b045d3b4726f-kube-api-access-zq8q4\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.637028 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ssh-key-openstack-edpm-ipam\") pod 
\"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.637445 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.637529 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.637746 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.637796 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: 
\"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.637902 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.739803 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.739919 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.739957 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.740039 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.740092 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.740242 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.740293 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq8q4\" (UniqueName: 
\"kubernetes.io/projected/5972691e-cb05-4ded-b36a-b045d3b4726f-kube-api-access-zq8q4\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.745933 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ceilometer-ipmi-config-data-1\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.746103 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-telemetry-power-monitoring-combined-ca-bundle\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.746754 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ssh-key-openstack-edpm-ipam\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.747412 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ceilometer-ipmi-config-data-0\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.747594 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-inventory\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.747777 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ceilometer-ipmi-config-data-2\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.761487 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq8q4\" (UniqueName: \"kubernetes.io/projected/5972691e-cb05-4ded-b36a-b045d3b4726f-kube-api-access-zq8q4\") pod \"telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:46 crc kubenswrapper[4973]: I0320 14:12:46.872458 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:12:47 crc kubenswrapper[4973]: I0320 14:12:47.432396 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g"] Mar 20 14:12:47 crc kubenswrapper[4973]: I0320 14:12:47.439067 4973 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:12:47 crc kubenswrapper[4973]: I0320 14:12:47.459627 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" event={"ID":"5972691e-cb05-4ded-b36a-b045d3b4726f","Type":"ContainerStarted","Data":"c3eb05b5c65cc510164e20861b3930f22d9e8197191ef1a55c3a7f19eba68dbc"} Mar 20 14:12:47 crc kubenswrapper[4973]: I0320 14:12:47.952361 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:12:47 crc kubenswrapper[4973]: E0320 14:12:47.953107 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:12:48 crc kubenswrapper[4973]: I0320 14:12:48.470905 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" event={"ID":"5972691e-cb05-4ded-b36a-b045d3b4726f","Type":"ContainerStarted","Data":"3447971b1fd91036781284fb009692d02da57ce1cdc0098beb619c2f0c0fffaf"} Mar 20 14:12:48 crc kubenswrapper[4973]: I0320 14:12:48.489228 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" podStartSLOduration=1.9791441349999999 podStartE2EDuration="2.4892075s" podCreationTimestamp="2026-03-20 14:12:46 +0000 UTC" firstStartedPulling="2026-03-20 14:12:47.438823446 +0000 UTC m=+3088.182493190" lastFinishedPulling="2026-03-20 14:12:47.948886801 +0000 UTC m=+3088.692556555" observedRunningTime="2026-03-20 14:12:48.486540548 +0000 UTC m=+3089.230210282" watchObservedRunningTime="2026-03-20 14:12:48.4892075 +0000 UTC m=+3089.232877254" Mar 20 14:13:00 crc kubenswrapper[4973]: I0320 14:13:00.953047 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:13:00 crc kubenswrapper[4973]: E0320 14:13:00.953975 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:13:02 crc kubenswrapper[4973]: I0320 14:13:02.073710 4973 scope.go:117] "RemoveContainer" containerID="9ccefe0a323465cd3ccc7c71536d48863bf442ce88722931b58b6f715192ce18" Mar 20 14:13:14 crc kubenswrapper[4973]: I0320 14:13:14.951383 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:13:14 crc kubenswrapper[4973]: E0320 14:13:14.952270 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:13:28 crc kubenswrapper[4973]: I0320 14:13:28.951190 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:13:28 crc kubenswrapper[4973]: E0320 14:13:28.952244 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:13:40 crc kubenswrapper[4973]: I0320 14:13:40.951552 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:13:40 crc kubenswrapper[4973]: E0320 14:13:40.952464 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:13:52 crc kubenswrapper[4973]: I0320 14:13:52.950686 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:13:52 crc kubenswrapper[4973]: E0320 14:13:52.951531 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:14:00 crc kubenswrapper[4973]: I0320 14:14:00.147061 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566934-r5dpc"] Mar 20 14:14:00 crc kubenswrapper[4973]: I0320 14:14:00.149461 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566934-r5dpc" Mar 20 14:14:00 crc kubenswrapper[4973]: I0320 14:14:00.152285 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:14:00 crc kubenswrapper[4973]: I0320 14:14:00.152667 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:14:00 crc kubenswrapper[4973]: I0320 14:14:00.152922 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:14:00 crc kubenswrapper[4973]: I0320 14:14:00.161358 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566934-r5dpc"] Mar 20 14:14:00 crc kubenswrapper[4973]: I0320 14:14:00.286199 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf2dv\" (UniqueName: \"kubernetes.io/projected/4d7e3baf-7119-49c5-80c1-4ff4d213d01d-kube-api-access-gf2dv\") pod \"auto-csr-approver-29566934-r5dpc\" (UID: \"4d7e3baf-7119-49c5-80c1-4ff4d213d01d\") " pod="openshift-infra/auto-csr-approver-29566934-r5dpc" Mar 20 14:14:00 crc kubenswrapper[4973]: I0320 14:14:00.388450 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf2dv\" (UniqueName: \"kubernetes.io/projected/4d7e3baf-7119-49c5-80c1-4ff4d213d01d-kube-api-access-gf2dv\") 
pod \"auto-csr-approver-29566934-r5dpc\" (UID: \"4d7e3baf-7119-49c5-80c1-4ff4d213d01d\") " pod="openshift-infra/auto-csr-approver-29566934-r5dpc" Mar 20 14:14:00 crc kubenswrapper[4973]: I0320 14:14:00.406884 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf2dv\" (UniqueName: \"kubernetes.io/projected/4d7e3baf-7119-49c5-80c1-4ff4d213d01d-kube-api-access-gf2dv\") pod \"auto-csr-approver-29566934-r5dpc\" (UID: \"4d7e3baf-7119-49c5-80c1-4ff4d213d01d\") " pod="openshift-infra/auto-csr-approver-29566934-r5dpc" Mar 20 14:14:00 crc kubenswrapper[4973]: I0320 14:14:00.482767 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566934-r5dpc" Mar 20 14:14:00 crc kubenswrapper[4973]: I0320 14:14:00.991491 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566934-r5dpc"] Mar 20 14:14:01 crc kubenswrapper[4973]: I0320 14:14:01.239277 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566934-r5dpc" event={"ID":"4d7e3baf-7119-49c5-80c1-4ff4d213d01d","Type":"ContainerStarted","Data":"f2c585b1fcb259b4afc2134a4e28fe3130f9fd15aa2081bb772c7ebbef274091"} Mar 20 14:14:03 crc kubenswrapper[4973]: I0320 14:14:03.263603 4973 generic.go:334] "Generic (PLEG): container finished" podID="4d7e3baf-7119-49c5-80c1-4ff4d213d01d" containerID="fa40686177aba5b69526a727c7166004903b53e33471e9d367d892ed3b318c73" exitCode=0 Mar 20 14:14:03 crc kubenswrapper[4973]: I0320 14:14:03.263692 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566934-r5dpc" event={"ID":"4d7e3baf-7119-49c5-80c1-4ff4d213d01d","Type":"ContainerDied","Data":"fa40686177aba5b69526a727c7166004903b53e33471e9d367d892ed3b318c73"} Mar 20 14:14:04 crc kubenswrapper[4973]: I0320 14:14:04.664361 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566934-r5dpc" Mar 20 14:14:04 crc kubenswrapper[4973]: I0320 14:14:04.812969 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf2dv\" (UniqueName: \"kubernetes.io/projected/4d7e3baf-7119-49c5-80c1-4ff4d213d01d-kube-api-access-gf2dv\") pod \"4d7e3baf-7119-49c5-80c1-4ff4d213d01d\" (UID: \"4d7e3baf-7119-49c5-80c1-4ff4d213d01d\") " Mar 20 14:14:04 crc kubenswrapper[4973]: I0320 14:14:04.820105 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d7e3baf-7119-49c5-80c1-4ff4d213d01d-kube-api-access-gf2dv" (OuterVolumeSpecName: "kube-api-access-gf2dv") pod "4d7e3baf-7119-49c5-80c1-4ff4d213d01d" (UID: "4d7e3baf-7119-49c5-80c1-4ff4d213d01d"). InnerVolumeSpecName "kube-api-access-gf2dv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:14:04 crc kubenswrapper[4973]: I0320 14:14:04.916706 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf2dv\" (UniqueName: \"kubernetes.io/projected/4d7e3baf-7119-49c5-80c1-4ff4d213d01d-kube-api-access-gf2dv\") on node \"crc\" DevicePath \"\"" Mar 20 14:14:05 crc kubenswrapper[4973]: I0320 14:14:05.286847 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566934-r5dpc" event={"ID":"4d7e3baf-7119-49c5-80c1-4ff4d213d01d","Type":"ContainerDied","Data":"f2c585b1fcb259b4afc2134a4e28fe3130f9fd15aa2081bb772c7ebbef274091"} Mar 20 14:14:05 crc kubenswrapper[4973]: I0320 14:14:05.286885 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2c585b1fcb259b4afc2134a4e28fe3130f9fd15aa2081bb772c7ebbef274091" Mar 20 14:14:05 crc kubenswrapper[4973]: I0320 14:14:05.286921 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566934-r5dpc" Mar 20 14:14:05 crc kubenswrapper[4973]: I0320 14:14:05.736918 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566928-98vqm"] Mar 20 14:14:05 crc kubenswrapper[4973]: I0320 14:14:05.749219 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566928-98vqm"] Mar 20 14:14:05 crc kubenswrapper[4973]: I0320 14:14:05.964867 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53b986db-231a-4811-b651-7f7c242f0b44" path="/var/lib/kubelet/pods/53b986db-231a-4811-b651-7f7c242f0b44/volumes" Mar 20 14:14:06 crc kubenswrapper[4973]: I0320 14:14:06.951714 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:14:06 crc kubenswrapper[4973]: E0320 14:14:06.952401 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:14:21 crc kubenswrapper[4973]: I0320 14:14:21.950811 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:14:21 crc kubenswrapper[4973]: E0320 14:14:21.951994 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" 
podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:14:32 crc kubenswrapper[4973]: I0320 14:14:32.952780 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:14:32 crc kubenswrapper[4973]: E0320 14:14:32.953500 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:14:41 crc kubenswrapper[4973]: I0320 14:14:41.680838 4973 generic.go:334] "Generic (PLEG): container finished" podID="5972691e-cb05-4ded-b36a-b045d3b4726f" containerID="3447971b1fd91036781284fb009692d02da57ce1cdc0098beb619c2f0c0fffaf" exitCode=0 Mar 20 14:14:41 crc kubenswrapper[4973]: I0320 14:14:41.680925 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" event={"ID":"5972691e-cb05-4ded-b36a-b045d3b4726f","Type":"ContainerDied","Data":"3447971b1fd91036781284fb009692d02da57ce1cdc0098beb619c2f0c0fffaf"} Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.191373 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.269504 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ceilometer-ipmi-config-data-1\") pod \"5972691e-cb05-4ded-b36a-b045d3b4726f\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.269591 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ssh-key-openstack-edpm-ipam\") pod \"5972691e-cb05-4ded-b36a-b045d3b4726f\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.269649 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ceilometer-ipmi-config-data-0\") pod \"5972691e-cb05-4ded-b36a-b045d3b4726f\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.269740 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ceilometer-ipmi-config-data-2\") pod \"5972691e-cb05-4ded-b36a-b045d3b4726f\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.269796 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-telemetry-power-monitoring-combined-ca-bundle\") pod \"5972691e-cb05-4ded-b36a-b045d3b4726f\" (UID: 
\"5972691e-cb05-4ded-b36a-b045d3b4726f\") " Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.270082 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq8q4\" (UniqueName: \"kubernetes.io/projected/5972691e-cb05-4ded-b36a-b045d3b4726f-kube-api-access-zq8q4\") pod \"5972691e-cb05-4ded-b36a-b045d3b4726f\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.270146 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-inventory\") pod \"5972691e-cb05-4ded-b36a-b045d3b4726f\" (UID: \"5972691e-cb05-4ded-b36a-b045d3b4726f\") " Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.275256 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-telemetry-power-monitoring-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-power-monitoring-combined-ca-bundle") pod "5972691e-cb05-4ded-b36a-b045d3b4726f" (UID: "5972691e-cb05-4ded-b36a-b045d3b4726f"). InnerVolumeSpecName "telemetry-power-monitoring-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.275286 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5972691e-cb05-4ded-b36a-b045d3b4726f-kube-api-access-zq8q4" (OuterVolumeSpecName: "kube-api-access-zq8q4") pod "5972691e-cb05-4ded-b36a-b045d3b4726f" (UID: "5972691e-cb05-4ded-b36a-b045d3b4726f"). InnerVolumeSpecName "kube-api-access-zq8q4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.303016 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ceilometer-ipmi-config-data-0" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-0") pod "5972691e-cb05-4ded-b36a-b045d3b4726f" (UID: "5972691e-cb05-4ded-b36a-b045d3b4726f"). InnerVolumeSpecName "ceilometer-ipmi-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.304157 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5972691e-cb05-4ded-b36a-b045d3b4726f" (UID: "5972691e-cb05-4ded-b36a-b045d3b4726f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.306532 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-inventory" (OuterVolumeSpecName: "inventory") pod "5972691e-cb05-4ded-b36a-b045d3b4726f" (UID: "5972691e-cb05-4ded-b36a-b045d3b4726f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.313790 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ceilometer-ipmi-config-data-1" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-1") pod "5972691e-cb05-4ded-b36a-b045d3b4726f" (UID: "5972691e-cb05-4ded-b36a-b045d3b4726f"). InnerVolumeSpecName "ceilometer-ipmi-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.314295 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ceilometer-ipmi-config-data-2" (OuterVolumeSpecName: "ceilometer-ipmi-config-data-2") pod "5972691e-cb05-4ded-b36a-b045d3b4726f" (UID: "5972691e-cb05-4ded-b36a-b045d3b4726f"). InnerVolumeSpecName "ceilometer-ipmi-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.372841 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq8q4\" (UniqueName: \"kubernetes.io/projected/5972691e-cb05-4ded-b36a-b045d3b4726f-kube-api-access-zq8q4\") on node \"crc\" DevicePath \"\"" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.372874 4973 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.372885 4973 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-1\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ceilometer-ipmi-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.372895 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.372904 4973 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-0\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ceilometer-ipmi-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 
14:14:43.372912 4973 reconciler_common.go:293] "Volume detached for volume \"ceilometer-ipmi-config-data-2\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-ceilometer-ipmi-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.372922 4973 reconciler_common.go:293] "Volume detached for volume \"telemetry-power-monitoring-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5972691e-cb05-4ded-b36a-b045d3b4726f-telemetry-power-monitoring-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.711947 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" event={"ID":"5972691e-cb05-4ded-b36a-b045d3b4726f","Type":"ContainerDied","Data":"c3eb05b5c65cc510164e20861b3930f22d9e8197191ef1a55c3a7f19eba68dbc"} Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.711998 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3eb05b5c65cc510164e20861b3930f22d9e8197191ef1a55c3a7f19eba68dbc" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.712016 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.793838 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8"] Mar 20 14:14:43 crc kubenswrapper[4973]: E0320 14:14:43.794834 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d7e3baf-7119-49c5-80c1-4ff4d213d01d" containerName="oc" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.794859 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d7e3baf-7119-49c5-80c1-4ff4d213d01d" containerName="oc" Mar 20 14:14:43 crc kubenswrapper[4973]: E0320 14:14:43.794915 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5972691e-cb05-4ded-b36a-b045d3b4726f" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.794928 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="5972691e-cb05-4ded-b36a-b045d3b4726f" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.795277 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="5972691e-cb05-4ded-b36a-b045d3b4726f" containerName="telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.795307 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d7e3baf-7119-49c5-80c1-4ff4d213d01d" containerName="oc" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.797503 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.799658 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"logging-compute-config-data" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.802316 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.802780 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-d4wjc" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.802955 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.803039 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.807729 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8"] Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.884698 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-ln9l8\" (UID: \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.884838 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-ln9l8\" (UID: 
\"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.884877 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m7n8\" (UniqueName: \"kubernetes.io/projected/084e7647-9b86-48a2-a4f8-56cb7a8e3457-kube-api-access-4m7n8\") pod \"logging-edpm-deployment-openstack-edpm-ipam-ln9l8\" (UID: \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.884903 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-ln9l8\" (UID: \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.884923 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-ln9l8\" (UID: \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.987228 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-ln9l8\" (UID: \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" Mar 20 14:14:43 
crc kubenswrapper[4973]: I0320 14:14:43.987289 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m7n8\" (UniqueName: \"kubernetes.io/projected/084e7647-9b86-48a2-a4f8-56cb7a8e3457-kube-api-access-4m7n8\") pod \"logging-edpm-deployment-openstack-edpm-ipam-ln9l8\" (UID: \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.987319 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-ln9l8\" (UID: \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.987445 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-ln9l8\" (UID: \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.987608 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-ln9l8\" (UID: \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.992235 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-ssh-key-openstack-edpm-ipam\") pod \"logging-edpm-deployment-openstack-edpm-ipam-ln9l8\" (UID: \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.992615 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-logging-compute-config-data-1\") pod \"logging-edpm-deployment-openstack-edpm-ipam-ln9l8\" (UID: \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.994709 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-logging-compute-config-data-0\") pod \"logging-edpm-deployment-openstack-edpm-ipam-ln9l8\" (UID: \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" Mar 20 14:14:43 crc kubenswrapper[4973]: I0320 14:14:43.994839 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-inventory\") pod \"logging-edpm-deployment-openstack-edpm-ipam-ln9l8\" (UID: \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" Mar 20 14:14:44 crc kubenswrapper[4973]: I0320 14:14:44.008709 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m7n8\" (UniqueName: \"kubernetes.io/projected/084e7647-9b86-48a2-a4f8-56cb7a8e3457-kube-api-access-4m7n8\") pod \"logging-edpm-deployment-openstack-edpm-ipam-ln9l8\" (UID: \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " 
pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" Mar 20 14:14:44 crc kubenswrapper[4973]: I0320 14:14:44.134051 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" Mar 20 14:14:44 crc kubenswrapper[4973]: I0320 14:14:44.832144 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8"] Mar 20 14:14:45 crc kubenswrapper[4973]: I0320 14:14:45.753158 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" event={"ID":"084e7647-9b86-48a2-a4f8-56cb7a8e3457","Type":"ContainerStarted","Data":"69a2631a23a2cd8cf2b48cec929f42dacd6813383716c9bcf79fa4a0547a5f83"} Mar 20 14:14:45 crc kubenswrapper[4973]: I0320 14:14:45.753219 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" event={"ID":"084e7647-9b86-48a2-a4f8-56cb7a8e3457","Type":"ContainerStarted","Data":"972ff2e056e3f2bde7328b02857601b5bf8ac26d31db45b6dba38dbfb5ffd5ec"} Mar 20 14:14:45 crc kubenswrapper[4973]: I0320 14:14:45.778646 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" podStartSLOduration=2.322276482 podStartE2EDuration="2.7786246s" podCreationTimestamp="2026-03-20 14:14:43 +0000 UTC" firstStartedPulling="2026-03-20 14:14:44.839870892 +0000 UTC m=+3205.583540636" lastFinishedPulling="2026-03-20 14:14:45.296219 +0000 UTC m=+3206.039888754" observedRunningTime="2026-03-20 14:14:45.77532205 +0000 UTC m=+3206.518991814" watchObservedRunningTime="2026-03-20 14:14:45.7786246 +0000 UTC m=+3206.522294344" Mar 20 14:14:45 crc kubenswrapper[4973]: I0320 14:14:45.956738 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:14:45 crc kubenswrapper[4973]: E0320 
14:14:45.957491 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:14:56 crc kubenswrapper[4973]: I0320 14:14:56.951097 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:14:56 crc kubenswrapper[4973]: E0320 14:14:56.951913 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:15:00 crc kubenswrapper[4973]: I0320 14:15:00.134195 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv"] Mar 20 14:15:00 crc kubenswrapper[4973]: I0320 14:15:00.136199 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv" Mar 20 14:15:00 crc kubenswrapper[4973]: I0320 14:15:00.139946 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 14:15:00 crc kubenswrapper[4973]: I0320 14:15:00.140158 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 14:15:00 crc kubenswrapper[4973]: I0320 14:15:00.144306 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv"] Mar 20 14:15:00 crc kubenswrapper[4973]: I0320 14:15:00.253142 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b2aa00d-07d2-41d5-9fc7-9770039b5663-secret-volume\") pod \"collect-profiles-29566935-6xxvv\" (UID: \"2b2aa00d-07d2-41d5-9fc7-9770039b5663\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv" Mar 20 14:15:00 crc kubenswrapper[4973]: I0320 14:15:00.253554 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68pkp\" (UniqueName: \"kubernetes.io/projected/2b2aa00d-07d2-41d5-9fc7-9770039b5663-kube-api-access-68pkp\") pod \"collect-profiles-29566935-6xxvv\" (UID: \"2b2aa00d-07d2-41d5-9fc7-9770039b5663\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv" Mar 20 14:15:00 crc kubenswrapper[4973]: I0320 14:15:00.253591 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b2aa00d-07d2-41d5-9fc7-9770039b5663-config-volume\") pod \"collect-profiles-29566935-6xxvv\" (UID: \"2b2aa00d-07d2-41d5-9fc7-9770039b5663\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv" Mar 20 14:15:00 crc kubenswrapper[4973]: I0320 14:15:00.356835 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b2aa00d-07d2-41d5-9fc7-9770039b5663-secret-volume\") pod \"collect-profiles-29566935-6xxvv\" (UID: \"2b2aa00d-07d2-41d5-9fc7-9770039b5663\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv" Mar 20 14:15:00 crc kubenswrapper[4973]: I0320 14:15:00.356976 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68pkp\" (UniqueName: \"kubernetes.io/projected/2b2aa00d-07d2-41d5-9fc7-9770039b5663-kube-api-access-68pkp\") pod \"collect-profiles-29566935-6xxvv\" (UID: \"2b2aa00d-07d2-41d5-9fc7-9770039b5663\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv" Mar 20 14:15:00 crc kubenswrapper[4973]: I0320 14:15:00.357020 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b2aa00d-07d2-41d5-9fc7-9770039b5663-config-volume\") pod \"collect-profiles-29566935-6xxvv\" (UID: \"2b2aa00d-07d2-41d5-9fc7-9770039b5663\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv" Mar 20 14:15:00 crc kubenswrapper[4973]: I0320 14:15:00.357878 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b2aa00d-07d2-41d5-9fc7-9770039b5663-config-volume\") pod \"collect-profiles-29566935-6xxvv\" (UID: \"2b2aa00d-07d2-41d5-9fc7-9770039b5663\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv" Mar 20 14:15:00 crc kubenswrapper[4973]: I0320 14:15:00.363385 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/2b2aa00d-07d2-41d5-9fc7-9770039b5663-secret-volume\") pod \"collect-profiles-29566935-6xxvv\" (UID: \"2b2aa00d-07d2-41d5-9fc7-9770039b5663\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv" Mar 20 14:15:00 crc kubenswrapper[4973]: I0320 14:15:00.371420 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68pkp\" (UniqueName: \"kubernetes.io/projected/2b2aa00d-07d2-41d5-9fc7-9770039b5663-kube-api-access-68pkp\") pod \"collect-profiles-29566935-6xxvv\" (UID: \"2b2aa00d-07d2-41d5-9fc7-9770039b5663\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv" Mar 20 14:15:00 crc kubenswrapper[4973]: I0320 14:15:00.470600 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv" Mar 20 14:15:00 crc kubenswrapper[4973]: I0320 14:15:00.915374 4973 generic.go:334] "Generic (PLEG): container finished" podID="084e7647-9b86-48a2-a4f8-56cb7a8e3457" containerID="69a2631a23a2cd8cf2b48cec929f42dacd6813383716c9bcf79fa4a0547a5f83" exitCode=0 Mar 20 14:15:00 crc kubenswrapper[4973]: I0320 14:15:00.915458 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" event={"ID":"084e7647-9b86-48a2-a4f8-56cb7a8e3457","Type":"ContainerDied","Data":"69a2631a23a2cd8cf2b48cec929f42dacd6813383716c9bcf79fa4a0547a5f83"} Mar 20 14:15:00 crc kubenswrapper[4973]: I0320 14:15:00.918560 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv"] Mar 20 14:15:01 crc kubenswrapper[4973]: I0320 14:15:01.980965 4973 generic.go:334] "Generic (PLEG): container finished" podID="2b2aa00d-07d2-41d5-9fc7-9770039b5663" containerID="6bd82e091e588f69ebed24d4151858259ca48c2563e430ccd30539816b335bdf" exitCode=0 Mar 20 14:15:02 crc kubenswrapper[4973]: I0320 14:15:02.004568 4973 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv" event={"ID":"2b2aa00d-07d2-41d5-9fc7-9770039b5663","Type":"ContainerDied","Data":"6bd82e091e588f69ebed24d4151858259ca48c2563e430ccd30539816b335bdf"} Mar 20 14:15:02 crc kubenswrapper[4973]: I0320 14:15:02.004614 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv" event={"ID":"2b2aa00d-07d2-41d5-9fc7-9770039b5663","Type":"ContainerStarted","Data":"d3b21fb6c5cd92b594e1ec0ad0980c64900c0d8de3dce4b7b393b323e1d26701"} Mar 20 14:15:02 crc kubenswrapper[4973]: I0320 14:15:02.214604 4973 scope.go:117] "RemoveContainer" containerID="5e5c94cebc85c407bb033bcb272b3d87e43ce6f2747036ba81692afbb5fc42cb" Mar 20 14:15:02 crc kubenswrapper[4973]: I0320 14:15:02.452696 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" Mar 20 14:15:02 crc kubenswrapper[4973]: I0320 14:15:02.625623 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-ssh-key-openstack-edpm-ipam\") pod \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\" (UID: \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " Mar 20 14:15:02 crc kubenswrapper[4973]: I0320 14:15:02.625761 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-logging-compute-config-data-1\") pod \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\" (UID: \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " Mar 20 14:15:02 crc kubenswrapper[4973]: I0320 14:15:02.625852 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logging-compute-config-data-0\" (UniqueName: 
\"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-logging-compute-config-data-0\") pod \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\" (UID: \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " Mar 20 14:15:02 crc kubenswrapper[4973]: I0320 14:15:02.625892 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-inventory\") pod \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\" (UID: \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " Mar 20 14:15:02 crc kubenswrapper[4973]: I0320 14:15:02.626067 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m7n8\" (UniqueName: \"kubernetes.io/projected/084e7647-9b86-48a2-a4f8-56cb7a8e3457-kube-api-access-4m7n8\") pod \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\" (UID: \"084e7647-9b86-48a2-a4f8-56cb7a8e3457\") " Mar 20 14:15:02 crc kubenswrapper[4973]: I0320 14:15:02.639747 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/084e7647-9b86-48a2-a4f8-56cb7a8e3457-kube-api-access-4m7n8" (OuterVolumeSpecName: "kube-api-access-4m7n8") pod "084e7647-9b86-48a2-a4f8-56cb7a8e3457" (UID: "084e7647-9b86-48a2-a4f8-56cb7a8e3457"). InnerVolumeSpecName "kube-api-access-4m7n8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:15:02 crc kubenswrapper[4973]: I0320 14:15:02.662313 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "084e7647-9b86-48a2-a4f8-56cb7a8e3457" (UID: "084e7647-9b86-48a2-a4f8-56cb7a8e3457"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:15:02 crc kubenswrapper[4973]: I0320 14:15:02.663562 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-inventory" (OuterVolumeSpecName: "inventory") pod "084e7647-9b86-48a2-a4f8-56cb7a8e3457" (UID: "084e7647-9b86-48a2-a4f8-56cb7a8e3457"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:15:02 crc kubenswrapper[4973]: I0320 14:15:02.667786 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-logging-compute-config-data-0" (OuterVolumeSpecName: "logging-compute-config-data-0") pod "084e7647-9b86-48a2-a4f8-56cb7a8e3457" (UID: "084e7647-9b86-48a2-a4f8-56cb7a8e3457"). InnerVolumeSpecName "logging-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:15:02 crc kubenswrapper[4973]: I0320 14:15:02.674924 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-logging-compute-config-data-1" (OuterVolumeSpecName: "logging-compute-config-data-1") pod "084e7647-9b86-48a2-a4f8-56cb7a8e3457" (UID: "084e7647-9b86-48a2-a4f8-56cb7a8e3457"). InnerVolumeSpecName "logging-compute-config-data-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:15:02 crc kubenswrapper[4973]: I0320 14:15:02.730665 4973 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-logging-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 20 14:15:02 crc kubenswrapper[4973]: I0320 14:15:02.730705 4973 reconciler_common.go:293] "Volume detached for volume \"logging-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-logging-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 20 14:15:02 crc kubenswrapper[4973]: I0320 14:15:02.730716 4973 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-inventory\") on node \"crc\" DevicePath \"\"" Mar 20 14:15:02 crc kubenswrapper[4973]: I0320 14:15:02.730733 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m7n8\" (UniqueName: \"kubernetes.io/projected/084e7647-9b86-48a2-a4f8-56cb7a8e3457-kube-api-access-4m7n8\") on node \"crc\" DevicePath \"\"" Mar 20 14:15:02 crc kubenswrapper[4973]: I0320 14:15:02.730748 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/084e7647-9b86-48a2-a4f8-56cb7a8e3457-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 20 14:15:03 crc kubenswrapper[4973]: I0320 14:15:02.998646 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" Mar 20 14:15:03 crc kubenswrapper[4973]: I0320 14:15:02.998702 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/logging-edpm-deployment-openstack-edpm-ipam-ln9l8" event={"ID":"084e7647-9b86-48a2-a4f8-56cb7a8e3457","Type":"ContainerDied","Data":"972ff2e056e3f2bde7328b02857601b5bf8ac26d31db45b6dba38dbfb5ffd5ec"} Mar 20 14:15:03 crc kubenswrapper[4973]: I0320 14:15:02.998746 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="972ff2e056e3f2bde7328b02857601b5bf8ac26d31db45b6dba38dbfb5ffd5ec" Mar 20 14:15:03 crc kubenswrapper[4973]: I0320 14:15:03.405278 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv" Mar 20 14:15:03 crc kubenswrapper[4973]: I0320 14:15:03.557580 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68pkp\" (UniqueName: \"kubernetes.io/projected/2b2aa00d-07d2-41d5-9fc7-9770039b5663-kube-api-access-68pkp\") pod \"2b2aa00d-07d2-41d5-9fc7-9770039b5663\" (UID: \"2b2aa00d-07d2-41d5-9fc7-9770039b5663\") " Mar 20 14:15:03 crc kubenswrapper[4973]: I0320 14:15:03.557714 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b2aa00d-07d2-41d5-9fc7-9770039b5663-secret-volume\") pod \"2b2aa00d-07d2-41d5-9fc7-9770039b5663\" (UID: \"2b2aa00d-07d2-41d5-9fc7-9770039b5663\") " Mar 20 14:15:03 crc kubenswrapper[4973]: I0320 14:15:03.557815 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b2aa00d-07d2-41d5-9fc7-9770039b5663-config-volume\") pod \"2b2aa00d-07d2-41d5-9fc7-9770039b5663\" (UID: \"2b2aa00d-07d2-41d5-9fc7-9770039b5663\") " Mar 20 14:15:03 crc kubenswrapper[4973]: I0320 14:15:03.558464 4973 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b2aa00d-07d2-41d5-9fc7-9770039b5663-config-volume" (OuterVolumeSpecName: "config-volume") pod "2b2aa00d-07d2-41d5-9fc7-9770039b5663" (UID: "2b2aa00d-07d2-41d5-9fc7-9770039b5663"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:15:03 crc kubenswrapper[4973]: I0320 14:15:03.559359 4973 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2b2aa00d-07d2-41d5-9fc7-9770039b5663-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:15:03 crc kubenswrapper[4973]: I0320 14:15:03.563675 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b2aa00d-07d2-41d5-9fc7-9770039b5663-kube-api-access-68pkp" (OuterVolumeSpecName: "kube-api-access-68pkp") pod "2b2aa00d-07d2-41d5-9fc7-9770039b5663" (UID: "2b2aa00d-07d2-41d5-9fc7-9770039b5663"). InnerVolumeSpecName "kube-api-access-68pkp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:15:03 crc kubenswrapper[4973]: I0320 14:15:03.575308 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b2aa00d-07d2-41d5-9fc7-9770039b5663-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "2b2aa00d-07d2-41d5-9fc7-9770039b5663" (UID: "2b2aa00d-07d2-41d5-9fc7-9770039b5663"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:15:03 crc kubenswrapper[4973]: I0320 14:15:03.662400 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68pkp\" (UniqueName: \"kubernetes.io/projected/2b2aa00d-07d2-41d5-9fc7-9770039b5663-kube-api-access-68pkp\") on node \"crc\" DevicePath \"\"" Mar 20 14:15:03 crc kubenswrapper[4973]: I0320 14:15:03.662443 4973 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/2b2aa00d-07d2-41d5-9fc7-9770039b5663-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:15:04 crc kubenswrapper[4973]: I0320 14:15:04.013379 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv" event={"ID":"2b2aa00d-07d2-41d5-9fc7-9770039b5663","Type":"ContainerDied","Data":"d3b21fb6c5cd92b594e1ec0ad0980c64900c0d8de3dce4b7b393b323e1d26701"} Mar 20 14:15:04 crc kubenswrapper[4973]: I0320 14:15:04.013659 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3b21fb6c5cd92b594e1ec0ad0980c64900c0d8de3dce4b7b393b323e1d26701" Mar 20 14:15:04 crc kubenswrapper[4973]: I0320 14:15:04.013485 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv" Mar 20 14:15:04 crc kubenswrapper[4973]: I0320 14:15:04.557666 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94"] Mar 20 14:15:04 crc kubenswrapper[4973]: I0320 14:15:04.588890 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566890-8gj94"] Mar 20 14:15:05 crc kubenswrapper[4973]: I0320 14:15:05.963286 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12c40f14-cc6c-48dd-b2c0-c026cf5cd708" path="/var/lib/kubelet/pods/12c40f14-cc6c-48dd-b2c0-c026cf5cd708/volumes" Mar 20 14:15:09 crc kubenswrapper[4973]: I0320 14:15:09.958720 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:15:09 crc kubenswrapper[4973]: E0320 14:15:09.959540 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:15:24 crc kubenswrapper[4973]: I0320 14:15:24.950767 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:15:24 crc kubenswrapper[4973]: E0320 14:15:24.951632 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:15:37 crc kubenswrapper[4973]: I0320 14:15:37.950656 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:15:37 crc kubenswrapper[4973]: E0320 14:15:37.951536 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:15:48 crc kubenswrapper[4973]: I0320 14:15:48.951450 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:15:48 crc kubenswrapper[4973]: E0320 14:15:48.952372 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:16:00 crc kubenswrapper[4973]: I0320 14:16:00.180185 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566936-x4v4f"] Mar 20 14:16:00 crc kubenswrapper[4973]: E0320 14:16:00.181495 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="084e7647-9b86-48a2-a4f8-56cb7a8e3457" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 20 14:16:00 crc kubenswrapper[4973]: I0320 14:16:00.181515 4973 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="084e7647-9b86-48a2-a4f8-56cb7a8e3457" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 20 14:16:00 crc kubenswrapper[4973]: E0320 14:16:00.181575 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b2aa00d-07d2-41d5-9fc7-9770039b5663" containerName="collect-profiles" Mar 20 14:16:00 crc kubenswrapper[4973]: I0320 14:16:00.181587 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b2aa00d-07d2-41d5-9fc7-9770039b5663" containerName="collect-profiles" Mar 20 14:16:00 crc kubenswrapper[4973]: I0320 14:16:00.181940 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="084e7647-9b86-48a2-a4f8-56cb7a8e3457" containerName="logging-edpm-deployment-openstack-edpm-ipam" Mar 20 14:16:00 crc kubenswrapper[4973]: I0320 14:16:00.181962 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b2aa00d-07d2-41d5-9fc7-9770039b5663" containerName="collect-profiles" Mar 20 14:16:00 crc kubenswrapper[4973]: I0320 14:16:00.183063 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566936-x4v4f" Mar 20 14:16:00 crc kubenswrapper[4973]: I0320 14:16:00.185453 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:16:00 crc kubenswrapper[4973]: I0320 14:16:00.189871 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:16:00 crc kubenswrapper[4973]: I0320 14:16:00.189968 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:16:00 crc kubenswrapper[4973]: I0320 14:16:00.190972 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566936-x4v4f"] Mar 20 14:16:00 crc kubenswrapper[4973]: I0320 14:16:00.261049 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkn4t\" (UniqueName: \"kubernetes.io/projected/4c0a71f0-a19b-4e23-8965-32ce944c0bf9-kube-api-access-xkn4t\") pod \"auto-csr-approver-29566936-x4v4f\" (UID: \"4c0a71f0-a19b-4e23-8965-32ce944c0bf9\") " pod="openshift-infra/auto-csr-approver-29566936-x4v4f" Mar 20 14:16:00 crc kubenswrapper[4973]: I0320 14:16:00.362581 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkn4t\" (UniqueName: \"kubernetes.io/projected/4c0a71f0-a19b-4e23-8965-32ce944c0bf9-kube-api-access-xkn4t\") pod \"auto-csr-approver-29566936-x4v4f\" (UID: \"4c0a71f0-a19b-4e23-8965-32ce944c0bf9\") " pod="openshift-infra/auto-csr-approver-29566936-x4v4f" Mar 20 14:16:00 crc kubenswrapper[4973]: I0320 14:16:00.387210 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkn4t\" (UniqueName: \"kubernetes.io/projected/4c0a71f0-a19b-4e23-8965-32ce944c0bf9-kube-api-access-xkn4t\") pod \"auto-csr-approver-29566936-x4v4f\" (UID: \"4c0a71f0-a19b-4e23-8965-32ce944c0bf9\") " 
pod="openshift-infra/auto-csr-approver-29566936-x4v4f" Mar 20 14:16:00 crc kubenswrapper[4973]: I0320 14:16:00.510964 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566936-x4v4f" Mar 20 14:16:01 crc kubenswrapper[4973]: I0320 14:16:01.056252 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566936-x4v4f"] Mar 20 14:16:01 crc kubenswrapper[4973]: I0320 14:16:01.627457 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566936-x4v4f" event={"ID":"4c0a71f0-a19b-4e23-8965-32ce944c0bf9","Type":"ContainerStarted","Data":"a1c42164cb9a1003dccb81975ae5cfd4a366c58f9dae63b0d533be85fc82b7ae"} Mar 20 14:16:02 crc kubenswrapper[4973]: I0320 14:16:02.310357 4973 scope.go:117] "RemoveContainer" containerID="62f0d2355eeefb7c032b003bb6fef5d2bca8c88776749d6bc24a1068ecd65d2e" Mar 20 14:16:02 crc kubenswrapper[4973]: I0320 14:16:02.637099 4973 generic.go:334] "Generic (PLEG): container finished" podID="4c0a71f0-a19b-4e23-8965-32ce944c0bf9" containerID="516236f733f42d5911189934e3b703f51b553506f17f629822bac4aa77544630" exitCode=0 Mar 20 14:16:02 crc kubenswrapper[4973]: I0320 14:16:02.637153 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566936-x4v4f" event={"ID":"4c0a71f0-a19b-4e23-8965-32ce944c0bf9","Type":"ContainerDied","Data":"516236f733f42d5911189934e3b703f51b553506f17f629822bac4aa77544630"} Mar 20 14:16:03 crc kubenswrapper[4973]: I0320 14:16:03.955537 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:16:03 crc kubenswrapper[4973]: E0320 14:16:03.956144 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:16:04 crc kubenswrapper[4973]: I0320 14:16:04.093597 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566936-x4v4f" Mar 20 14:16:04 crc kubenswrapper[4973]: I0320 14:16:04.156592 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkn4t\" (UniqueName: \"kubernetes.io/projected/4c0a71f0-a19b-4e23-8965-32ce944c0bf9-kube-api-access-xkn4t\") pod \"4c0a71f0-a19b-4e23-8965-32ce944c0bf9\" (UID: \"4c0a71f0-a19b-4e23-8965-32ce944c0bf9\") " Mar 20 14:16:04 crc kubenswrapper[4973]: I0320 14:16:04.162278 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c0a71f0-a19b-4e23-8965-32ce944c0bf9-kube-api-access-xkn4t" (OuterVolumeSpecName: "kube-api-access-xkn4t") pod "4c0a71f0-a19b-4e23-8965-32ce944c0bf9" (UID: "4c0a71f0-a19b-4e23-8965-32ce944c0bf9"). InnerVolumeSpecName "kube-api-access-xkn4t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:16:04 crc kubenswrapper[4973]: I0320 14:16:04.260318 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkn4t\" (UniqueName: \"kubernetes.io/projected/4c0a71f0-a19b-4e23-8965-32ce944c0bf9-kube-api-access-xkn4t\") on node \"crc\" DevicePath \"\"" Mar 20 14:16:04 crc kubenswrapper[4973]: I0320 14:16:04.658877 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566936-x4v4f" event={"ID":"4c0a71f0-a19b-4e23-8965-32ce944c0bf9","Type":"ContainerDied","Data":"a1c42164cb9a1003dccb81975ae5cfd4a366c58f9dae63b0d533be85fc82b7ae"} Mar 20 14:16:04 crc kubenswrapper[4973]: I0320 14:16:04.658915 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a1c42164cb9a1003dccb81975ae5cfd4a366c58f9dae63b0d533be85fc82b7ae" Mar 20 14:16:04 crc kubenswrapper[4973]: I0320 14:16:04.658971 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566936-x4v4f" Mar 20 14:16:05 crc kubenswrapper[4973]: I0320 14:16:05.163294 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566930-fsj2r"] Mar 20 14:16:05 crc kubenswrapper[4973]: I0320 14:16:05.173675 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566930-fsj2r"] Mar 20 14:16:05 crc kubenswrapper[4973]: I0320 14:16:05.962745 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b962ddb-93ff-417b-812f-1ae7022e90db" path="/var/lib/kubelet/pods/3b962ddb-93ff-417b-812f-1ae7022e90db/volumes" Mar 20 14:16:18 crc kubenswrapper[4973]: I0320 14:16:18.951005 4973 scope.go:117] "RemoveContainer" containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:16:19 crc kubenswrapper[4973]: I0320 14:16:19.841451 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"bf957f9a512aaa4c34dde67075d8a36005e5457ba817ccbde176d2f86a603452"} Mar 20 14:17:02 crc kubenswrapper[4973]: I0320 14:17:02.385514 4973 scope.go:117] "RemoveContainer" containerID="58d057e5a7b6cafbc68a0ecb8cd8418eab3ec9f68ac8942c094282cded3f5563" Mar 20 14:18:00 crc kubenswrapper[4973]: I0320 14:18:00.157612 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566938-jnbkl"] Mar 20 14:18:00 crc kubenswrapper[4973]: E0320 14:18:00.158754 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c0a71f0-a19b-4e23-8965-32ce944c0bf9" containerName="oc" Mar 20 14:18:00 crc kubenswrapper[4973]: I0320 14:18:00.158772 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c0a71f0-a19b-4e23-8965-32ce944c0bf9" containerName="oc" Mar 20 14:18:00 crc kubenswrapper[4973]: I0320 14:18:00.159150 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c0a71f0-a19b-4e23-8965-32ce944c0bf9" containerName="oc" Mar 20 14:18:00 crc kubenswrapper[4973]: I0320 14:18:00.160569 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566938-jnbkl" Mar 20 14:18:00 crc kubenswrapper[4973]: I0320 14:18:00.162548 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:18:00 crc kubenswrapper[4973]: I0320 14:18:00.163553 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:18:00 crc kubenswrapper[4973]: I0320 14:18:00.163859 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:18:00 crc kubenswrapper[4973]: I0320 14:18:00.172278 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566938-jnbkl"] Mar 20 14:18:00 crc kubenswrapper[4973]: I0320 14:18:00.204240 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4ff8\" (UniqueName: \"kubernetes.io/projected/57d73b48-f1c3-4302-8812-6f38589977f0-kube-api-access-f4ff8\") pod \"auto-csr-approver-29566938-jnbkl\" (UID: \"57d73b48-f1c3-4302-8812-6f38589977f0\") " pod="openshift-infra/auto-csr-approver-29566938-jnbkl" Mar 20 14:18:00 crc kubenswrapper[4973]: I0320 14:18:00.307001 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4ff8\" (UniqueName: \"kubernetes.io/projected/57d73b48-f1c3-4302-8812-6f38589977f0-kube-api-access-f4ff8\") pod \"auto-csr-approver-29566938-jnbkl\" (UID: \"57d73b48-f1c3-4302-8812-6f38589977f0\") " pod="openshift-infra/auto-csr-approver-29566938-jnbkl" Mar 20 14:18:00 crc kubenswrapper[4973]: I0320 14:18:00.326976 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4ff8\" (UniqueName: \"kubernetes.io/projected/57d73b48-f1c3-4302-8812-6f38589977f0-kube-api-access-f4ff8\") pod \"auto-csr-approver-29566938-jnbkl\" (UID: \"57d73b48-f1c3-4302-8812-6f38589977f0\") " 
pod="openshift-infra/auto-csr-approver-29566938-jnbkl" Mar 20 14:18:00 crc kubenswrapper[4973]: I0320 14:18:00.486614 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566938-jnbkl" Mar 20 14:18:00 crc kubenswrapper[4973]: I0320 14:18:00.985391 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566938-jnbkl"] Mar 20 14:18:00 crc kubenswrapper[4973]: I0320 14:18:00.986648 4973 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:18:01 crc kubenswrapper[4973]: I0320 14:18:01.901860 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566938-jnbkl" event={"ID":"57d73b48-f1c3-4302-8812-6f38589977f0","Type":"ContainerStarted","Data":"18aad418fee9add016500f21d6da4ddf86d6e3e11ac5dde27f0c2374b5790f99"} Mar 20 14:18:02 crc kubenswrapper[4973]: I0320 14:18:02.914368 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566938-jnbkl" event={"ID":"57d73b48-f1c3-4302-8812-6f38589977f0","Type":"ContainerStarted","Data":"96e2c7fef3f3897b966e96ad97b348ac24801530ea767fcc591511c269420ad1"} Mar 20 14:18:03 crc kubenswrapper[4973]: I0320 14:18:03.926708 4973 generic.go:334] "Generic (PLEG): container finished" podID="57d73b48-f1c3-4302-8812-6f38589977f0" containerID="96e2c7fef3f3897b966e96ad97b348ac24801530ea767fcc591511c269420ad1" exitCode=0 Mar 20 14:18:03 crc kubenswrapper[4973]: I0320 14:18:03.926815 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566938-jnbkl" event={"ID":"57d73b48-f1c3-4302-8812-6f38589977f0","Type":"ContainerDied","Data":"96e2c7fef3f3897b966e96ad97b348ac24801530ea767fcc591511c269420ad1"} Mar 20 14:18:05 crc kubenswrapper[4973]: I0320 14:18:05.390186 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566938-jnbkl" Mar 20 14:18:05 crc kubenswrapper[4973]: I0320 14:18:05.571290 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4ff8\" (UniqueName: \"kubernetes.io/projected/57d73b48-f1c3-4302-8812-6f38589977f0-kube-api-access-f4ff8\") pod \"57d73b48-f1c3-4302-8812-6f38589977f0\" (UID: \"57d73b48-f1c3-4302-8812-6f38589977f0\") " Mar 20 14:18:05 crc kubenswrapper[4973]: I0320 14:18:05.579612 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57d73b48-f1c3-4302-8812-6f38589977f0-kube-api-access-f4ff8" (OuterVolumeSpecName: "kube-api-access-f4ff8") pod "57d73b48-f1c3-4302-8812-6f38589977f0" (UID: "57d73b48-f1c3-4302-8812-6f38589977f0"). InnerVolumeSpecName "kube-api-access-f4ff8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:18:05 crc kubenswrapper[4973]: I0320 14:18:05.678399 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4ff8\" (UniqueName: \"kubernetes.io/projected/57d73b48-f1c3-4302-8812-6f38589977f0-kube-api-access-f4ff8\") on node \"crc\" DevicePath \"\"" Mar 20 14:18:05 crc kubenswrapper[4973]: I0320 14:18:05.949092 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566938-jnbkl" event={"ID":"57d73b48-f1c3-4302-8812-6f38589977f0","Type":"ContainerDied","Data":"18aad418fee9add016500f21d6da4ddf86d6e3e11ac5dde27f0c2374b5790f99"} Mar 20 14:18:05 crc kubenswrapper[4973]: I0320 14:18:05.949435 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="18aad418fee9add016500f21d6da4ddf86d6e3e11ac5dde27f0c2374b5790f99" Mar 20 14:18:05 crc kubenswrapper[4973]: I0320 14:18:05.949142 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566938-jnbkl" Mar 20 14:18:06 crc kubenswrapper[4973]: I0320 14:18:06.009661 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566932-xpxbx"] Mar 20 14:18:06 crc kubenswrapper[4973]: I0320 14:18:06.021670 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566932-xpxbx"] Mar 20 14:18:07 crc kubenswrapper[4973]: I0320 14:18:07.964949 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bd7505e-95cc-46e7-bff1-53138b47efca" path="/var/lib/kubelet/pods/3bd7505e-95cc-46e7-bff1-53138b47efca/volumes" Mar 20 14:18:43 crc kubenswrapper[4973]: I0320 14:18:43.320754 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:18:43 crc kubenswrapper[4973]: I0320 14:18:43.321266 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:19:02 crc kubenswrapper[4973]: I0320 14:19:02.492640 4973 scope.go:117] "RemoveContainer" containerID="ee41fd4dbeb100a91a3a97ff3a73a074fafe660ffa7721c71ae928a4376f7a31" Mar 20 14:19:13 crc kubenswrapper[4973]: I0320 14:19:13.326113 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:19:13 crc kubenswrapper[4973]: 
I0320 14:19:13.326775 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:19:38 crc kubenswrapper[4973]: I0320 14:19:38.699799 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fw66n"] Mar 20 14:19:38 crc kubenswrapper[4973]: E0320 14:19:38.702106 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57d73b48-f1c3-4302-8812-6f38589977f0" containerName="oc" Mar 20 14:19:38 crc kubenswrapper[4973]: I0320 14:19:38.702192 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="57d73b48-f1c3-4302-8812-6f38589977f0" containerName="oc" Mar 20 14:19:38 crc kubenswrapper[4973]: I0320 14:19:38.702533 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="57d73b48-f1c3-4302-8812-6f38589977f0" containerName="oc" Mar 20 14:19:38 crc kubenswrapper[4973]: I0320 14:19:38.704727 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fw66n" Mar 20 14:19:38 crc kubenswrapper[4973]: I0320 14:19:38.825412 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae477ba-28ba-4345-9f0c-d409ce5d9c3b-utilities\") pod \"redhat-operators-fw66n\" (UID: \"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b\") " pod="openshift-marketplace/redhat-operators-fw66n" Mar 20 14:19:38 crc kubenswrapper[4973]: I0320 14:19:38.825923 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae477ba-28ba-4345-9f0c-d409ce5d9c3b-catalog-content\") pod \"redhat-operators-fw66n\" (UID: \"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b\") " pod="openshift-marketplace/redhat-operators-fw66n" Mar 20 14:19:38 crc kubenswrapper[4973]: I0320 14:19:38.826127 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gcwf\" (UniqueName: \"kubernetes.io/projected/4ae477ba-28ba-4345-9f0c-d409ce5d9c3b-kube-api-access-2gcwf\") pod \"redhat-operators-fw66n\" (UID: \"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b\") " pod="openshift-marketplace/redhat-operators-fw66n" Mar 20 14:19:38 crc kubenswrapper[4973]: I0320 14:19:38.928118 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gcwf\" (UniqueName: \"kubernetes.io/projected/4ae477ba-28ba-4345-9f0c-d409ce5d9c3b-kube-api-access-2gcwf\") pod \"redhat-operators-fw66n\" (UID: \"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b\") " pod="openshift-marketplace/redhat-operators-fw66n" Mar 20 14:19:38 crc kubenswrapper[4973]: I0320 14:19:38.928256 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae477ba-28ba-4345-9f0c-d409ce5d9c3b-utilities\") pod \"redhat-operators-fw66n\" (UID: 
\"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b\") " pod="openshift-marketplace/redhat-operators-fw66n" Mar 20 14:19:38 crc kubenswrapper[4973]: I0320 14:19:38.928408 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae477ba-28ba-4345-9f0c-d409ce5d9c3b-catalog-content\") pod \"redhat-operators-fw66n\" (UID: \"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b\") " pod="openshift-marketplace/redhat-operators-fw66n" Mar 20 14:19:38 crc kubenswrapper[4973]: I0320 14:19:38.928826 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae477ba-28ba-4345-9f0c-d409ce5d9c3b-utilities\") pod \"redhat-operators-fw66n\" (UID: \"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b\") " pod="openshift-marketplace/redhat-operators-fw66n" Mar 20 14:19:38 crc kubenswrapper[4973]: I0320 14:19:38.928882 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae477ba-28ba-4345-9f0c-d409ce5d9c3b-catalog-content\") pod \"redhat-operators-fw66n\" (UID: \"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b\") " pod="openshift-marketplace/redhat-operators-fw66n" Mar 20 14:19:38 crc kubenswrapper[4973]: I0320 14:19:38.956784 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gcwf\" (UniqueName: \"kubernetes.io/projected/4ae477ba-28ba-4345-9f0c-d409ce5d9c3b-kube-api-access-2gcwf\") pod \"redhat-operators-fw66n\" (UID: \"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b\") " pod="openshift-marketplace/redhat-operators-fw66n" Mar 20 14:19:38 crc kubenswrapper[4973]: I0320 14:19:38.962930 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fw66n"] Mar 20 14:19:39 crc kubenswrapper[4973]: I0320 14:19:39.023141 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fw66n" Mar 20 14:19:39 crc kubenswrapper[4973]: I0320 14:19:39.738323 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fw66n"] Mar 20 14:19:39 crc kubenswrapper[4973]: I0320 14:19:39.972089 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw66n" event={"ID":"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b","Type":"ContainerStarted","Data":"7018f721b1f2fb4fb5c9bdd08cbf20cf01ac9d5fe1221e1fd5d25ff11fe6a8ed"} Mar 20 14:19:40 crc kubenswrapper[4973]: I0320 14:19:40.976934 4973 generic.go:334] "Generic (PLEG): container finished" podID="4ae477ba-28ba-4345-9f0c-d409ce5d9c3b" containerID="8f1a6d33e8545d259ef5e777aa383ffdbf8a97189c717885fbd5f5ba6acc28ce" exitCode=0 Mar 20 14:19:40 crc kubenswrapper[4973]: I0320 14:19:40.977028 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw66n" event={"ID":"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b","Type":"ContainerDied","Data":"8f1a6d33e8545d259ef5e777aa383ffdbf8a97189c717885fbd5f5ba6acc28ce"} Mar 20 14:19:43 crc kubenswrapper[4973]: I0320 14:19:43.008072 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw66n" event={"ID":"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b","Type":"ContainerStarted","Data":"5403f0da20a835782b1497a8307f019d2f3a43b6434abac93ac664ffa36ad56c"} Mar 20 14:19:43 crc kubenswrapper[4973]: I0320 14:19:43.321199 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:19:43 crc kubenswrapper[4973]: I0320 14:19:43.321261 4973 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:19:43 crc kubenswrapper[4973]: I0320 14:19:43.321304 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 14:19:43 crc kubenswrapper[4973]: I0320 14:19:43.322262 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bf957f9a512aaa4c34dde67075d8a36005e5457ba817ccbde176d2f86a603452"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:19:43 crc kubenswrapper[4973]: I0320 14:19:43.322321 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" containerID="cri-o://bf957f9a512aaa4c34dde67075d8a36005e5457ba817ccbde176d2f86a603452" gracePeriod=600 Mar 20 14:19:44 crc kubenswrapper[4973]: I0320 14:19:44.029986 4973 generic.go:334] "Generic (PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="bf957f9a512aaa4c34dde67075d8a36005e5457ba817ccbde176d2f86a603452" exitCode=0 Mar 20 14:19:44 crc kubenswrapper[4973]: I0320 14:19:44.030056 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"bf957f9a512aaa4c34dde67075d8a36005e5457ba817ccbde176d2f86a603452"} Mar 20 14:19:44 crc kubenswrapper[4973]: I0320 14:19:44.030575 4973 scope.go:117] "RemoveContainer" 
containerID="eee4d977276eb50c4d6037b49f64e4fc22bb8be044b9efbe0ead78b43e40f94c" Mar 20 14:19:45 crc kubenswrapper[4973]: I0320 14:19:45.043285 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309"} Mar 20 14:19:48 crc kubenswrapper[4973]: I0320 14:19:48.075670 4973 generic.go:334] "Generic (PLEG): container finished" podID="4ae477ba-28ba-4345-9f0c-d409ce5d9c3b" containerID="5403f0da20a835782b1497a8307f019d2f3a43b6434abac93ac664ffa36ad56c" exitCode=0 Mar 20 14:19:48 crc kubenswrapper[4973]: I0320 14:19:48.075773 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw66n" event={"ID":"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b","Type":"ContainerDied","Data":"5403f0da20a835782b1497a8307f019d2f3a43b6434abac93ac664ffa36ad56c"} Mar 20 14:19:49 crc kubenswrapper[4973]: I0320 14:19:49.110667 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw66n" event={"ID":"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b","Type":"ContainerStarted","Data":"0bdb96791bff61d1ef2268d8da98ee5a1c1aa610256e35fc2f7f8d85c8993fef"} Mar 20 14:19:49 crc kubenswrapper[4973]: I0320 14:19:49.138567 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fw66n" podStartSLOduration=3.513210679 podStartE2EDuration="11.138537956s" podCreationTimestamp="2026-03-20 14:19:38 +0000 UTC" firstStartedPulling="2026-03-20 14:19:40.978909939 +0000 UTC m=+3501.722579683" lastFinishedPulling="2026-03-20 14:19:48.604237216 +0000 UTC m=+3509.347906960" observedRunningTime="2026-03-20 14:19:49.133646565 +0000 UTC m=+3509.877316309" watchObservedRunningTime="2026-03-20 14:19:49.138537956 +0000 UTC m=+3509.882207700" Mar 20 14:19:52 crc kubenswrapper[4973]: I0320 
14:19:52.428409 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b6r4h"] Mar 20 14:19:52 crc kubenswrapper[4973]: I0320 14:19:52.432224 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b6r4h" Mar 20 14:19:52 crc kubenswrapper[4973]: I0320 14:19:52.446867 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b6r4h"] Mar 20 14:19:52 crc kubenswrapper[4973]: I0320 14:19:52.604708 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq9qf\" (UniqueName: \"kubernetes.io/projected/acb14b29-937e-4835-8672-0a59fd7855c0-kube-api-access-mq9qf\") pod \"redhat-marketplace-b6r4h\" (UID: \"acb14b29-937e-4835-8672-0a59fd7855c0\") " pod="openshift-marketplace/redhat-marketplace-b6r4h" Mar 20 14:19:52 crc kubenswrapper[4973]: I0320 14:19:52.604784 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb14b29-937e-4835-8672-0a59fd7855c0-utilities\") pod \"redhat-marketplace-b6r4h\" (UID: \"acb14b29-937e-4835-8672-0a59fd7855c0\") " pod="openshift-marketplace/redhat-marketplace-b6r4h" Mar 20 14:19:52 crc kubenswrapper[4973]: I0320 14:19:52.604880 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb14b29-937e-4835-8672-0a59fd7855c0-catalog-content\") pod \"redhat-marketplace-b6r4h\" (UID: \"acb14b29-937e-4835-8672-0a59fd7855c0\") " pod="openshift-marketplace/redhat-marketplace-b6r4h" Mar 20 14:19:52 crc kubenswrapper[4973]: I0320 14:19:52.707458 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb14b29-937e-4835-8672-0a59fd7855c0-utilities\") pod 
\"redhat-marketplace-b6r4h\" (UID: \"acb14b29-937e-4835-8672-0a59fd7855c0\") " pod="openshift-marketplace/redhat-marketplace-b6r4h" Mar 20 14:19:52 crc kubenswrapper[4973]: I0320 14:19:52.707559 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb14b29-937e-4835-8672-0a59fd7855c0-catalog-content\") pod \"redhat-marketplace-b6r4h\" (UID: \"acb14b29-937e-4835-8672-0a59fd7855c0\") " pod="openshift-marketplace/redhat-marketplace-b6r4h" Mar 20 14:19:52 crc kubenswrapper[4973]: I0320 14:19:52.707757 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq9qf\" (UniqueName: \"kubernetes.io/projected/acb14b29-937e-4835-8672-0a59fd7855c0-kube-api-access-mq9qf\") pod \"redhat-marketplace-b6r4h\" (UID: \"acb14b29-937e-4835-8672-0a59fd7855c0\") " pod="openshift-marketplace/redhat-marketplace-b6r4h" Mar 20 14:19:52 crc kubenswrapper[4973]: I0320 14:19:52.708203 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb14b29-937e-4835-8672-0a59fd7855c0-utilities\") pod \"redhat-marketplace-b6r4h\" (UID: \"acb14b29-937e-4835-8672-0a59fd7855c0\") " pod="openshift-marketplace/redhat-marketplace-b6r4h" Mar 20 14:19:52 crc kubenswrapper[4973]: I0320 14:19:52.708209 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb14b29-937e-4835-8672-0a59fd7855c0-catalog-content\") pod \"redhat-marketplace-b6r4h\" (UID: \"acb14b29-937e-4835-8672-0a59fd7855c0\") " pod="openshift-marketplace/redhat-marketplace-b6r4h" Mar 20 14:19:52 crc kubenswrapper[4973]: I0320 14:19:52.728400 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq9qf\" (UniqueName: \"kubernetes.io/projected/acb14b29-937e-4835-8672-0a59fd7855c0-kube-api-access-mq9qf\") pod \"redhat-marketplace-b6r4h\" (UID: 
\"acb14b29-937e-4835-8672-0a59fd7855c0\") " pod="openshift-marketplace/redhat-marketplace-b6r4h" Mar 20 14:19:52 crc kubenswrapper[4973]: I0320 14:19:52.783692 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b6r4h" Mar 20 14:19:53 crc kubenswrapper[4973]: I0320 14:19:53.427578 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b6r4h"] Mar 20 14:19:54 crc kubenswrapper[4973]: I0320 14:19:54.198692 4973 generic.go:334] "Generic (PLEG): container finished" podID="acb14b29-937e-4835-8672-0a59fd7855c0" containerID="5ac7fdfafbc24e2f7bbc4be781ae2c6d1dc0fbbfce3d1ba4b23133cbe24a7375" exitCode=0 Mar 20 14:19:54 crc kubenswrapper[4973]: I0320 14:19:54.199225 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6r4h" event={"ID":"acb14b29-937e-4835-8672-0a59fd7855c0","Type":"ContainerDied","Data":"5ac7fdfafbc24e2f7bbc4be781ae2c6d1dc0fbbfce3d1ba4b23133cbe24a7375"} Mar 20 14:19:54 crc kubenswrapper[4973]: I0320 14:19:54.199272 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6r4h" event={"ID":"acb14b29-937e-4835-8672-0a59fd7855c0","Type":"ContainerStarted","Data":"1c91260b49cc6b261b391b01419166db13ca9cee2b0e8b4092a556fac2c343fb"} Mar 20 14:19:55 crc kubenswrapper[4973]: I0320 14:19:55.212575 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6r4h" event={"ID":"acb14b29-937e-4835-8672-0a59fd7855c0","Type":"ContainerStarted","Data":"b74e4f2cb1e42177774715f8a3cd15eac0a38f06abc5e1e98f922f9cb7e089fc"} Mar 20 14:19:57 crc kubenswrapper[4973]: I0320 14:19:57.236945 4973 generic.go:334] "Generic (PLEG): container finished" podID="acb14b29-937e-4835-8672-0a59fd7855c0" containerID="b74e4f2cb1e42177774715f8a3cd15eac0a38f06abc5e1e98f922f9cb7e089fc" exitCode=0 Mar 20 14:19:57 crc kubenswrapper[4973]: I0320 
14:19:57.237011 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6r4h" event={"ID":"acb14b29-937e-4835-8672-0a59fd7855c0","Type":"ContainerDied","Data":"b74e4f2cb1e42177774715f8a3cd15eac0a38f06abc5e1e98f922f9cb7e089fc"} Mar 20 14:19:59 crc kubenswrapper[4973]: I0320 14:19:59.025073 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fw66n" Mar 20 14:19:59 crc kubenswrapper[4973]: I0320 14:19:59.027614 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fw66n" Mar 20 14:19:59 crc kubenswrapper[4973]: I0320 14:19:59.260539 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6r4h" event={"ID":"acb14b29-937e-4835-8672-0a59fd7855c0","Type":"ContainerStarted","Data":"9916771a7a5caf9867abe2cfe8f2d223fa1890c42c5af378333826b2e22db032"} Mar 20 14:19:59 crc kubenswrapper[4973]: I0320 14:19:59.276549 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b6r4h" podStartSLOduration=3.214144074 podStartE2EDuration="7.276530628s" podCreationTimestamp="2026-03-20 14:19:52 +0000 UTC" firstStartedPulling="2026-03-20 14:19:54.202115337 +0000 UTC m=+3514.945785081" lastFinishedPulling="2026-03-20 14:19:58.264501891 +0000 UTC m=+3519.008171635" observedRunningTime="2026-03-20 14:19:59.276025354 +0000 UTC m=+3520.019695098" watchObservedRunningTime="2026-03-20 14:19:59.276530628 +0000 UTC m=+3520.020200372" Mar 20 14:20:00 crc kubenswrapper[4973]: I0320 14:20:00.077377 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fw66n" podUID="4ae477ba-28ba-4345-9f0c-d409ce5d9c3b" containerName="registry-server" probeResult="failure" output=< Mar 20 14:20:00 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:20:00 crc 
kubenswrapper[4973]: > Mar 20 14:20:00 crc kubenswrapper[4973]: I0320 14:20:00.192973 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566940-msgdk"] Mar 20 14:20:00 crc kubenswrapper[4973]: I0320 14:20:00.195085 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566940-msgdk" Mar 20 14:20:00 crc kubenswrapper[4973]: I0320 14:20:00.201847 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:20:00 crc kubenswrapper[4973]: I0320 14:20:00.202063 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:20:00 crc kubenswrapper[4973]: I0320 14:20:00.202577 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:20:00 crc kubenswrapper[4973]: I0320 14:20:00.271413 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566940-msgdk"] Mar 20 14:20:00 crc kubenswrapper[4973]: I0320 14:20:00.324720 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbcs8\" (UniqueName: \"kubernetes.io/projected/d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b-kube-api-access-mbcs8\") pod \"auto-csr-approver-29566940-msgdk\" (UID: \"d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b\") " pod="openshift-infra/auto-csr-approver-29566940-msgdk" Mar 20 14:20:00 crc kubenswrapper[4973]: I0320 14:20:00.427644 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbcs8\" (UniqueName: \"kubernetes.io/projected/d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b-kube-api-access-mbcs8\") pod \"auto-csr-approver-29566940-msgdk\" (UID: \"d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b\") " pod="openshift-infra/auto-csr-approver-29566940-msgdk" Mar 20 14:20:00 crc kubenswrapper[4973]: I0320 
14:20:00.454503 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbcs8\" (UniqueName: \"kubernetes.io/projected/d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b-kube-api-access-mbcs8\") pod \"auto-csr-approver-29566940-msgdk\" (UID: \"d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b\") " pod="openshift-infra/auto-csr-approver-29566940-msgdk" Mar 20 14:20:00 crc kubenswrapper[4973]: I0320 14:20:00.530256 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566940-msgdk" Mar 20 14:20:01 crc kubenswrapper[4973]: I0320 14:20:01.174853 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566940-msgdk"] Mar 20 14:20:01 crc kubenswrapper[4973]: I0320 14:20:01.304479 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566940-msgdk" event={"ID":"d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b","Type":"ContainerStarted","Data":"7ce26af44330eb3b8d528899a58b524a71cf2762b7341ce3e69fa2e73c1f2659"} Mar 20 14:20:02 crc kubenswrapper[4973]: I0320 14:20:02.783884 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b6r4h" Mar 20 14:20:02 crc kubenswrapper[4973]: I0320 14:20:02.785019 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b6r4h" Mar 20 14:20:02 crc kubenswrapper[4973]: I0320 14:20:02.855309 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b6r4h" Mar 20 14:20:03 crc kubenswrapper[4973]: I0320 14:20:03.391271 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b6r4h" Mar 20 14:20:03 crc kubenswrapper[4973]: I0320 14:20:03.493487 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b6r4h"] Mar 20 14:20:04 
crc kubenswrapper[4973]: I0320 14:20:04.338292 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566940-msgdk" event={"ID":"d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b","Type":"ContainerStarted","Data":"c9473ebc80f57f6527750e31dbf763523fbcd347933b2b97c38a97d1eecf6b21"} Mar 20 14:20:04 crc kubenswrapper[4973]: I0320 14:20:04.363361 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566940-msgdk" podStartSLOduration=2.267545915 podStartE2EDuration="4.363325573s" podCreationTimestamp="2026-03-20 14:20:00 +0000 UTC" firstStartedPulling="2026-03-20 14:20:01.183739835 +0000 UTC m=+3521.927409579" lastFinishedPulling="2026-03-20 14:20:03.279519493 +0000 UTC m=+3524.023189237" observedRunningTime="2026-03-20 14:20:04.356014156 +0000 UTC m=+3525.099683910" watchObservedRunningTime="2026-03-20 14:20:04.363325573 +0000 UTC m=+3525.106995317" Mar 20 14:20:05 crc kubenswrapper[4973]: I0320 14:20:05.350278 4973 generic.go:334] "Generic (PLEG): container finished" podID="d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b" containerID="c9473ebc80f57f6527750e31dbf763523fbcd347933b2b97c38a97d1eecf6b21" exitCode=0 Mar 20 14:20:05 crc kubenswrapper[4973]: I0320 14:20:05.350356 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566940-msgdk" event={"ID":"d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b","Type":"ContainerDied","Data":"c9473ebc80f57f6527750e31dbf763523fbcd347933b2b97c38a97d1eecf6b21"} Mar 20 14:20:05 crc kubenswrapper[4973]: I0320 14:20:05.350619 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b6r4h" podUID="acb14b29-937e-4835-8672-0a59fd7855c0" containerName="registry-server" containerID="cri-o://9916771a7a5caf9867abe2cfe8f2d223fa1890c42c5af378333826b2e22db032" gracePeriod=2 Mar 20 14:20:05 crc kubenswrapper[4973]: I0320 14:20:05.932195 4973 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b6r4h" Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.089949 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq9qf\" (UniqueName: \"kubernetes.io/projected/acb14b29-937e-4835-8672-0a59fd7855c0-kube-api-access-mq9qf\") pod \"acb14b29-937e-4835-8672-0a59fd7855c0\" (UID: \"acb14b29-937e-4835-8672-0a59fd7855c0\") " Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.090132 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb14b29-937e-4835-8672-0a59fd7855c0-catalog-content\") pod \"acb14b29-937e-4835-8672-0a59fd7855c0\" (UID: \"acb14b29-937e-4835-8672-0a59fd7855c0\") " Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.090287 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb14b29-937e-4835-8672-0a59fd7855c0-utilities\") pod \"acb14b29-937e-4835-8672-0a59fd7855c0\" (UID: \"acb14b29-937e-4835-8672-0a59fd7855c0\") " Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.091511 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acb14b29-937e-4835-8672-0a59fd7855c0-utilities" (OuterVolumeSpecName: "utilities") pod "acb14b29-937e-4835-8672-0a59fd7855c0" (UID: "acb14b29-937e-4835-8672-0a59fd7855c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.100574 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acb14b29-937e-4835-8672-0a59fd7855c0-kube-api-access-mq9qf" (OuterVolumeSpecName: "kube-api-access-mq9qf") pod "acb14b29-937e-4835-8672-0a59fd7855c0" (UID: "acb14b29-937e-4835-8672-0a59fd7855c0"). InnerVolumeSpecName "kube-api-access-mq9qf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.117191 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acb14b29-937e-4835-8672-0a59fd7855c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "acb14b29-937e-4835-8672-0a59fd7855c0" (UID: "acb14b29-937e-4835-8672-0a59fd7855c0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.193081 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq9qf\" (UniqueName: \"kubernetes.io/projected/acb14b29-937e-4835-8672-0a59fd7855c0-kube-api-access-mq9qf\") on node \"crc\" DevicePath \"\"" Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.193371 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/acb14b29-937e-4835-8672-0a59fd7855c0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.193443 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/acb14b29-937e-4835-8672-0a59fd7855c0-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.363936 4973 generic.go:334] "Generic (PLEG): container finished" podID="acb14b29-937e-4835-8672-0a59fd7855c0" containerID="9916771a7a5caf9867abe2cfe8f2d223fa1890c42c5af378333826b2e22db032" exitCode=0 Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.364033 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b6r4h" Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.364138 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6r4h" event={"ID":"acb14b29-937e-4835-8672-0a59fd7855c0","Type":"ContainerDied","Data":"9916771a7a5caf9867abe2cfe8f2d223fa1890c42c5af378333826b2e22db032"} Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.364167 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b6r4h" event={"ID":"acb14b29-937e-4835-8672-0a59fd7855c0","Type":"ContainerDied","Data":"1c91260b49cc6b261b391b01419166db13ca9cee2b0e8b4092a556fac2c343fb"} Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.364184 4973 scope.go:117] "RemoveContainer" containerID="9916771a7a5caf9867abe2cfe8f2d223fa1890c42c5af378333826b2e22db032" Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.396666 4973 scope.go:117] "RemoveContainer" containerID="b74e4f2cb1e42177774715f8a3cd15eac0a38f06abc5e1e98f922f9cb7e089fc" Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.405465 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b6r4h"] Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.415226 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b6r4h"] Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.427574 4973 scope.go:117] "RemoveContainer" containerID="5ac7fdfafbc24e2f7bbc4be781ae2c6d1dc0fbbfce3d1ba4b23133cbe24a7375" Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.517605 4973 scope.go:117] "RemoveContainer" containerID="9916771a7a5caf9867abe2cfe8f2d223fa1890c42c5af378333826b2e22db032" Mar 20 14:20:06 crc kubenswrapper[4973]: E0320 14:20:06.518007 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"9916771a7a5caf9867abe2cfe8f2d223fa1890c42c5af378333826b2e22db032\": container with ID starting with 9916771a7a5caf9867abe2cfe8f2d223fa1890c42c5af378333826b2e22db032 not found: ID does not exist" containerID="9916771a7a5caf9867abe2cfe8f2d223fa1890c42c5af378333826b2e22db032" Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.518040 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9916771a7a5caf9867abe2cfe8f2d223fa1890c42c5af378333826b2e22db032"} err="failed to get container status \"9916771a7a5caf9867abe2cfe8f2d223fa1890c42c5af378333826b2e22db032\": rpc error: code = NotFound desc = could not find container \"9916771a7a5caf9867abe2cfe8f2d223fa1890c42c5af378333826b2e22db032\": container with ID starting with 9916771a7a5caf9867abe2cfe8f2d223fa1890c42c5af378333826b2e22db032 not found: ID does not exist" Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.518059 4973 scope.go:117] "RemoveContainer" containerID="b74e4f2cb1e42177774715f8a3cd15eac0a38f06abc5e1e98f922f9cb7e089fc" Mar 20 14:20:06 crc kubenswrapper[4973]: E0320 14:20:06.518637 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b74e4f2cb1e42177774715f8a3cd15eac0a38f06abc5e1e98f922f9cb7e089fc\": container with ID starting with b74e4f2cb1e42177774715f8a3cd15eac0a38f06abc5e1e98f922f9cb7e089fc not found: ID does not exist" containerID="b74e4f2cb1e42177774715f8a3cd15eac0a38f06abc5e1e98f922f9cb7e089fc" Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.518687 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b74e4f2cb1e42177774715f8a3cd15eac0a38f06abc5e1e98f922f9cb7e089fc"} err="failed to get container status \"b74e4f2cb1e42177774715f8a3cd15eac0a38f06abc5e1e98f922f9cb7e089fc\": rpc error: code = NotFound desc = could not find container \"b74e4f2cb1e42177774715f8a3cd15eac0a38f06abc5e1e98f922f9cb7e089fc\": container with ID 
starting with b74e4f2cb1e42177774715f8a3cd15eac0a38f06abc5e1e98f922f9cb7e089fc not found: ID does not exist" Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.518720 4973 scope.go:117] "RemoveContainer" containerID="5ac7fdfafbc24e2f7bbc4be781ae2c6d1dc0fbbfce3d1ba4b23133cbe24a7375" Mar 20 14:20:06 crc kubenswrapper[4973]: E0320 14:20:06.520157 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5ac7fdfafbc24e2f7bbc4be781ae2c6d1dc0fbbfce3d1ba4b23133cbe24a7375\": container with ID starting with 5ac7fdfafbc24e2f7bbc4be781ae2c6d1dc0fbbfce3d1ba4b23133cbe24a7375 not found: ID does not exist" containerID="5ac7fdfafbc24e2f7bbc4be781ae2c6d1dc0fbbfce3d1ba4b23133cbe24a7375" Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.520188 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5ac7fdfafbc24e2f7bbc4be781ae2c6d1dc0fbbfce3d1ba4b23133cbe24a7375"} err="failed to get container status \"5ac7fdfafbc24e2f7bbc4be781ae2c6d1dc0fbbfce3d1ba4b23133cbe24a7375\": rpc error: code = NotFound desc = could not find container \"5ac7fdfafbc24e2f7bbc4be781ae2c6d1dc0fbbfce3d1ba4b23133cbe24a7375\": container with ID starting with 5ac7fdfafbc24e2f7bbc4be781ae2c6d1dc0fbbfce3d1ba4b23133cbe24a7375 not found: ID does not exist" Mar 20 14:20:06 crc kubenswrapper[4973]: I0320 14:20:06.841282 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566940-msgdk" Mar 20 14:20:07 crc kubenswrapper[4973]: I0320 14:20:07.010928 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbcs8\" (UniqueName: \"kubernetes.io/projected/d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b-kube-api-access-mbcs8\") pod \"d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b\" (UID: \"d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b\") " Mar 20 14:20:07 crc kubenswrapper[4973]: I0320 14:20:07.016498 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b-kube-api-access-mbcs8" (OuterVolumeSpecName: "kube-api-access-mbcs8") pod "d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b" (UID: "d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b"). InnerVolumeSpecName "kube-api-access-mbcs8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:20:07 crc kubenswrapper[4973]: I0320 14:20:07.114541 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbcs8\" (UniqueName: \"kubernetes.io/projected/d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b-kube-api-access-mbcs8\") on node \"crc\" DevicePath \"\"" Mar 20 14:20:07 crc kubenswrapper[4973]: I0320 14:20:07.378913 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566940-msgdk" event={"ID":"d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b","Type":"ContainerDied","Data":"7ce26af44330eb3b8d528899a58b524a71cf2762b7341ce3e69fa2e73c1f2659"} Mar 20 14:20:07 crc kubenswrapper[4973]: I0320 14:20:07.378988 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7ce26af44330eb3b8d528899a58b524a71cf2762b7341ce3e69fa2e73c1f2659" Mar 20 14:20:07 crc kubenswrapper[4973]: I0320 14:20:07.378996 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566940-msgdk" Mar 20 14:20:07 crc kubenswrapper[4973]: I0320 14:20:07.454952 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566934-r5dpc"] Mar 20 14:20:07 crc kubenswrapper[4973]: I0320 14:20:07.467456 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566934-r5dpc"] Mar 20 14:20:07 crc kubenswrapper[4973]: I0320 14:20:07.972855 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d7e3baf-7119-49c5-80c1-4ff4d213d01d" path="/var/lib/kubelet/pods/4d7e3baf-7119-49c5-80c1-4ff4d213d01d/volumes" Mar 20 14:20:07 crc kubenswrapper[4973]: I0320 14:20:07.973748 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acb14b29-937e-4835-8672-0a59fd7855c0" path="/var/lib/kubelet/pods/acb14b29-937e-4835-8672-0a59fd7855c0/volumes" Mar 20 14:20:10 crc kubenswrapper[4973]: I0320 14:20:10.091148 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fw66n" podUID="4ae477ba-28ba-4345-9f0c-d409ce5d9c3b" containerName="registry-server" probeResult="failure" output=< Mar 20 14:20:10 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:20:10 crc kubenswrapper[4973]: > Mar 20 14:20:17 crc kubenswrapper[4973]: I0320 14:20:17.927920 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-whsd8"] Mar 20 14:20:17 crc kubenswrapper[4973]: E0320 14:20:17.929109 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb14b29-937e-4835-8672-0a59fd7855c0" containerName="extract-content" Mar 20 14:20:17 crc kubenswrapper[4973]: I0320 14:20:17.929124 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb14b29-937e-4835-8672-0a59fd7855c0" containerName="extract-content" Mar 20 14:20:17 crc kubenswrapper[4973]: E0320 14:20:17.929143 4973 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b" containerName="oc" Mar 20 14:20:17 crc kubenswrapper[4973]: I0320 14:20:17.929149 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b" containerName="oc" Mar 20 14:20:17 crc kubenswrapper[4973]: E0320 14:20:17.929182 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb14b29-937e-4835-8672-0a59fd7855c0" containerName="registry-server" Mar 20 14:20:17 crc kubenswrapper[4973]: I0320 14:20:17.929188 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb14b29-937e-4835-8672-0a59fd7855c0" containerName="registry-server" Mar 20 14:20:17 crc kubenswrapper[4973]: E0320 14:20:17.929202 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acb14b29-937e-4835-8672-0a59fd7855c0" containerName="extract-utilities" Mar 20 14:20:17 crc kubenswrapper[4973]: I0320 14:20:17.929209 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="acb14b29-937e-4835-8672-0a59fd7855c0" containerName="extract-utilities" Mar 20 14:20:17 crc kubenswrapper[4973]: I0320 14:20:17.929529 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b" containerName="oc" Mar 20 14:20:17 crc kubenswrapper[4973]: I0320 14:20:17.929549 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="acb14b29-937e-4835-8672-0a59fd7855c0" containerName="registry-server" Mar 20 14:20:17 crc kubenswrapper[4973]: I0320 14:20:17.931436 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-whsd8" Mar 20 14:20:17 crc kubenswrapper[4973]: I0320 14:20:17.940193 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-whsd8"] Mar 20 14:20:17 crc kubenswrapper[4973]: I0320 14:20:17.984146 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14550a46-fa7f-4c8e-9c8a-0571e5843a03-utilities\") pod \"certified-operators-whsd8\" (UID: \"14550a46-fa7f-4c8e-9c8a-0571e5843a03\") " pod="openshift-marketplace/certified-operators-whsd8" Mar 20 14:20:17 crc kubenswrapper[4973]: I0320 14:20:17.987566 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14550a46-fa7f-4c8e-9c8a-0571e5843a03-catalog-content\") pod \"certified-operators-whsd8\" (UID: \"14550a46-fa7f-4c8e-9c8a-0571e5843a03\") " pod="openshift-marketplace/certified-operators-whsd8" Mar 20 14:20:17 crc kubenswrapper[4973]: I0320 14:20:17.987627 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szfhv\" (UniqueName: \"kubernetes.io/projected/14550a46-fa7f-4c8e-9c8a-0571e5843a03-kube-api-access-szfhv\") pod \"certified-operators-whsd8\" (UID: \"14550a46-fa7f-4c8e-9c8a-0571e5843a03\") " pod="openshift-marketplace/certified-operators-whsd8" Mar 20 14:20:18 crc kubenswrapper[4973]: I0320 14:20:18.091690 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14550a46-fa7f-4c8e-9c8a-0571e5843a03-utilities\") pod \"certified-operators-whsd8\" (UID: \"14550a46-fa7f-4c8e-9c8a-0571e5843a03\") " pod="openshift-marketplace/certified-operators-whsd8" Mar 20 14:20:18 crc kubenswrapper[4973]: I0320 14:20:18.091761 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14550a46-fa7f-4c8e-9c8a-0571e5843a03-catalog-content\") pod \"certified-operators-whsd8\" (UID: \"14550a46-fa7f-4c8e-9c8a-0571e5843a03\") " pod="openshift-marketplace/certified-operators-whsd8" Mar 20 14:20:18 crc kubenswrapper[4973]: I0320 14:20:18.091782 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szfhv\" (UniqueName: \"kubernetes.io/projected/14550a46-fa7f-4c8e-9c8a-0571e5843a03-kube-api-access-szfhv\") pod \"certified-operators-whsd8\" (UID: \"14550a46-fa7f-4c8e-9c8a-0571e5843a03\") " pod="openshift-marketplace/certified-operators-whsd8" Mar 20 14:20:18 crc kubenswrapper[4973]: I0320 14:20:18.092278 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14550a46-fa7f-4c8e-9c8a-0571e5843a03-catalog-content\") pod \"certified-operators-whsd8\" (UID: \"14550a46-fa7f-4c8e-9c8a-0571e5843a03\") " pod="openshift-marketplace/certified-operators-whsd8" Mar 20 14:20:18 crc kubenswrapper[4973]: I0320 14:20:18.092678 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14550a46-fa7f-4c8e-9c8a-0571e5843a03-utilities\") pod \"certified-operators-whsd8\" (UID: \"14550a46-fa7f-4c8e-9c8a-0571e5843a03\") " pod="openshift-marketplace/certified-operators-whsd8" Mar 20 14:20:18 crc kubenswrapper[4973]: I0320 14:20:18.113193 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szfhv\" (UniqueName: \"kubernetes.io/projected/14550a46-fa7f-4c8e-9c8a-0571e5843a03-kube-api-access-szfhv\") pod \"certified-operators-whsd8\" (UID: \"14550a46-fa7f-4c8e-9c8a-0571e5843a03\") " pod="openshift-marketplace/certified-operators-whsd8" Mar 20 14:20:18 crc kubenswrapper[4973]: I0320 14:20:18.264289 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-whsd8" Mar 20 14:20:18 crc kubenswrapper[4973]: I0320 14:20:18.841667 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-whsd8"] Mar 20 14:20:18 crc kubenswrapper[4973]: W0320 14:20:18.876459 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14550a46_fa7f_4c8e_9c8a_0571e5843a03.slice/crio-62d771ec90604479087eaa7f075840224b804a24cb44862198c9ca9c2ae8ebf2 WatchSource:0}: Error finding container 62d771ec90604479087eaa7f075840224b804a24cb44862198c9ca9c2ae8ebf2: Status 404 returned error can't find the container with id 62d771ec90604479087eaa7f075840224b804a24cb44862198c9ca9c2ae8ebf2 Mar 20 14:20:19 crc kubenswrapper[4973]: I0320 14:20:19.586848 4973 generic.go:334] "Generic (PLEG): container finished" podID="14550a46-fa7f-4c8e-9c8a-0571e5843a03" containerID="26b0b66362d49216874441300d449d86c3938595223e5e4221ee2af751132bb7" exitCode=0 Mar 20 14:20:19 crc kubenswrapper[4973]: I0320 14:20:19.586948 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whsd8" event={"ID":"14550a46-fa7f-4c8e-9c8a-0571e5843a03","Type":"ContainerDied","Data":"26b0b66362d49216874441300d449d86c3938595223e5e4221ee2af751132bb7"} Mar 20 14:20:19 crc kubenswrapper[4973]: I0320 14:20:19.587273 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whsd8" event={"ID":"14550a46-fa7f-4c8e-9c8a-0571e5843a03","Type":"ContainerStarted","Data":"62d771ec90604479087eaa7f075840224b804a24cb44862198c9ca9c2ae8ebf2"} Mar 20 14:20:20 crc kubenswrapper[4973]: I0320 14:20:20.091174 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fw66n" podUID="4ae477ba-28ba-4345-9f0c-d409ce5d9c3b" containerName="registry-server" probeResult="failure" output=< Mar 20 14:20:20 crc 
kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:20:20 crc kubenswrapper[4973]: > Mar 20 14:20:21 crc kubenswrapper[4973]: I0320 14:20:21.608405 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whsd8" event={"ID":"14550a46-fa7f-4c8e-9c8a-0571e5843a03","Type":"ContainerStarted","Data":"4b25153e9bdc700749f59b71931faeb27815f25be49e2f4e725b85d3d1046314"} Mar 20 14:20:22 crc kubenswrapper[4973]: I0320 14:20:22.621088 4973 generic.go:334] "Generic (PLEG): container finished" podID="14550a46-fa7f-4c8e-9c8a-0571e5843a03" containerID="4b25153e9bdc700749f59b71931faeb27815f25be49e2f4e725b85d3d1046314" exitCode=0 Mar 20 14:20:22 crc kubenswrapper[4973]: I0320 14:20:22.621138 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whsd8" event={"ID":"14550a46-fa7f-4c8e-9c8a-0571e5843a03","Type":"ContainerDied","Data":"4b25153e9bdc700749f59b71931faeb27815f25be49e2f4e725b85d3d1046314"} Mar 20 14:20:23 crc kubenswrapper[4973]: I0320 14:20:23.638310 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whsd8" event={"ID":"14550a46-fa7f-4c8e-9c8a-0571e5843a03","Type":"ContainerStarted","Data":"87f3ba0a5fce9ad5cca4d2c4617a12b9cd19d33ceba17e1c4b86636a6bf4372f"} Mar 20 14:20:23 crc kubenswrapper[4973]: I0320 14:20:23.657839 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-whsd8" podStartSLOduration=3.046122925 podStartE2EDuration="6.657818201s" podCreationTimestamp="2026-03-20 14:20:17 +0000 UTC" firstStartedPulling="2026-03-20 14:20:19.588917661 +0000 UTC m=+3540.332587405" lastFinishedPulling="2026-03-20 14:20:23.200612937 +0000 UTC m=+3543.944282681" observedRunningTime="2026-03-20 14:20:23.65442541 +0000 UTC m=+3544.398095154" watchObservedRunningTime="2026-03-20 14:20:23.657818201 +0000 UTC m=+3544.401487945" Mar 20 14:20:28 crc 
kubenswrapper[4973]: I0320 14:20:28.265102 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-whsd8" Mar 20 14:20:28 crc kubenswrapper[4973]: I0320 14:20:28.265756 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-whsd8" Mar 20 14:20:28 crc kubenswrapper[4973]: I0320 14:20:28.323144 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-whsd8" Mar 20 14:20:28 crc kubenswrapper[4973]: I0320 14:20:28.743476 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-whsd8" Mar 20 14:20:28 crc kubenswrapper[4973]: I0320 14:20:28.806284 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-whsd8"] Mar 20 14:20:29 crc kubenswrapper[4973]: I0320 14:20:29.071884 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fw66n" Mar 20 14:20:29 crc kubenswrapper[4973]: I0320 14:20:29.121979 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fw66n" Mar 20 14:20:30 crc kubenswrapper[4973]: I0320 14:20:30.708223 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-whsd8" podUID="14550a46-fa7f-4c8e-9c8a-0571e5843a03" containerName="registry-server" containerID="cri-o://87f3ba0a5fce9ad5cca4d2c4617a12b9cd19d33ceba17e1c4b86636a6bf4372f" gracePeriod=2 Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.015301 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fw66n"] Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.016292 4973 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-fw66n" podUID="4ae477ba-28ba-4345-9f0c-d409ce5d9c3b" containerName="registry-server" containerID="cri-o://0bdb96791bff61d1ef2268d8da98ee5a1c1aa610256e35fc2f7f8d85c8993fef" gracePeriod=2 Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.473613 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whsd8" Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.622780 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fw66n" Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.631863 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szfhv\" (UniqueName: \"kubernetes.io/projected/14550a46-fa7f-4c8e-9c8a-0571e5843a03-kube-api-access-szfhv\") pod \"14550a46-fa7f-4c8e-9c8a-0571e5843a03\" (UID: \"14550a46-fa7f-4c8e-9c8a-0571e5843a03\") " Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.631964 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14550a46-fa7f-4c8e-9c8a-0571e5843a03-catalog-content\") pod \"14550a46-fa7f-4c8e-9c8a-0571e5843a03\" (UID: \"14550a46-fa7f-4c8e-9c8a-0571e5843a03\") " Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.632154 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14550a46-fa7f-4c8e-9c8a-0571e5843a03-utilities\") pod \"14550a46-fa7f-4c8e-9c8a-0571e5843a03\" (UID: \"14550a46-fa7f-4c8e-9c8a-0571e5843a03\") " Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.638371 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14550a46-fa7f-4c8e-9c8a-0571e5843a03-utilities" (OuterVolumeSpecName: "utilities") pod "14550a46-fa7f-4c8e-9c8a-0571e5843a03" (UID: 
"14550a46-fa7f-4c8e-9c8a-0571e5843a03"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.644909 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14550a46-fa7f-4c8e-9c8a-0571e5843a03-kube-api-access-szfhv" (OuterVolumeSpecName: "kube-api-access-szfhv") pod "14550a46-fa7f-4c8e-9c8a-0571e5843a03" (UID: "14550a46-fa7f-4c8e-9c8a-0571e5843a03"). InnerVolumeSpecName "kube-api-access-szfhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.705511 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14550a46-fa7f-4c8e-9c8a-0571e5843a03-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "14550a46-fa7f-4c8e-9c8a-0571e5843a03" (UID: "14550a46-fa7f-4c8e-9c8a-0571e5843a03"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.719322 4973 generic.go:334] "Generic (PLEG): container finished" podID="14550a46-fa7f-4c8e-9c8a-0571e5843a03" containerID="87f3ba0a5fce9ad5cca4d2c4617a12b9cd19d33ceba17e1c4b86636a6bf4372f" exitCode=0 Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.719492 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whsd8" event={"ID":"14550a46-fa7f-4c8e-9c8a-0571e5843a03","Type":"ContainerDied","Data":"87f3ba0a5fce9ad5cca4d2c4617a12b9cd19d33ceba17e1c4b86636a6bf4372f"} Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.719528 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whsd8" event={"ID":"14550a46-fa7f-4c8e-9c8a-0571e5843a03","Type":"ContainerDied","Data":"62d771ec90604479087eaa7f075840224b804a24cb44862198c9ca9c2ae8ebf2"} Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.719571 
4973 scope.go:117] "RemoveContainer" containerID="87f3ba0a5fce9ad5cca4d2c4617a12b9cd19d33ceba17e1c4b86636a6bf4372f" Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.719805 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whsd8" Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.727121 4973 generic.go:334] "Generic (PLEG): container finished" podID="4ae477ba-28ba-4345-9f0c-d409ce5d9c3b" containerID="0bdb96791bff61d1ef2268d8da98ee5a1c1aa610256e35fc2f7f8d85c8993fef" exitCode=0 Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.727170 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw66n" event={"ID":"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b","Type":"ContainerDied","Data":"0bdb96791bff61d1ef2268d8da98ee5a1c1aa610256e35fc2f7f8d85c8993fef"} Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.727240 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fw66n" event={"ID":"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b","Type":"ContainerDied","Data":"7018f721b1f2fb4fb5c9bdd08cbf20cf01ac9d5fe1221e1fd5d25ff11fe6a8ed"} Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.727314 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fw66n" Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.733783 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae477ba-28ba-4345-9f0c-d409ce5d9c3b-catalog-content\") pod \"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b\" (UID: \"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b\") " Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.734199 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2gcwf\" (UniqueName: \"kubernetes.io/projected/4ae477ba-28ba-4345-9f0c-d409ce5d9c3b-kube-api-access-2gcwf\") pod \"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b\" (UID: \"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b\") " Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.734456 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae477ba-28ba-4345-9f0c-d409ce5d9c3b-utilities\") pod \"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b\" (UID: \"4ae477ba-28ba-4345-9f0c-d409ce5d9c3b\") " Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.735219 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szfhv\" (UniqueName: \"kubernetes.io/projected/14550a46-fa7f-4c8e-9c8a-0571e5843a03-kube-api-access-szfhv\") on node \"crc\" DevicePath \"\"" Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.735241 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/14550a46-fa7f-4c8e-9c8a-0571e5843a03-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.735255 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/14550a46-fa7f-4c8e-9c8a-0571e5843a03-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:20:31 crc kubenswrapper[4973]: 
I0320 14:20:31.736012 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae477ba-28ba-4345-9f0c-d409ce5d9c3b-utilities" (OuterVolumeSpecName: "utilities") pod "4ae477ba-28ba-4345-9f0c-d409ce5d9c3b" (UID: "4ae477ba-28ba-4345-9f0c-d409ce5d9c3b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.740885 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ae477ba-28ba-4345-9f0c-d409ce5d9c3b-kube-api-access-2gcwf" (OuterVolumeSpecName: "kube-api-access-2gcwf") pod "4ae477ba-28ba-4345-9f0c-d409ce5d9c3b" (UID: "4ae477ba-28ba-4345-9f0c-d409ce5d9c3b"). InnerVolumeSpecName "kube-api-access-2gcwf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.749410 4973 scope.go:117] "RemoveContainer" containerID="4b25153e9bdc700749f59b71931faeb27815f25be49e2f4e725b85d3d1046314"
Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.772854 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-whsd8"]
Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.783904 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-whsd8"]
Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.797461 4973 scope.go:117] "RemoveContainer" containerID="26b0b66362d49216874441300d449d86c3938595223e5e4221ee2af751132bb7"
Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.846748 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2gcwf\" (UniqueName: \"kubernetes.io/projected/4ae477ba-28ba-4345-9f0c-d409ce5d9c3b-kube-api-access-2gcwf\") on node \"crc\" DevicePath \"\""
Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.846788 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4ae477ba-28ba-4345-9f0c-d409ce5d9c3b-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.846845 4973 scope.go:117] "RemoveContainer" containerID="87f3ba0a5fce9ad5cca4d2c4617a12b9cd19d33ceba17e1c4b86636a6bf4372f"
Mar 20 14:20:31 crc kubenswrapper[4973]: E0320 14:20:31.847486 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87f3ba0a5fce9ad5cca4d2c4617a12b9cd19d33ceba17e1c4b86636a6bf4372f\": container with ID starting with 87f3ba0a5fce9ad5cca4d2c4617a12b9cd19d33ceba17e1c4b86636a6bf4372f not found: ID does not exist" containerID="87f3ba0a5fce9ad5cca4d2c4617a12b9cd19d33ceba17e1c4b86636a6bf4372f"
Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.847542 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87f3ba0a5fce9ad5cca4d2c4617a12b9cd19d33ceba17e1c4b86636a6bf4372f"} err="failed to get container status \"87f3ba0a5fce9ad5cca4d2c4617a12b9cd19d33ceba17e1c4b86636a6bf4372f\": rpc error: code = NotFound desc = could not find container \"87f3ba0a5fce9ad5cca4d2c4617a12b9cd19d33ceba17e1c4b86636a6bf4372f\": container with ID starting with 87f3ba0a5fce9ad5cca4d2c4617a12b9cd19d33ceba17e1c4b86636a6bf4372f not found: ID does not exist"
Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.847581 4973 scope.go:117] "RemoveContainer" containerID="4b25153e9bdc700749f59b71931faeb27815f25be49e2f4e725b85d3d1046314"
Mar 20 14:20:31 crc kubenswrapper[4973]: E0320 14:20:31.848437 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b25153e9bdc700749f59b71931faeb27815f25be49e2f4e725b85d3d1046314\": container with ID starting with 4b25153e9bdc700749f59b71931faeb27815f25be49e2f4e725b85d3d1046314 not found: ID does not exist" containerID="4b25153e9bdc700749f59b71931faeb27815f25be49e2f4e725b85d3d1046314"
Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.848489 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b25153e9bdc700749f59b71931faeb27815f25be49e2f4e725b85d3d1046314"} err="failed to get container status \"4b25153e9bdc700749f59b71931faeb27815f25be49e2f4e725b85d3d1046314\": rpc error: code = NotFound desc = could not find container \"4b25153e9bdc700749f59b71931faeb27815f25be49e2f4e725b85d3d1046314\": container with ID starting with 4b25153e9bdc700749f59b71931faeb27815f25be49e2f4e725b85d3d1046314 not found: ID does not exist"
Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.848522 4973 scope.go:117] "RemoveContainer" containerID="26b0b66362d49216874441300d449d86c3938595223e5e4221ee2af751132bb7"
Mar 20 14:20:31 crc kubenswrapper[4973]: E0320 14:20:31.849044 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26b0b66362d49216874441300d449d86c3938595223e5e4221ee2af751132bb7\": container with ID starting with 26b0b66362d49216874441300d449d86c3938595223e5e4221ee2af751132bb7 not found: ID does not exist" containerID="26b0b66362d49216874441300d449d86c3938595223e5e4221ee2af751132bb7"
Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.849080 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26b0b66362d49216874441300d449d86c3938595223e5e4221ee2af751132bb7"} err="failed to get container status \"26b0b66362d49216874441300d449d86c3938595223e5e4221ee2af751132bb7\": rpc error: code = NotFound desc = could not find container \"26b0b66362d49216874441300d449d86c3938595223e5e4221ee2af751132bb7\": container with ID starting with 26b0b66362d49216874441300d449d86c3938595223e5e4221ee2af751132bb7 not found: ID does not exist"
Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.849101 4973 scope.go:117] "RemoveContainer" containerID="0bdb96791bff61d1ef2268d8da98ee5a1c1aa610256e35fc2f7f8d85c8993fef"
Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.907801 4973 scope.go:117] "RemoveContainer" containerID="5403f0da20a835782b1497a8307f019d2f3a43b6434abac93ac664ffa36ad56c"
Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.907923 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4ae477ba-28ba-4345-9f0c-d409ce5d9c3b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4ae477ba-28ba-4345-9f0c-d409ce5d9c3b" (UID: "4ae477ba-28ba-4345-9f0c-d409ce5d9c3b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.948938 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4ae477ba-28ba-4345-9f0c-d409ce5d9c3b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.951982 4973 scope.go:117] "RemoveContainer" containerID="8f1a6d33e8545d259ef5e777aa383ffdbf8a97189c717885fbd5f5ba6acc28ce"
Mar 20 14:20:31 crc kubenswrapper[4973]: I0320 14:20:31.965579 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14550a46-fa7f-4c8e-9c8a-0571e5843a03" path="/var/lib/kubelet/pods/14550a46-fa7f-4c8e-9c8a-0571e5843a03/volumes"
Mar 20 14:20:32 crc kubenswrapper[4973]: I0320 14:20:32.001331 4973 scope.go:117] "RemoveContainer" containerID="0bdb96791bff61d1ef2268d8da98ee5a1c1aa610256e35fc2f7f8d85c8993fef"
Mar 20 14:20:32 crc kubenswrapper[4973]: E0320 14:20:32.001808 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bdb96791bff61d1ef2268d8da98ee5a1c1aa610256e35fc2f7f8d85c8993fef\": container with ID starting with 0bdb96791bff61d1ef2268d8da98ee5a1c1aa610256e35fc2f7f8d85c8993fef not found: ID does not exist" containerID="0bdb96791bff61d1ef2268d8da98ee5a1c1aa610256e35fc2f7f8d85c8993fef"
Mar 20 14:20:32 crc kubenswrapper[4973]: I0320 14:20:32.001852 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bdb96791bff61d1ef2268d8da98ee5a1c1aa610256e35fc2f7f8d85c8993fef"} err="failed to get container status \"0bdb96791bff61d1ef2268d8da98ee5a1c1aa610256e35fc2f7f8d85c8993fef\": rpc error: code = NotFound desc = could not find container \"0bdb96791bff61d1ef2268d8da98ee5a1c1aa610256e35fc2f7f8d85c8993fef\": container with ID starting with 0bdb96791bff61d1ef2268d8da98ee5a1c1aa610256e35fc2f7f8d85c8993fef not found: ID does not exist"
Mar 20 14:20:32 crc kubenswrapper[4973]: I0320 14:20:32.001883 4973 scope.go:117] "RemoveContainer" containerID="5403f0da20a835782b1497a8307f019d2f3a43b6434abac93ac664ffa36ad56c"
Mar 20 14:20:32 crc kubenswrapper[4973]: E0320 14:20:32.002327 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5403f0da20a835782b1497a8307f019d2f3a43b6434abac93ac664ffa36ad56c\": container with ID starting with 5403f0da20a835782b1497a8307f019d2f3a43b6434abac93ac664ffa36ad56c not found: ID does not exist" containerID="5403f0da20a835782b1497a8307f019d2f3a43b6434abac93ac664ffa36ad56c"
Mar 20 14:20:32 crc kubenswrapper[4973]: I0320 14:20:32.002445 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5403f0da20a835782b1497a8307f019d2f3a43b6434abac93ac664ffa36ad56c"} err="failed to get container status \"5403f0da20a835782b1497a8307f019d2f3a43b6434abac93ac664ffa36ad56c\": rpc error: code = NotFound desc = could not find container \"5403f0da20a835782b1497a8307f019d2f3a43b6434abac93ac664ffa36ad56c\": container with ID starting with 5403f0da20a835782b1497a8307f019d2f3a43b6434abac93ac664ffa36ad56c not found: ID does not exist"
Mar 20 14:20:32 crc kubenswrapper[4973]: I0320 14:20:32.002517 4973 scope.go:117] "RemoveContainer" containerID="8f1a6d33e8545d259ef5e777aa383ffdbf8a97189c717885fbd5f5ba6acc28ce"
Mar 20 14:20:32 crc kubenswrapper[4973]: E0320 14:20:32.002975 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f1a6d33e8545d259ef5e777aa383ffdbf8a97189c717885fbd5f5ba6acc28ce\": container with ID starting with 8f1a6d33e8545d259ef5e777aa383ffdbf8a97189c717885fbd5f5ba6acc28ce not found: ID does not exist" containerID="8f1a6d33e8545d259ef5e777aa383ffdbf8a97189c717885fbd5f5ba6acc28ce"
Mar 20 14:20:32 crc kubenswrapper[4973]: I0320 14:20:32.003013 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f1a6d33e8545d259ef5e777aa383ffdbf8a97189c717885fbd5f5ba6acc28ce"} err="failed to get container status \"8f1a6d33e8545d259ef5e777aa383ffdbf8a97189c717885fbd5f5ba6acc28ce\": rpc error: code = NotFound desc = could not find container \"8f1a6d33e8545d259ef5e777aa383ffdbf8a97189c717885fbd5f5ba6acc28ce\": container with ID starting with 8f1a6d33e8545d259ef5e777aa383ffdbf8a97189c717885fbd5f5ba6acc28ce not found: ID does not exist"
Mar 20 14:20:32 crc kubenswrapper[4973]: I0320 14:20:32.064518 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fw66n"]
Mar 20 14:20:32 crc kubenswrapper[4973]: I0320 14:20:32.078321 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fw66n"]
Mar 20 14:20:33 crc kubenswrapper[4973]: I0320 14:20:33.962454 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ae477ba-28ba-4345-9f0c-d409ce5d9c3b" path="/var/lib/kubelet/pods/4ae477ba-28ba-4345-9f0c-d409ce5d9c3b/volumes"
Mar 20 14:21:02 crc kubenswrapper[4973]: I0320 14:21:02.606502 4973 scope.go:117] "RemoveContainer" containerID="fa40686177aba5b69526a727c7166004903b53e33471e9d367d892ed3b318c73"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.565225 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4kn9k"]
Mar 20 14:21:48 crc kubenswrapper[4973]: E0320 14:21:48.570177 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14550a46-fa7f-4c8e-9c8a-0571e5843a03" containerName="extract-utilities"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.570516 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="14550a46-fa7f-4c8e-9c8a-0571e5843a03" containerName="extract-utilities"
Mar 20 14:21:48 crc kubenswrapper[4973]: E0320 14:21:48.570614 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14550a46-fa7f-4c8e-9c8a-0571e5843a03" containerName="registry-server"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.570693 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="14550a46-fa7f-4c8e-9c8a-0571e5843a03" containerName="registry-server"
Mar 20 14:21:48 crc kubenswrapper[4973]: E0320 14:21:48.570776 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="14550a46-fa7f-4c8e-9c8a-0571e5843a03" containerName="extract-content"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.570848 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="14550a46-fa7f-4c8e-9c8a-0571e5843a03" containerName="extract-content"
Mar 20 14:21:48 crc kubenswrapper[4973]: E0320 14:21:48.570954 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae477ba-28ba-4345-9f0c-d409ce5d9c3b" containerName="extract-utilities"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.571034 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae477ba-28ba-4345-9f0c-d409ce5d9c3b" containerName="extract-utilities"
Mar 20 14:21:48 crc kubenswrapper[4973]: E0320 14:21:48.571130 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae477ba-28ba-4345-9f0c-d409ce5d9c3b" containerName="registry-server"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.571202 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae477ba-28ba-4345-9f0c-d409ce5d9c3b" containerName="registry-server"
Mar 20 14:21:48 crc kubenswrapper[4973]: E0320 14:21:48.571290 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ae477ba-28ba-4345-9f0c-d409ce5d9c3b" containerName="extract-content"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.571389 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ae477ba-28ba-4345-9f0c-d409ce5d9c3b" containerName="extract-content"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.571818 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ae477ba-28ba-4345-9f0c-d409ce5d9c3b" containerName="registry-server"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.571938 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="14550a46-fa7f-4c8e-9c8a-0571e5843a03" containerName="registry-server"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.574556 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4kn9k"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.583216 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4kn9k"]
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.705449 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4137699f-d786-49c9-a1ed-356904406345-utilities\") pod \"community-operators-4kn9k\" (UID: \"4137699f-d786-49c9-a1ed-356904406345\") " pod="openshift-marketplace/community-operators-4kn9k"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.705646 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4137699f-d786-49c9-a1ed-356904406345-catalog-content\") pod \"community-operators-4kn9k\" (UID: \"4137699f-d786-49c9-a1ed-356904406345\") " pod="openshift-marketplace/community-operators-4kn9k"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.705719 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s79b2\" (UniqueName: \"kubernetes.io/projected/4137699f-d786-49c9-a1ed-356904406345-kube-api-access-s79b2\") pod \"community-operators-4kn9k\" (UID: \"4137699f-d786-49c9-a1ed-356904406345\") " pod="openshift-marketplace/community-operators-4kn9k"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.808299 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4137699f-d786-49c9-a1ed-356904406345-utilities\") pod \"community-operators-4kn9k\" (UID: \"4137699f-d786-49c9-a1ed-356904406345\") " pod="openshift-marketplace/community-operators-4kn9k"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.808913 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4137699f-d786-49c9-a1ed-356904406345-catalog-content\") pod \"community-operators-4kn9k\" (UID: \"4137699f-d786-49c9-a1ed-356904406345\") " pod="openshift-marketplace/community-operators-4kn9k"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.808922 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4137699f-d786-49c9-a1ed-356904406345-utilities\") pod \"community-operators-4kn9k\" (UID: \"4137699f-d786-49c9-a1ed-356904406345\") " pod="openshift-marketplace/community-operators-4kn9k"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.808965 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s79b2\" (UniqueName: \"kubernetes.io/projected/4137699f-d786-49c9-a1ed-356904406345-kube-api-access-s79b2\") pod \"community-operators-4kn9k\" (UID: \"4137699f-d786-49c9-a1ed-356904406345\") " pod="openshift-marketplace/community-operators-4kn9k"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.809414 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4137699f-d786-49c9-a1ed-356904406345-catalog-content\") pod \"community-operators-4kn9k\" (UID: \"4137699f-d786-49c9-a1ed-356904406345\") " pod="openshift-marketplace/community-operators-4kn9k"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.836149 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s79b2\" (UniqueName: \"kubernetes.io/projected/4137699f-d786-49c9-a1ed-356904406345-kube-api-access-s79b2\") pod \"community-operators-4kn9k\" (UID: \"4137699f-d786-49c9-a1ed-356904406345\") " pod="openshift-marketplace/community-operators-4kn9k"
Mar 20 14:21:48 crc kubenswrapper[4973]: I0320 14:21:48.905289 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4kn9k"
Mar 20 14:21:49 crc kubenswrapper[4973]: I0320 14:21:49.629849 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4kn9k"]
Mar 20 14:21:49 crc kubenswrapper[4973]: I0320 14:21:49.661177 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kn9k" event={"ID":"4137699f-d786-49c9-a1ed-356904406345","Type":"ContainerStarted","Data":"1e0b77dcd683fb0918353970f705ea21e8bcb3cc0fb95dcd41cda8862de817fc"}
Mar 20 14:21:50 crc kubenswrapper[4973]: I0320 14:21:50.677889 4973 generic.go:334] "Generic (PLEG): container finished" podID="4137699f-d786-49c9-a1ed-356904406345" containerID="1e622c51c632db51ba377fc65fe4e591497ece9fea7acd8be135b3e3490cef42" exitCode=0
Mar 20 14:21:50 crc kubenswrapper[4973]: I0320 14:21:50.677986 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kn9k" event={"ID":"4137699f-d786-49c9-a1ed-356904406345","Type":"ContainerDied","Data":"1e622c51c632db51ba377fc65fe4e591497ece9fea7acd8be135b3e3490cef42"}
Mar 20 14:21:52 crc kubenswrapper[4973]: I0320 14:21:52.703618 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kn9k" event={"ID":"4137699f-d786-49c9-a1ed-356904406345","Type":"ContainerStarted","Data":"d60fb70c2c14e976b851106ae3e4b4e5a056c961eb3e54c1ed6e600b22b00716"}
Mar 20 14:21:53 crc kubenswrapper[4973]: I0320 14:21:53.715142 4973 generic.go:334] "Generic (PLEG): container finished" podID="4137699f-d786-49c9-a1ed-356904406345" containerID="d60fb70c2c14e976b851106ae3e4b4e5a056c961eb3e54c1ed6e600b22b00716" exitCode=0
Mar 20 14:21:53 crc kubenswrapper[4973]: I0320 14:21:53.715352 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kn9k" event={"ID":"4137699f-d786-49c9-a1ed-356904406345","Type":"ContainerDied","Data":"d60fb70c2c14e976b851106ae3e4b4e5a056c961eb3e54c1ed6e600b22b00716"}
Mar 20 14:21:54 crc kubenswrapper[4973]: I0320 14:21:54.734243 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kn9k" event={"ID":"4137699f-d786-49c9-a1ed-356904406345","Type":"ContainerStarted","Data":"36d1df98f195692d941feff3acaf7b58d9b5231d5c3276a490ff0e4820dccf01"}
Mar 20 14:21:54 crc kubenswrapper[4973]: I0320 14:21:54.758796 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4kn9k" podStartSLOduration=3.35482517 podStartE2EDuration="6.75877007s" podCreationTimestamp="2026-03-20 14:21:48 +0000 UTC" firstStartedPulling="2026-03-20 14:21:50.680874857 +0000 UTC m=+3631.424544601" lastFinishedPulling="2026-03-20 14:21:54.084819747 +0000 UTC m=+3634.828489501" observedRunningTime="2026-03-20 14:21:54.748976486 +0000 UTC m=+3635.492646230" watchObservedRunningTime="2026-03-20 14:21:54.75877007 +0000 UTC m=+3635.502439814"
Mar 20 14:21:58 crc kubenswrapper[4973]: I0320 14:21:58.905926 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4kn9k"
Mar 20 14:21:58 crc kubenswrapper[4973]: I0320 14:21:58.906537 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4kn9k"
Mar 20 14:21:58 crc kubenswrapper[4973]: I0320 14:21:58.951790 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4kn9k"
Mar 20 14:21:59 crc kubenswrapper[4973]: I0320 14:21:59.846330 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4kn9k"
Mar 20 14:21:59 crc kubenswrapper[4973]: I0320 14:21:59.911905 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4kn9k"]
Mar 20 14:22:00 crc kubenswrapper[4973]: I0320 14:22:00.146251 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566942-pbsc9"]
Mar 20 14:22:00 crc kubenswrapper[4973]: I0320 14:22:00.151249 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566942-pbsc9"
Mar 20 14:22:00 crc kubenswrapper[4973]: I0320 14:22:00.155176 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc"
Mar 20 14:22:00 crc kubenswrapper[4973]: I0320 14:22:00.155515 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:22:00 crc kubenswrapper[4973]: I0320 14:22:00.162913 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:22:00 crc kubenswrapper[4973]: I0320 14:22:00.171455 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566942-pbsc9"]
Mar 20 14:22:00 crc kubenswrapper[4973]: I0320 14:22:00.227591 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4cj8\" (UniqueName: \"kubernetes.io/projected/3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0-kube-api-access-p4cj8\") pod \"auto-csr-approver-29566942-pbsc9\" (UID: \"3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0\") " pod="openshift-infra/auto-csr-approver-29566942-pbsc9"
Mar 20 14:22:00 crc kubenswrapper[4973]: I0320 14:22:00.329687 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4cj8\" (UniqueName: \"kubernetes.io/projected/3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0-kube-api-access-p4cj8\") pod \"auto-csr-approver-29566942-pbsc9\" (UID: \"3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0\") " pod="openshift-infra/auto-csr-approver-29566942-pbsc9"
Mar 20 14:22:00 crc kubenswrapper[4973]: I0320 14:22:00.348852 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4cj8\" (UniqueName: \"kubernetes.io/projected/3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0-kube-api-access-p4cj8\") pod \"auto-csr-approver-29566942-pbsc9\" (UID: \"3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0\") " pod="openshift-infra/auto-csr-approver-29566942-pbsc9"
Mar 20 14:22:00 crc kubenswrapper[4973]: I0320 14:22:00.473872 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566942-pbsc9"
Mar 20 14:22:00 crc kubenswrapper[4973]: I0320 14:22:00.953251 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566942-pbsc9"]
Mar 20 14:22:01 crc kubenswrapper[4973]: I0320 14:22:01.819760 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566942-pbsc9" event={"ID":"3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0","Type":"ContainerStarted","Data":"8e1a93e1e468502350a0ef0c9d785ddc3bb3901b48b576a890224d3f001b4950"}
Mar 20 14:22:01 crc kubenswrapper[4973]: I0320 14:22:01.819968 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4kn9k" podUID="4137699f-d786-49c9-a1ed-356904406345" containerName="registry-server" containerID="cri-o://36d1df98f195692d941feff3acaf7b58d9b5231d5c3276a490ff0e4820dccf01" gracePeriod=2
Mar 20 14:22:02 crc kubenswrapper[4973]: E0320 14:22:02.150024 4973 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4137699f_d786_49c9_a1ed_356904406345.slice/crio-conmon-36d1df98f195692d941feff3acaf7b58d9b5231d5c3276a490ff0e4820dccf01.scope\": RecentStats: unable to find data in memory cache]"
Mar 20 14:22:02 crc kubenswrapper[4973]: I0320 14:22:02.420914 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4kn9k"
Mar 20 14:22:02 crc kubenswrapper[4973]: I0320 14:22:02.488599 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4137699f-d786-49c9-a1ed-356904406345-utilities\") pod \"4137699f-d786-49c9-a1ed-356904406345\" (UID: \"4137699f-d786-49c9-a1ed-356904406345\") "
Mar 20 14:22:02 crc kubenswrapper[4973]: I0320 14:22:02.488663 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s79b2\" (UniqueName: \"kubernetes.io/projected/4137699f-d786-49c9-a1ed-356904406345-kube-api-access-s79b2\") pod \"4137699f-d786-49c9-a1ed-356904406345\" (UID: \"4137699f-d786-49c9-a1ed-356904406345\") "
Mar 20 14:22:02 crc kubenswrapper[4973]: I0320 14:22:02.488691 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4137699f-d786-49c9-a1ed-356904406345-catalog-content\") pod \"4137699f-d786-49c9-a1ed-356904406345\" (UID: \"4137699f-d786-49c9-a1ed-356904406345\") "
Mar 20 14:22:02 crc kubenswrapper[4973]: I0320 14:22:02.489691 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4137699f-d786-49c9-a1ed-356904406345-utilities" (OuterVolumeSpecName: "utilities") pod "4137699f-d786-49c9-a1ed-356904406345" (UID: "4137699f-d786-49c9-a1ed-356904406345"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:22:02 crc kubenswrapper[4973]: I0320 14:22:02.495737 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4137699f-d786-49c9-a1ed-356904406345-kube-api-access-s79b2" (OuterVolumeSpecName: "kube-api-access-s79b2") pod "4137699f-d786-49c9-a1ed-356904406345" (UID: "4137699f-d786-49c9-a1ed-356904406345"). InnerVolumeSpecName "kube-api-access-s79b2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:22:02 crc kubenswrapper[4973]: I0320 14:22:02.562416 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4137699f-d786-49c9-a1ed-356904406345-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4137699f-d786-49c9-a1ed-356904406345" (UID: "4137699f-d786-49c9-a1ed-356904406345"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:22:02 crc kubenswrapper[4973]: I0320 14:22:02.591211 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4137699f-d786-49c9-a1ed-356904406345-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 14:22:02 crc kubenswrapper[4973]: I0320 14:22:02.591247 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s79b2\" (UniqueName: \"kubernetes.io/projected/4137699f-d786-49c9-a1ed-356904406345-kube-api-access-s79b2\") on node \"crc\" DevicePath \"\""
Mar 20 14:22:02 crc kubenswrapper[4973]: I0320 14:22:02.591259 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4137699f-d786-49c9-a1ed-356904406345-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 14:22:02 crc kubenswrapper[4973]: I0320 14:22:02.834981 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566942-pbsc9" event={"ID":"3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0","Type":"ContainerStarted","Data":"7ed7dfaddaa082b39e3f444001674cf5b25bd929bdfd7ca29f46d28f16eb9c36"}
Mar 20 14:22:02 crc kubenswrapper[4973]: I0320 14:22:02.837721 4973 generic.go:334] "Generic (PLEG): container finished" podID="4137699f-d786-49c9-a1ed-356904406345" containerID="36d1df98f195692d941feff3acaf7b58d9b5231d5c3276a490ff0e4820dccf01" exitCode=0
Mar 20 14:22:02 crc kubenswrapper[4973]: I0320 14:22:02.837774 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kn9k" event={"ID":"4137699f-d786-49c9-a1ed-356904406345","Type":"ContainerDied","Data":"36d1df98f195692d941feff3acaf7b58d9b5231d5c3276a490ff0e4820dccf01"}
Mar 20 14:22:02 crc kubenswrapper[4973]: I0320 14:22:02.837807 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4kn9k" event={"ID":"4137699f-d786-49c9-a1ed-356904406345","Type":"ContainerDied","Data":"1e0b77dcd683fb0918353970f705ea21e8bcb3cc0fb95dcd41cda8862de817fc"}
Mar 20 14:22:02 crc kubenswrapper[4973]: I0320 14:22:02.837829 4973 scope.go:117] "RemoveContainer" containerID="36d1df98f195692d941feff3acaf7b58d9b5231d5c3276a490ff0e4820dccf01"
Mar 20 14:22:02 crc kubenswrapper[4973]: I0320 14:22:02.837979 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4kn9k"
Mar 20 14:22:02 crc kubenswrapper[4973]: I0320 14:22:02.861948 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566942-pbsc9" podStartSLOduration=1.8661280059999998 podStartE2EDuration="2.861930806s" podCreationTimestamp="2026-03-20 14:22:00 +0000 UTC" firstStartedPulling="2026-03-20 14:22:00.963138805 +0000 UTC m=+3641.706808549" lastFinishedPulling="2026-03-20 14:22:01.958941615 +0000 UTC m=+3642.702611349" observedRunningTime="2026-03-20 14:22:02.858526594 +0000 UTC m=+3643.602196378" watchObservedRunningTime="2026-03-20 14:22:02.861930806 +0000 UTC m=+3643.605600550"
Mar 20 14:22:02 crc kubenswrapper[4973]: I0320 14:22:02.915240 4973 scope.go:117] "RemoveContainer" containerID="d60fb70c2c14e976b851106ae3e4b4e5a056c961eb3e54c1ed6e600b22b00716"
Mar 20 14:22:03 crc kubenswrapper[4973]: I0320 14:22:03.012500 4973 scope.go:117] "RemoveContainer" containerID="1e622c51c632db51ba377fc65fe4e591497ece9fea7acd8be135b3e3490cef42"
Mar 20 14:22:03 crc kubenswrapper[4973]: I0320 14:22:03.063575 4973 scope.go:117] "RemoveContainer" containerID="36d1df98f195692d941feff3acaf7b58d9b5231d5c3276a490ff0e4820dccf01"
Mar 20 14:22:03 crc kubenswrapper[4973]: E0320 14:22:03.088463 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36d1df98f195692d941feff3acaf7b58d9b5231d5c3276a490ff0e4820dccf01\": container with ID starting with 36d1df98f195692d941feff3acaf7b58d9b5231d5c3276a490ff0e4820dccf01 not found: ID does not exist" containerID="36d1df98f195692d941feff3acaf7b58d9b5231d5c3276a490ff0e4820dccf01"
Mar 20 14:22:03 crc kubenswrapper[4973]: I0320 14:22:03.088519 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36d1df98f195692d941feff3acaf7b58d9b5231d5c3276a490ff0e4820dccf01"} err="failed to get container status \"36d1df98f195692d941feff3acaf7b58d9b5231d5c3276a490ff0e4820dccf01\": rpc error: code = NotFound desc = could not find container \"36d1df98f195692d941feff3acaf7b58d9b5231d5c3276a490ff0e4820dccf01\": container with ID starting with 36d1df98f195692d941feff3acaf7b58d9b5231d5c3276a490ff0e4820dccf01 not found: ID does not exist"
Mar 20 14:22:03 crc kubenswrapper[4973]: I0320 14:22:03.088550 4973 scope.go:117] "RemoveContainer" containerID="d60fb70c2c14e976b851106ae3e4b4e5a056c961eb3e54c1ed6e600b22b00716"
Mar 20 14:22:03 crc kubenswrapper[4973]: E0320 14:22:03.092590 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d60fb70c2c14e976b851106ae3e4b4e5a056c961eb3e54c1ed6e600b22b00716\": container with ID starting with d60fb70c2c14e976b851106ae3e4b4e5a056c961eb3e54c1ed6e600b22b00716 not found: ID does not exist" containerID="d60fb70c2c14e976b851106ae3e4b4e5a056c961eb3e54c1ed6e600b22b00716"
Mar 20 14:22:03 crc kubenswrapper[4973]: I0320 14:22:03.092669 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d60fb70c2c14e976b851106ae3e4b4e5a056c961eb3e54c1ed6e600b22b00716"} err="failed to get container status \"d60fb70c2c14e976b851106ae3e4b4e5a056c961eb3e54c1ed6e600b22b00716\": rpc error: code = NotFound desc = could not find container \"d60fb70c2c14e976b851106ae3e4b4e5a056c961eb3e54c1ed6e600b22b00716\": container with ID starting with d60fb70c2c14e976b851106ae3e4b4e5a056c961eb3e54c1ed6e600b22b00716 not found: ID does not exist"
Mar 20 14:22:03 crc kubenswrapper[4973]: I0320 14:22:03.092704 4973 scope.go:117] "RemoveContainer" containerID="1e622c51c632db51ba377fc65fe4e591497ece9fea7acd8be135b3e3490cef42"
Mar 20 14:22:03 crc kubenswrapper[4973]: E0320 14:22:03.100682 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e622c51c632db51ba377fc65fe4e591497ece9fea7acd8be135b3e3490cef42\": container with ID starting with 1e622c51c632db51ba377fc65fe4e591497ece9fea7acd8be135b3e3490cef42 not found: ID does not exist" containerID="1e622c51c632db51ba377fc65fe4e591497ece9fea7acd8be135b3e3490cef42"
Mar 20 14:22:03 crc kubenswrapper[4973]: I0320 14:22:03.100723 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e622c51c632db51ba377fc65fe4e591497ece9fea7acd8be135b3e3490cef42"} err="failed to get container status \"1e622c51c632db51ba377fc65fe4e591497ece9fea7acd8be135b3e3490cef42\": rpc error: code = NotFound desc = could not find container \"1e622c51c632db51ba377fc65fe4e591497ece9fea7acd8be135b3e3490cef42\": container with ID starting with 1e622c51c632db51ba377fc65fe4e591497ece9fea7acd8be135b3e3490cef42 not found: ID does not exist"
Mar 20 14:22:03 crc kubenswrapper[4973]: I0320 14:22:03.113373 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4kn9k"]
Mar 20 14:22:03 crc kubenswrapper[4973]: I0320 14:22:03.123617 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4kn9k"]
Mar 20 14:22:03 crc kubenswrapper[4973]: I0320 14:22:03.850693 4973 generic.go:334] "Generic (PLEG): container finished" podID="3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0" containerID="7ed7dfaddaa082b39e3f444001674cf5b25bd929bdfd7ca29f46d28f16eb9c36" exitCode=0
Mar 20 14:22:03 crc kubenswrapper[4973]: I0320 14:22:03.851052 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566942-pbsc9" event={"ID":"3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0","Type":"ContainerDied","Data":"7ed7dfaddaa082b39e3f444001674cf5b25bd929bdfd7ca29f46d28f16eb9c36"}
Mar 20 14:22:03 crc kubenswrapper[4973]: I0320 14:22:03.963968 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4137699f-d786-49c9-a1ed-356904406345" path="/var/lib/kubelet/pods/4137699f-d786-49c9-a1ed-356904406345/volumes"
Mar 20 14:22:05 crc kubenswrapper[4973]: I0320 14:22:05.285091 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566942-pbsc9"
Mar 20 14:22:05 crc kubenswrapper[4973]: I0320 14:22:05.439696 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4cj8\" (UniqueName: \"kubernetes.io/projected/3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0-kube-api-access-p4cj8\") pod \"3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0\" (UID: \"3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0\") "
Mar 20 14:22:05 crc kubenswrapper[4973]: I0320 14:22:05.445838 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0-kube-api-access-p4cj8" (OuterVolumeSpecName: "kube-api-access-p4cj8") pod "3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0" (UID: "3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0"). InnerVolumeSpecName "kube-api-access-p4cj8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:22:05 crc kubenswrapper[4973]: I0320 14:22:05.543073 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4cj8\" (UniqueName: \"kubernetes.io/projected/3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0-kube-api-access-p4cj8\") on node \"crc\" DevicePath \"\""
Mar 20 14:22:05 crc kubenswrapper[4973]: I0320 14:22:05.874386 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566942-pbsc9" event={"ID":"3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0","Type":"ContainerDied","Data":"8e1a93e1e468502350a0ef0c9d785ddc3bb3901b48b576a890224d3f001b4950"}
Mar 20 14:22:05 crc kubenswrapper[4973]: I0320 14:22:05.874440 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e1a93e1e468502350a0ef0c9d785ddc3bb3901b48b576a890224d3f001b4950"
Mar 20 14:22:05 crc kubenswrapper[4973]: I0320 14:22:05.874548 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566942-pbsc9"
Mar 20 14:22:05 crc kubenswrapper[4973]: I0320 14:22:05.929942 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566936-x4v4f"]
Mar 20 14:22:05 crc kubenswrapper[4973]: I0320 14:22:05.942886 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566936-x4v4f"]
Mar 20 14:22:05 crc kubenswrapper[4973]: I0320 14:22:05.969766 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c0a71f0-a19b-4e23-8965-32ce944c0bf9" path="/var/lib/kubelet/pods/4c0a71f0-a19b-4e23-8965-32ce944c0bf9/volumes"
Mar 20 14:22:13 crc kubenswrapper[4973]: I0320 14:22:13.320734 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 14:22:13 crc kubenswrapper[4973]: I0320 14:22:13.321267 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:22:43 crc kubenswrapper[4973]: I0320 14:22:43.321035 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:22:43 crc kubenswrapper[4973]: I0320 14:22:43.322678 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:23:02 crc kubenswrapper[4973]: I0320 14:23:02.770990 4973 scope.go:117] "RemoveContainer" containerID="516236f733f42d5911189934e3b703f51b553506f17f629822bac4aa77544630" Mar 20 14:23:13 crc kubenswrapper[4973]: I0320 14:23:13.321092 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:23:13 crc kubenswrapper[4973]: I0320 14:23:13.322579 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:23:13 crc kubenswrapper[4973]: I0320 14:23:13.323401 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 14:23:13 crc kubenswrapper[4973]: I0320 14:23:13.325383 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:23:13 crc kubenswrapper[4973]: I0320 14:23:13.325534 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" containerID="cri-o://d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309" gracePeriod=600 Mar 20 14:23:13 crc kubenswrapper[4973]: E0320 14:23:13.455963 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:23:14 crc kubenswrapper[4973]: I0320 14:23:14.107481 4973 generic.go:334] "Generic (PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309" exitCode=0 Mar 20 14:23:14 crc kubenswrapper[4973]: I0320 14:23:14.107549 4973 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309"} Mar 20 14:23:14 crc kubenswrapper[4973]: I0320 14:23:14.108160 4973 scope.go:117] "RemoveContainer" containerID="bf957f9a512aaa4c34dde67075d8a36005e5457ba817ccbde176d2f86a603452" Mar 20 14:23:14 crc kubenswrapper[4973]: I0320 14:23:14.109327 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309" Mar 20 14:23:14 crc kubenswrapper[4973]: E0320 14:23:14.109857 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:23:26 crc kubenswrapper[4973]: I0320 14:23:26.954365 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309" Mar 20 14:23:26 crc kubenswrapper[4973]: E0320 14:23:26.955125 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:23:37 crc kubenswrapper[4973]: I0320 14:23:37.953871 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309" Mar 20 14:23:37 crc kubenswrapper[4973]: E0320 
14:23:37.954749 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:23:52 crc kubenswrapper[4973]: I0320 14:23:52.950939 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309" Mar 20 14:23:52 crc kubenswrapper[4973]: E0320 14:23:52.951869 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:24:00 crc kubenswrapper[4973]: I0320 14:24:00.148068 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566944-qjd9k"] Mar 20 14:24:00 crc kubenswrapper[4973]: E0320 14:24:00.149399 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4137699f-d786-49c9-a1ed-356904406345" containerName="extract-content" Mar 20 14:24:00 crc kubenswrapper[4973]: I0320 14:24:00.149417 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4137699f-d786-49c9-a1ed-356904406345" containerName="extract-content" Mar 20 14:24:00 crc kubenswrapper[4973]: E0320 14:24:00.149446 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0" containerName="oc" Mar 20 14:24:00 crc kubenswrapper[4973]: I0320 14:24:00.149456 4973 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0" containerName="oc" Mar 20 14:24:00 crc kubenswrapper[4973]: E0320 14:24:00.149488 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4137699f-d786-49c9-a1ed-356904406345" containerName="extract-utilities" Mar 20 14:24:00 crc kubenswrapper[4973]: I0320 14:24:00.149497 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4137699f-d786-49c9-a1ed-356904406345" containerName="extract-utilities" Mar 20 14:24:00 crc kubenswrapper[4973]: E0320 14:24:00.149518 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4137699f-d786-49c9-a1ed-356904406345" containerName="registry-server" Mar 20 14:24:00 crc kubenswrapper[4973]: I0320 14:24:00.149525 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4137699f-d786-49c9-a1ed-356904406345" containerName="registry-server" Mar 20 14:24:00 crc kubenswrapper[4973]: I0320 14:24:00.149778 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="4137699f-d786-49c9-a1ed-356904406345" containerName="registry-server" Mar 20 14:24:00 crc kubenswrapper[4973]: I0320 14:24:00.149796 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0" containerName="oc" Mar 20 14:24:00 crc kubenswrapper[4973]: I0320 14:24:00.150606 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566944-qjd9k" Mar 20 14:24:00 crc kubenswrapper[4973]: I0320 14:24:00.153521 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:24:00 crc kubenswrapper[4973]: I0320 14:24:00.153569 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:24:00 crc kubenswrapper[4973]: I0320 14:24:00.155730 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:24:00 crc kubenswrapper[4973]: I0320 14:24:00.166100 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566944-qjd9k"] Mar 20 14:24:00 crc kubenswrapper[4973]: I0320 14:24:00.316874 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2td5\" (UniqueName: \"kubernetes.io/projected/73864548-f1e6-4aab-9caa-3d463d39b738-kube-api-access-f2td5\") pod \"auto-csr-approver-29566944-qjd9k\" (UID: \"73864548-f1e6-4aab-9caa-3d463d39b738\") " pod="openshift-infra/auto-csr-approver-29566944-qjd9k" Mar 20 14:24:00 crc kubenswrapper[4973]: I0320 14:24:00.420147 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2td5\" (UniqueName: \"kubernetes.io/projected/73864548-f1e6-4aab-9caa-3d463d39b738-kube-api-access-f2td5\") pod \"auto-csr-approver-29566944-qjd9k\" (UID: \"73864548-f1e6-4aab-9caa-3d463d39b738\") " pod="openshift-infra/auto-csr-approver-29566944-qjd9k" Mar 20 14:24:00 crc kubenswrapper[4973]: I0320 14:24:00.441493 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2td5\" (UniqueName: \"kubernetes.io/projected/73864548-f1e6-4aab-9caa-3d463d39b738-kube-api-access-f2td5\") pod \"auto-csr-approver-29566944-qjd9k\" (UID: \"73864548-f1e6-4aab-9caa-3d463d39b738\") " 
pod="openshift-infra/auto-csr-approver-29566944-qjd9k" Mar 20 14:24:00 crc kubenswrapper[4973]: I0320 14:24:00.480941 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566944-qjd9k" Mar 20 14:24:00 crc kubenswrapper[4973]: I0320 14:24:00.976599 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566944-qjd9k"] Mar 20 14:24:00 crc kubenswrapper[4973]: I0320 14:24:00.981960 4973 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:24:01 crc kubenswrapper[4973]: I0320 14:24:01.600403 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566944-qjd9k" event={"ID":"73864548-f1e6-4aab-9caa-3d463d39b738","Type":"ContainerStarted","Data":"b0ac7672124849163f0f9f0c3745167e202118cd4369ad807396a14f98fa13b6"} Mar 20 14:24:02 crc kubenswrapper[4973]: I0320 14:24:02.614155 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566944-qjd9k" event={"ID":"73864548-f1e6-4aab-9caa-3d463d39b738","Type":"ContainerStarted","Data":"78765263a5bf6fb473460c8c3a80c2921bc4fbf67285e3fb3a6a80d8e9a64ed9"} Mar 20 14:24:02 crc kubenswrapper[4973]: I0320 14:24:02.634172 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566944-qjd9k" podStartSLOduration=1.346584764 podStartE2EDuration="2.634154482s" podCreationTimestamp="2026-03-20 14:24:00 +0000 UTC" firstStartedPulling="2026-03-20 14:24:00.981765148 +0000 UTC m=+3761.725434892" lastFinishedPulling="2026-03-20 14:24:02.269334866 +0000 UTC m=+3763.013004610" observedRunningTime="2026-03-20 14:24:02.626033164 +0000 UTC m=+3763.369702908" watchObservedRunningTime="2026-03-20 14:24:02.634154482 +0000 UTC m=+3763.377824226" Mar 20 14:24:03 crc kubenswrapper[4973]: I0320 14:24:03.632856 4973 generic.go:334] "Generic (PLEG): container finished" 
podID="73864548-f1e6-4aab-9caa-3d463d39b738" containerID="78765263a5bf6fb473460c8c3a80c2921bc4fbf67285e3fb3a6a80d8e9a64ed9" exitCode=0 Mar 20 14:24:03 crc kubenswrapper[4973]: I0320 14:24:03.634061 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566944-qjd9k" event={"ID":"73864548-f1e6-4aab-9caa-3d463d39b738","Type":"ContainerDied","Data":"78765263a5bf6fb473460c8c3a80c2921bc4fbf67285e3fb3a6a80d8e9a64ed9"} Mar 20 14:24:05 crc kubenswrapper[4973]: I0320 14:24:05.109652 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566944-qjd9k" Mar 20 14:24:05 crc kubenswrapper[4973]: I0320 14:24:05.242585 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f2td5\" (UniqueName: \"kubernetes.io/projected/73864548-f1e6-4aab-9caa-3d463d39b738-kube-api-access-f2td5\") pod \"73864548-f1e6-4aab-9caa-3d463d39b738\" (UID: \"73864548-f1e6-4aab-9caa-3d463d39b738\") " Mar 20 14:24:05 crc kubenswrapper[4973]: I0320 14:24:05.248151 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73864548-f1e6-4aab-9caa-3d463d39b738-kube-api-access-f2td5" (OuterVolumeSpecName: "kube-api-access-f2td5") pod "73864548-f1e6-4aab-9caa-3d463d39b738" (UID: "73864548-f1e6-4aab-9caa-3d463d39b738"). InnerVolumeSpecName "kube-api-access-f2td5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:24:05 crc kubenswrapper[4973]: I0320 14:24:05.345971 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f2td5\" (UniqueName: \"kubernetes.io/projected/73864548-f1e6-4aab-9caa-3d463d39b738-kube-api-access-f2td5\") on node \"crc\" DevicePath \"\"" Mar 20 14:24:05 crc kubenswrapper[4973]: I0320 14:24:05.657991 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566944-qjd9k" event={"ID":"73864548-f1e6-4aab-9caa-3d463d39b738","Type":"ContainerDied","Data":"b0ac7672124849163f0f9f0c3745167e202118cd4369ad807396a14f98fa13b6"} Mar 20 14:24:05 crc kubenswrapper[4973]: I0320 14:24:05.658037 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0ac7672124849163f0f9f0c3745167e202118cd4369ad807396a14f98fa13b6" Mar 20 14:24:05 crc kubenswrapper[4973]: I0320 14:24:05.658072 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566944-qjd9k" Mar 20 14:24:05 crc kubenswrapper[4973]: I0320 14:24:05.705991 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566938-jnbkl"] Mar 20 14:24:05 crc kubenswrapper[4973]: I0320 14:24:05.720156 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566938-jnbkl"] Mar 20 14:24:05 crc kubenswrapper[4973]: I0320 14:24:05.971535 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57d73b48-f1c3-4302-8812-6f38589977f0" path="/var/lib/kubelet/pods/57d73b48-f1c3-4302-8812-6f38589977f0/volumes" Mar 20 14:24:07 crc kubenswrapper[4973]: I0320 14:24:07.950302 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309" Mar 20 14:24:07 crc kubenswrapper[4973]: E0320 14:24:07.951999 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:24:21 crc kubenswrapper[4973]: I0320 14:24:21.951530 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309" Mar 20 14:24:21 crc kubenswrapper[4973]: E0320 14:24:21.952423 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:24:34 crc kubenswrapper[4973]: I0320 14:24:34.951162 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309" Mar 20 14:24:34 crc kubenswrapper[4973]: E0320 14:24:34.952183 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:24:46 crc kubenswrapper[4973]: I0320 14:24:46.952101 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309" Mar 20 14:24:46 crc kubenswrapper[4973]: E0320 14:24:46.953051 4973 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:24:58 crc kubenswrapper[4973]: I0320 14:24:58.950833 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309" Mar 20 14:24:58 crc kubenswrapper[4973]: E0320 14:24:58.951669 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:25:02 crc kubenswrapper[4973]: I0320 14:25:02.888394 4973 scope.go:117] "RemoveContainer" containerID="96e2c7fef3f3897b966e96ad97b348ac24801530ea767fcc591511c269420ad1" Mar 20 14:25:10 crc kubenswrapper[4973]: I0320 14:25:10.950510 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309" Mar 20 14:25:10 crc kubenswrapper[4973]: E0320 14:25:10.951250 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:25:24 crc kubenswrapper[4973]: I0320 14:25:24.950621 4973 scope.go:117] 
"RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309" Mar 20 14:25:24 crc kubenswrapper[4973]: E0320 14:25:24.951523 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:25:38 crc kubenswrapper[4973]: I0320 14:25:38.950448 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309" Mar 20 14:25:38 crc kubenswrapper[4973]: E0320 14:25:38.951270 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:25:52 crc kubenswrapper[4973]: I0320 14:25:52.950512 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309" Mar 20 14:25:52 crc kubenswrapper[4973]: E0320 14:25:52.952141 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:26:00 crc kubenswrapper[4973]: I0320 14:26:00.158500 
4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566946-smptb"] Mar 20 14:26:00 crc kubenswrapper[4973]: E0320 14:26:00.159630 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73864548-f1e6-4aab-9caa-3d463d39b738" containerName="oc" Mar 20 14:26:00 crc kubenswrapper[4973]: I0320 14:26:00.159645 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="73864548-f1e6-4aab-9caa-3d463d39b738" containerName="oc" Mar 20 14:26:00 crc kubenswrapper[4973]: I0320 14:26:00.159865 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="73864548-f1e6-4aab-9caa-3d463d39b738" containerName="oc" Mar 20 14:26:00 crc kubenswrapper[4973]: I0320 14:26:00.160700 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566946-smptb" Mar 20 14:26:00 crc kubenswrapper[4973]: I0320 14:26:00.164089 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:26:00 crc kubenswrapper[4973]: I0320 14:26:00.164106 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:26:00 crc kubenswrapper[4973]: I0320 14:26:00.164593 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:26:00 crc kubenswrapper[4973]: I0320 14:26:00.170413 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566946-smptb"] Mar 20 14:26:00 crc kubenswrapper[4973]: I0320 14:26:00.230574 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfqm2\" (UniqueName: \"kubernetes.io/projected/2702fc74-45a5-4404-8067-9de35aafcac8-kube-api-access-gfqm2\") pod \"auto-csr-approver-29566946-smptb\" (UID: \"2702fc74-45a5-4404-8067-9de35aafcac8\") " 
pod="openshift-infra/auto-csr-approver-29566946-smptb"
Mar 20 14:26:00 crc kubenswrapper[4973]: I0320 14:26:00.332483 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfqm2\" (UniqueName: \"kubernetes.io/projected/2702fc74-45a5-4404-8067-9de35aafcac8-kube-api-access-gfqm2\") pod \"auto-csr-approver-29566946-smptb\" (UID: \"2702fc74-45a5-4404-8067-9de35aafcac8\") " pod="openshift-infra/auto-csr-approver-29566946-smptb"
Mar 20 14:26:00 crc kubenswrapper[4973]: I0320 14:26:00.350815 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfqm2\" (UniqueName: \"kubernetes.io/projected/2702fc74-45a5-4404-8067-9de35aafcac8-kube-api-access-gfqm2\") pod \"auto-csr-approver-29566946-smptb\" (UID: \"2702fc74-45a5-4404-8067-9de35aafcac8\") " pod="openshift-infra/auto-csr-approver-29566946-smptb"
Mar 20 14:26:00 crc kubenswrapper[4973]: I0320 14:26:00.483531 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566946-smptb"
Mar 20 14:26:00 crc kubenswrapper[4973]: I0320 14:26:00.980977 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566946-smptb"]
Mar 20 14:26:01 crc kubenswrapper[4973]: I0320 14:26:01.878184 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566946-smptb" event={"ID":"2702fc74-45a5-4404-8067-9de35aafcac8","Type":"ContainerStarted","Data":"df8c47e785e87d2f6d52ae0d94a9da9b23379d20ec4603188871b19b1d7e71ed"}
Mar 20 14:26:02 crc kubenswrapper[4973]: I0320 14:26:02.908640 4973 generic.go:334] "Generic (PLEG): container finished" podID="2702fc74-45a5-4404-8067-9de35aafcac8" containerID="4fd369689be79f50de134d5b6613cbb69276843f9c0ecd37c6d3c88f7bc8d22a" exitCode=0
Mar 20 14:26:02 crc kubenswrapper[4973]: I0320 14:26:02.908753 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566946-smptb" event={"ID":"2702fc74-45a5-4404-8067-9de35aafcac8","Type":"ContainerDied","Data":"4fd369689be79f50de134d5b6613cbb69276843f9c0ecd37c6d3c88f7bc8d22a"}
Mar 20 14:26:03 crc kubenswrapper[4973]: I0320 14:26:03.951333 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309"
Mar 20 14:26:03 crc kubenswrapper[4973]: E0320 14:26:03.952161 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:26:04 crc kubenswrapper[4973]: I0320 14:26:04.405940 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566946-smptb"
Mar 20 14:26:04 crc kubenswrapper[4973]: I0320 14:26:04.552808 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfqm2\" (UniqueName: \"kubernetes.io/projected/2702fc74-45a5-4404-8067-9de35aafcac8-kube-api-access-gfqm2\") pod \"2702fc74-45a5-4404-8067-9de35aafcac8\" (UID: \"2702fc74-45a5-4404-8067-9de35aafcac8\") "
Mar 20 14:26:04 crc kubenswrapper[4973]: I0320 14:26:04.559395 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2702fc74-45a5-4404-8067-9de35aafcac8-kube-api-access-gfqm2" (OuterVolumeSpecName: "kube-api-access-gfqm2") pod "2702fc74-45a5-4404-8067-9de35aafcac8" (UID: "2702fc74-45a5-4404-8067-9de35aafcac8"). InnerVolumeSpecName "kube-api-access-gfqm2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:26:04 crc kubenswrapper[4973]: I0320 14:26:04.655507 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfqm2\" (UniqueName: \"kubernetes.io/projected/2702fc74-45a5-4404-8067-9de35aafcac8-kube-api-access-gfqm2\") on node \"crc\" DevicePath \"\""
Mar 20 14:26:04 crc kubenswrapper[4973]: I0320 14:26:04.931262 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566946-smptb" event={"ID":"2702fc74-45a5-4404-8067-9de35aafcac8","Type":"ContainerDied","Data":"df8c47e785e87d2f6d52ae0d94a9da9b23379d20ec4603188871b19b1d7e71ed"}
Mar 20 14:26:04 crc kubenswrapper[4973]: I0320 14:26:04.931301 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df8c47e785e87d2f6d52ae0d94a9da9b23379d20ec4603188871b19b1d7e71ed"
Mar 20 14:26:04 crc kubenswrapper[4973]: I0320 14:26:04.931305 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566946-smptb"
Mar 20 14:26:05 crc kubenswrapper[4973]: I0320 14:26:05.476829 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566940-msgdk"]
Mar 20 14:26:05 crc kubenswrapper[4973]: I0320 14:26:05.487854 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566940-msgdk"]
Mar 20 14:26:05 crc kubenswrapper[4973]: I0320 14:26:05.969169 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b" path="/var/lib/kubelet/pods/d5abf2ce-dc1d-4f0e-8785-2dba2fd88d4b/volumes"
Mar 20 14:26:18 crc kubenswrapper[4973]: I0320 14:26:18.951059 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309"
Mar 20 14:26:18 crc kubenswrapper[4973]: E0320 14:26:18.951893 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:26:30 crc kubenswrapper[4973]: I0320 14:26:30.951397 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309"
Mar 20 14:26:30 crc kubenswrapper[4973]: E0320 14:26:30.952420 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:26:45 crc kubenswrapper[4973]: I0320 14:26:45.951172 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309"
Mar 20 14:26:45 crc kubenswrapper[4973]: E0320 14:26:45.952178 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:26:58 crc kubenswrapper[4973]: I0320 14:26:58.951144 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309"
Mar 20 14:26:58 crc kubenswrapper[4973]: E0320 14:26:58.952197 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:27:02 crc kubenswrapper[4973]: I0320 14:27:02.998933 4973 scope.go:117] "RemoveContainer" containerID="c9473ebc80f57f6527750e31dbf763523fbcd347933b2b97c38a97d1eecf6b21"
Mar 20 14:27:10 crc kubenswrapper[4973]: I0320 14:27:10.950549 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309"
Mar 20 14:27:10 crc kubenswrapper[4973]: E0320 14:27:10.951658 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:27:21 crc kubenswrapper[4973]: I0320 14:27:21.951219 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309"
Mar 20 14:27:21 crc kubenswrapper[4973]: E0320 14:27:21.952039 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:27:33 crc kubenswrapper[4973]: I0320 14:27:33.953399 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309"
Mar 20 14:27:33 crc kubenswrapper[4973]: E0320 14:27:33.954319 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:27:48 crc kubenswrapper[4973]: I0320 14:27:48.951074 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309"
Mar 20 14:27:48 crc kubenswrapper[4973]: E0320 14:27:48.951864 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:28:00 crc kubenswrapper[4973]: I0320 14:28:00.152469 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566948-8czgl"]
Mar 20 14:28:00 crc kubenswrapper[4973]: E0320 14:28:00.153655 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2702fc74-45a5-4404-8067-9de35aafcac8" containerName="oc"
Mar 20 14:28:00 crc kubenswrapper[4973]: I0320 14:28:00.153669 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="2702fc74-45a5-4404-8067-9de35aafcac8" containerName="oc"
Mar 20 14:28:00 crc kubenswrapper[4973]: I0320 14:28:00.153912 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="2702fc74-45a5-4404-8067-9de35aafcac8" containerName="oc"
Mar 20 14:28:00 crc kubenswrapper[4973]: I0320 14:28:00.154783 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566948-8czgl"
Mar 20 14:28:00 crc kubenswrapper[4973]: I0320 14:28:00.157468 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:28:00 crc kubenswrapper[4973]: I0320 14:28:00.157806 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc"
Mar 20 14:28:00 crc kubenswrapper[4973]: I0320 14:28:00.161613 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:28:00 crc kubenswrapper[4973]: I0320 14:28:00.164085 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566948-8czgl"]
Mar 20 14:28:00 crc kubenswrapper[4973]: I0320 14:28:00.216322 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk7h9\" (UniqueName: \"kubernetes.io/projected/41055ada-1fe5-480c-8c07-fc6cad616c30-kube-api-access-pk7h9\") pod \"auto-csr-approver-29566948-8czgl\" (UID: \"41055ada-1fe5-480c-8c07-fc6cad616c30\") " pod="openshift-infra/auto-csr-approver-29566948-8czgl"
Mar 20 14:28:00 crc kubenswrapper[4973]: I0320 14:28:00.318310 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk7h9\" (UniqueName: \"kubernetes.io/projected/41055ada-1fe5-480c-8c07-fc6cad616c30-kube-api-access-pk7h9\") pod \"auto-csr-approver-29566948-8czgl\" (UID: \"41055ada-1fe5-480c-8c07-fc6cad616c30\") " pod="openshift-infra/auto-csr-approver-29566948-8czgl"
Mar 20 14:28:00 crc kubenswrapper[4973]: I0320 14:28:00.341695 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk7h9\" (UniqueName: \"kubernetes.io/projected/41055ada-1fe5-480c-8c07-fc6cad616c30-kube-api-access-pk7h9\") pod \"auto-csr-approver-29566948-8czgl\" (UID: \"41055ada-1fe5-480c-8c07-fc6cad616c30\") " pod="openshift-infra/auto-csr-approver-29566948-8czgl"
Mar 20 14:28:00 crc kubenswrapper[4973]: I0320 14:28:00.486835 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566948-8czgl"
Mar 20 14:28:01 crc kubenswrapper[4973]: I0320 14:28:01.025190 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566948-8czgl"]
Mar 20 14:28:01 crc kubenswrapper[4973]: I0320 14:28:01.224514 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566948-8czgl" event={"ID":"41055ada-1fe5-480c-8c07-fc6cad616c30","Type":"ContainerStarted","Data":"91ea6bc02f5bb09dc26e873b1dd78d48ee1ac494b1e270480464d92f8a7b7aff"}
Mar 20 14:28:02 crc kubenswrapper[4973]: I0320 14:28:02.951747 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309"
Mar 20 14:28:02 crc kubenswrapper[4973]: E0320 14:28:02.952770 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 14:28:03 crc kubenswrapper[4973]: I0320 14:28:03.284708 4973 generic.go:334] "Generic (PLEG): container finished" podID="41055ada-1fe5-480c-8c07-fc6cad616c30" containerID="64a4ea6b8f1298026a1cfe7081303e78460c774d104ad21e6f026bb19e710c85" exitCode=0
Mar 20 14:28:03 crc kubenswrapper[4973]: I0320 14:28:03.284819 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566948-8czgl" event={"ID":"41055ada-1fe5-480c-8c07-fc6cad616c30","Type":"ContainerDied","Data":"64a4ea6b8f1298026a1cfe7081303e78460c774d104ad21e6f026bb19e710c85"}
Mar 20 14:28:05 crc kubenswrapper[4973]: I0320 14:28:05.309022 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566948-8czgl" event={"ID":"41055ada-1fe5-480c-8c07-fc6cad616c30","Type":"ContainerDied","Data":"91ea6bc02f5bb09dc26e873b1dd78d48ee1ac494b1e270480464d92f8a7b7aff"}
Mar 20 14:28:05 crc kubenswrapper[4973]: I0320 14:28:05.309315 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="91ea6bc02f5bb09dc26e873b1dd78d48ee1ac494b1e270480464d92f8a7b7aff"
Mar 20 14:28:05 crc kubenswrapper[4973]: I0320 14:28:05.324068 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566948-8czgl"
Mar 20 14:28:05 crc kubenswrapper[4973]: I0320 14:28:05.465071 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk7h9\" (UniqueName: \"kubernetes.io/projected/41055ada-1fe5-480c-8c07-fc6cad616c30-kube-api-access-pk7h9\") pod \"41055ada-1fe5-480c-8c07-fc6cad616c30\" (UID: \"41055ada-1fe5-480c-8c07-fc6cad616c30\") "
Mar 20 14:28:05 crc kubenswrapper[4973]: I0320 14:28:05.472097 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41055ada-1fe5-480c-8c07-fc6cad616c30-kube-api-access-pk7h9" (OuterVolumeSpecName: "kube-api-access-pk7h9") pod "41055ada-1fe5-480c-8c07-fc6cad616c30" (UID: "41055ada-1fe5-480c-8c07-fc6cad616c30"). InnerVolumeSpecName "kube-api-access-pk7h9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:28:05 crc kubenswrapper[4973]: I0320 14:28:05.568382 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk7h9\" (UniqueName: \"kubernetes.io/projected/41055ada-1fe5-480c-8c07-fc6cad616c30-kube-api-access-pk7h9\") on node \"crc\" DevicePath \"\""
Mar 20 14:28:06 crc kubenswrapper[4973]: I0320 14:28:06.319125 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566948-8czgl"
Mar 20 14:28:06 crc kubenswrapper[4973]: I0320 14:28:06.391476 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566942-pbsc9"]
Mar 20 14:28:06 crc kubenswrapper[4973]: I0320 14:28:06.403366 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566942-pbsc9"]
Mar 20 14:28:07 crc kubenswrapper[4973]: I0320 14:28:07.966284 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0" path="/var/lib/kubelet/pods/3ec9bf52-c8ae-45b5-a7cc-7d06ecbec1f0/volumes"
Mar 20 14:28:16 crc kubenswrapper[4973]: I0320 14:28:16.951971 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309"
Mar 20 14:28:18 crc kubenswrapper[4973]: I0320 14:28:18.071642 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"bfc4b31266e980584c244bdcd5c59ee40888c1197c32fc1c6fe5dc06b7ab0740"}
Mar 20 14:29:03 crc kubenswrapper[4973]: I0320 14:29:03.136034 4973 scope.go:117] "RemoveContainer" containerID="7ed7dfaddaa082b39e3f444001674cf5b25bd929bdfd7ca29f46d28f16eb9c36"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.157909 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566950-j7xhj"]
Mar 20 14:30:00 crc kubenswrapper[4973]: E0320 14:30:00.159170 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41055ada-1fe5-480c-8c07-fc6cad616c30" containerName="oc"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.159188 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="41055ada-1fe5-480c-8c07-fc6cad616c30" containerName="oc"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.159517 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="41055ada-1fe5-480c-8c07-fc6cad616c30" containerName="oc"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.160708 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566950-j7xhj"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.162781 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.163561 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.166186 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.178419 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566950-xs97m"]
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.180649 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-xs97m"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.183120 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.183120 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.196941 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566950-j7xhj"]
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.210273 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566950-xs97m"]
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.269484 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hql2\" (UniqueName: \"kubernetes.io/projected/d513dbd9-be74-4c62-a87a-45071ef62cef-kube-api-access-9hql2\") pod \"auto-csr-approver-29566950-j7xhj\" (UID: \"d513dbd9-be74-4c62-a87a-45071ef62cef\") " pod="openshift-infra/auto-csr-approver-29566950-j7xhj"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.269560 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxhb6\" (UniqueName: \"kubernetes.io/projected/210e7550-8743-43ef-99c8-ea8ca59dc71d-kube-api-access-nxhb6\") pod \"collect-profiles-29566950-xs97m\" (UID: \"210e7550-8743-43ef-99c8-ea8ca59dc71d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-xs97m"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.269582 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/210e7550-8743-43ef-99c8-ea8ca59dc71d-secret-volume\") pod \"collect-profiles-29566950-xs97m\" (UID: \"210e7550-8743-43ef-99c8-ea8ca59dc71d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-xs97m"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.269893 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/210e7550-8743-43ef-99c8-ea8ca59dc71d-config-volume\") pod \"collect-profiles-29566950-xs97m\" (UID: \"210e7550-8743-43ef-99c8-ea8ca59dc71d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-xs97m"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.372994 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/210e7550-8743-43ef-99c8-ea8ca59dc71d-config-volume\") pod \"collect-profiles-29566950-xs97m\" (UID: \"210e7550-8743-43ef-99c8-ea8ca59dc71d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-xs97m"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.373584 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hql2\" (UniqueName: \"kubernetes.io/projected/d513dbd9-be74-4c62-a87a-45071ef62cef-kube-api-access-9hql2\") pod \"auto-csr-approver-29566950-j7xhj\" (UID: \"d513dbd9-be74-4c62-a87a-45071ef62cef\") " pod="openshift-infra/auto-csr-approver-29566950-j7xhj"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.373802 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxhb6\" (UniqueName: \"kubernetes.io/projected/210e7550-8743-43ef-99c8-ea8ca59dc71d-kube-api-access-nxhb6\") pod \"collect-profiles-29566950-xs97m\" (UID: \"210e7550-8743-43ef-99c8-ea8ca59dc71d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-xs97m"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.373915 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/210e7550-8743-43ef-99c8-ea8ca59dc71d-secret-volume\") pod \"collect-profiles-29566950-xs97m\" (UID: \"210e7550-8743-43ef-99c8-ea8ca59dc71d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-xs97m"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.374073 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/210e7550-8743-43ef-99c8-ea8ca59dc71d-config-volume\") pod \"collect-profiles-29566950-xs97m\" (UID: \"210e7550-8743-43ef-99c8-ea8ca59dc71d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-xs97m"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.485988 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/210e7550-8743-43ef-99c8-ea8ca59dc71d-secret-volume\") pod \"collect-profiles-29566950-xs97m\" (UID: \"210e7550-8743-43ef-99c8-ea8ca59dc71d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-xs97m"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.486530 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hql2\" (UniqueName: \"kubernetes.io/projected/d513dbd9-be74-4c62-a87a-45071ef62cef-kube-api-access-9hql2\") pod \"auto-csr-approver-29566950-j7xhj\" (UID: \"d513dbd9-be74-4c62-a87a-45071ef62cef\") " pod="openshift-infra/auto-csr-approver-29566950-j7xhj"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.487416 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxhb6\" (UniqueName: \"kubernetes.io/projected/210e7550-8743-43ef-99c8-ea8ca59dc71d-kube-api-access-nxhb6\") pod \"collect-profiles-29566950-xs97m\" (UID: \"210e7550-8743-43ef-99c8-ea8ca59dc71d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-xs97m"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.490018 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566950-j7xhj"
Mar 20 14:30:00 crc kubenswrapper[4973]: I0320 14:30:00.517785 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-xs97m"
Mar 20 14:30:01 crc kubenswrapper[4973]: I0320 14:30:01.090886 4973 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 14:30:01 crc kubenswrapper[4973]: I0320 14:30:01.095467 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566950-j7xhj"]
Mar 20 14:30:01 crc kubenswrapper[4973]: I0320 14:30:01.241614 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566950-j7xhj" event={"ID":"d513dbd9-be74-4c62-a87a-45071ef62cef","Type":"ContainerStarted","Data":"7d6b46a178ca3016a3335f9c2f3b188b5a4c80785dc1e2b47d5238cdd6649dc2"}
Mar 20 14:30:01 crc kubenswrapper[4973]: I0320 14:30:01.242686 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-xs97m" event={"ID":"210e7550-8743-43ef-99c8-ea8ca59dc71d","Type":"ContainerStarted","Data":"d8b0fa800477a9840840ec5e61f63639181f4e77ffce25ff9a5e8ba5baeb0949"}
Mar 20 14:30:01 crc kubenswrapper[4973]: I0320 14:30:01.244146 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566950-xs97m"]
Mar 20 14:30:01 crc kubenswrapper[4973]: I0320 14:30:01.293479 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-x5vhs"]
Mar 20 14:30:01 crc kubenswrapper[4973]: I0320 14:30:01.300142 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x5vhs"
Mar 20 14:30:01 crc kubenswrapper[4973]: I0320 14:30:01.320058 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x5vhs"]
Mar 20 14:30:01 crc kubenswrapper[4973]: I0320 14:30:01.405702 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af42117e-d394-4dce-9545-798e55957289-catalog-content\") pod \"redhat-operators-x5vhs\" (UID: \"af42117e-d394-4dce-9545-798e55957289\") " pod="openshift-marketplace/redhat-operators-x5vhs"
Mar 20 14:30:01 crc kubenswrapper[4973]: I0320 14:30:01.405766 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlb5j\" (UniqueName: \"kubernetes.io/projected/af42117e-d394-4dce-9545-798e55957289-kube-api-access-mlb5j\") pod \"redhat-operators-x5vhs\" (UID: \"af42117e-d394-4dce-9545-798e55957289\") " pod="openshift-marketplace/redhat-operators-x5vhs"
Mar 20 14:30:01 crc kubenswrapper[4973]: I0320 14:30:01.405985 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af42117e-d394-4dce-9545-798e55957289-utilities\") pod \"redhat-operators-x5vhs\" (UID: \"af42117e-d394-4dce-9545-798e55957289\") " pod="openshift-marketplace/redhat-operators-x5vhs"
Mar 20 14:30:01 crc kubenswrapper[4973]: I0320 14:30:01.509038 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af42117e-d394-4dce-9545-798e55957289-catalog-content\") pod \"redhat-operators-x5vhs\" (UID: \"af42117e-d394-4dce-9545-798e55957289\") " pod="openshift-marketplace/redhat-operators-x5vhs"
Mar 20 14:30:01 crc kubenswrapper[4973]: I0320 14:30:01.509118 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlb5j\" (UniqueName: \"kubernetes.io/projected/af42117e-d394-4dce-9545-798e55957289-kube-api-access-mlb5j\") pod \"redhat-operators-x5vhs\" (UID: \"af42117e-d394-4dce-9545-798e55957289\") " pod="openshift-marketplace/redhat-operators-x5vhs"
Mar 20 14:30:01 crc kubenswrapper[4973]: I0320 14:30:01.509421 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af42117e-d394-4dce-9545-798e55957289-utilities\") pod \"redhat-operators-x5vhs\" (UID: \"af42117e-d394-4dce-9545-798e55957289\") " pod="openshift-marketplace/redhat-operators-x5vhs"
Mar 20 14:30:01 crc kubenswrapper[4973]: I0320 14:30:01.509860 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af42117e-d394-4dce-9545-798e55957289-catalog-content\") pod \"redhat-operators-x5vhs\" (UID: \"af42117e-d394-4dce-9545-798e55957289\") " pod="openshift-marketplace/redhat-operators-x5vhs"
Mar 20 14:30:01 crc kubenswrapper[4973]: I0320 14:30:01.509987 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af42117e-d394-4dce-9545-798e55957289-utilities\") pod \"redhat-operators-x5vhs\" (UID: \"af42117e-d394-4dce-9545-798e55957289\") " pod="openshift-marketplace/redhat-operators-x5vhs"
Mar 20 14:30:01 crc kubenswrapper[4973]: I0320 14:30:01.529431 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlb5j\" (UniqueName: \"kubernetes.io/projected/af42117e-d394-4dce-9545-798e55957289-kube-api-access-mlb5j\") pod \"redhat-operators-x5vhs\" (UID: \"af42117e-d394-4dce-9545-798e55957289\") " pod="openshift-marketplace/redhat-operators-x5vhs"
Mar 20 14:30:01 crc kubenswrapper[4973]: I0320 14:30:01.689082 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x5vhs"
Mar 20 14:30:02 crc kubenswrapper[4973]: I0320 14:30:02.257387 4973 generic.go:334] "Generic (PLEG): container finished" podID="210e7550-8743-43ef-99c8-ea8ca59dc71d" containerID="8fc175b56edc83a50cc9926acd710999a1020a056268ecfb0bb368ddd228affe" exitCode=0
Mar 20 14:30:02 crc kubenswrapper[4973]: I0320 14:30:02.257542 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-xs97m" event={"ID":"210e7550-8743-43ef-99c8-ea8ca59dc71d","Type":"ContainerDied","Data":"8fc175b56edc83a50cc9926acd710999a1020a056268ecfb0bb368ddd228affe"}
Mar 20 14:30:02 crc kubenswrapper[4973]: I0320 14:30:02.323583 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-x5vhs"]
Mar 20 14:30:03 crc kubenswrapper[4973]: I0320 14:30:03.106507 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-m558r"]
Mar 20 14:30:03 crc kubenswrapper[4973]: I0320 14:30:03.110004 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m558r"
Mar 20 14:30:03 crc kubenswrapper[4973]: I0320 14:30:03.121912 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m558r"]
Mar 20 14:30:03 crc kubenswrapper[4973]: I0320 14:30:03.253861 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qjkl\" (UniqueName: \"kubernetes.io/projected/f247ce3e-bf6b-4df6-ae71-938e7ca98f3b-kube-api-access-8qjkl\") pod \"redhat-marketplace-m558r\" (UID: \"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b\") " pod="openshift-marketplace/redhat-marketplace-m558r"
Mar 20 14:30:03 crc kubenswrapper[4973]: I0320 14:30:03.253979 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f247ce3e-bf6b-4df6-ae71-938e7ca98f3b-catalog-content\") pod \"redhat-marketplace-m558r\" (UID: \"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b\") " pod="openshift-marketplace/redhat-marketplace-m558r"
Mar 20 14:30:03 crc kubenswrapper[4973]: I0320 14:30:03.254006 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f247ce3e-bf6b-4df6-ae71-938e7ca98f3b-utilities\") pod \"redhat-marketplace-m558r\" (UID: \"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b\") " pod="openshift-marketplace/redhat-marketplace-m558r"
Mar 20 14:30:03 crc kubenswrapper[4973]: I0320 14:30:03.292614 4973 generic.go:334] "Generic (PLEG): container finished" podID="af42117e-d394-4dce-9545-798e55957289" containerID="5dcea486ed5ca325a90b553949cab1e71fcd38f24d2bdfc4132e4cd6a30ab755" exitCode=0
Mar 20 14:30:03 crc kubenswrapper[4973]: I0320 14:30:03.295118 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x5vhs" event={"ID":"af42117e-d394-4dce-9545-798e55957289","Type":"ContainerDied","Data":"5dcea486ed5ca325a90b553949cab1e71fcd38f24d2bdfc4132e4cd6a30ab755"}
Mar 20 14:30:03 crc kubenswrapper[4973]: I0320 14:30:03.295176 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x5vhs" event={"ID":"af42117e-d394-4dce-9545-798e55957289","Type":"ContainerStarted","Data":"f1ad504e49cc642497586909d5b376283c9e8d85e4b6e0627134eb15294669ac"}
Mar 20 14:30:03 crc kubenswrapper[4973]: I0320 14:30:03.360979 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qjkl\" (UniqueName: \"kubernetes.io/projected/f247ce3e-bf6b-4df6-ae71-938e7ca98f3b-kube-api-access-8qjkl\") pod \"redhat-marketplace-m558r\" (UID: \"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b\") " pod="openshift-marketplace/redhat-marketplace-m558r"
Mar 20 14:30:03 crc kubenswrapper[4973]: I0320 14:30:03.361170 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f247ce3e-bf6b-4df6-ae71-938e7ca98f3b-catalog-content\") pod \"redhat-marketplace-m558r\" (UID: \"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b\") " pod="openshift-marketplace/redhat-marketplace-m558r"
Mar 20 14:30:03 crc kubenswrapper[4973]: I0320 14:30:03.361211 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f247ce3e-bf6b-4df6-ae71-938e7ca98f3b-utilities\") pod \"redhat-marketplace-m558r\" (UID: \"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b\") " pod="openshift-marketplace/redhat-marketplace-m558r"
Mar 20 14:30:03 crc kubenswrapper[4973]: I0320 14:30:03.361902 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f247ce3e-bf6b-4df6-ae71-938e7ca98f3b-utilities\") pod \"redhat-marketplace-m558r\" (UID: \"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b\") " pod="openshift-marketplace/redhat-marketplace-m558r"
Mar 20 14:30:03 crc kubenswrapper[4973]: I0320 14:30:03.362198 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f247ce3e-bf6b-4df6-ae71-938e7ca98f3b-catalog-content\") pod \"redhat-marketplace-m558r\" (UID: \"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b\") " pod="openshift-marketplace/redhat-marketplace-m558r"
Mar 20 14:30:03 crc kubenswrapper[4973]: I0320 14:30:03.423093 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qjkl\" (UniqueName: \"kubernetes.io/projected/f247ce3e-bf6b-4df6-ae71-938e7ca98f3b-kube-api-access-8qjkl\") pod \"redhat-marketplace-m558r\" (UID: \"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b\") " pod="openshift-marketplace/redhat-marketplace-m558r"
Mar 20 14:30:03 crc kubenswrapper[4973]: I0320 14:30:03.450130 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m558r"
Mar 20 14:30:04 crc kubenswrapper[4973]: I0320 14:30:04.025426 4973 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-xs97m" Mar 20 14:30:04 crc kubenswrapper[4973]: I0320 14:30:04.111280 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/210e7550-8743-43ef-99c8-ea8ca59dc71d-secret-volume\") pod \"210e7550-8743-43ef-99c8-ea8ca59dc71d\" (UID: \"210e7550-8743-43ef-99c8-ea8ca59dc71d\") " Mar 20 14:30:04 crc kubenswrapper[4973]: I0320 14:30:04.112157 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/210e7550-8743-43ef-99c8-ea8ca59dc71d-config-volume\") pod \"210e7550-8743-43ef-99c8-ea8ca59dc71d\" (UID: \"210e7550-8743-43ef-99c8-ea8ca59dc71d\") " Mar 20 14:30:04 crc kubenswrapper[4973]: I0320 14:30:04.112295 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxhb6\" (UniqueName: \"kubernetes.io/projected/210e7550-8743-43ef-99c8-ea8ca59dc71d-kube-api-access-nxhb6\") pod \"210e7550-8743-43ef-99c8-ea8ca59dc71d\" (UID: \"210e7550-8743-43ef-99c8-ea8ca59dc71d\") " Mar 20 14:30:04 crc kubenswrapper[4973]: I0320 14:30:04.113986 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210e7550-8743-43ef-99c8-ea8ca59dc71d-config-volume" (OuterVolumeSpecName: "config-volume") pod "210e7550-8743-43ef-99c8-ea8ca59dc71d" (UID: "210e7550-8743-43ef-99c8-ea8ca59dc71d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 14:30:04 crc kubenswrapper[4973]: I0320 14:30:04.118204 4973 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/210e7550-8743-43ef-99c8-ea8ca59dc71d-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:30:04 crc kubenswrapper[4973]: I0320 14:30:04.121578 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210e7550-8743-43ef-99c8-ea8ca59dc71d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "210e7550-8743-43ef-99c8-ea8ca59dc71d" (UID: "210e7550-8743-43ef-99c8-ea8ca59dc71d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 14:30:04 crc kubenswrapper[4973]: I0320 14:30:04.156948 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210e7550-8743-43ef-99c8-ea8ca59dc71d-kube-api-access-nxhb6" (OuterVolumeSpecName: "kube-api-access-nxhb6") pod "210e7550-8743-43ef-99c8-ea8ca59dc71d" (UID: "210e7550-8743-43ef-99c8-ea8ca59dc71d"). InnerVolumeSpecName "kube-api-access-nxhb6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:30:04 crc kubenswrapper[4973]: I0320 14:30:04.220658 4973 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/210e7550-8743-43ef-99c8-ea8ca59dc71d-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 14:30:04 crc kubenswrapper[4973]: I0320 14:30:04.221932 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxhb6\" (UniqueName: \"kubernetes.io/projected/210e7550-8743-43ef-99c8-ea8ca59dc71d-kube-api-access-nxhb6\") on node \"crc\" DevicePath \"\"" Mar 20 14:30:04 crc kubenswrapper[4973]: I0320 14:30:04.252543 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-m558r"] Mar 20 14:30:04 crc kubenswrapper[4973]: W0320 14:30:04.256626 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf247ce3e_bf6b_4df6_ae71_938e7ca98f3b.slice/crio-27393c595a7911f678cbfde26016bed8c3b6506b33614b51d8cc727008aa22cc WatchSource:0}: Error finding container 27393c595a7911f678cbfde26016bed8c3b6506b33614b51d8cc727008aa22cc: Status 404 returned error can't find the container with id 27393c595a7911f678cbfde26016bed8c3b6506b33614b51d8cc727008aa22cc Mar 20 14:30:04 crc kubenswrapper[4973]: I0320 14:30:04.317452 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m558r" event={"ID":"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b","Type":"ContainerStarted","Data":"27393c595a7911f678cbfde26016bed8c3b6506b33614b51d8cc727008aa22cc"} Mar 20 14:30:04 crc kubenswrapper[4973]: I0320 14:30:04.321734 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-xs97m" event={"ID":"210e7550-8743-43ef-99c8-ea8ca59dc71d","Type":"ContainerDied","Data":"d8b0fa800477a9840840ec5e61f63639181f4e77ffce25ff9a5e8ba5baeb0949"} Mar 20 14:30:04 crc 
kubenswrapper[4973]: I0320 14:30:04.321777 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8b0fa800477a9840840ec5e61f63639181f4e77ffce25ff9a5e8ba5baeb0949" Mar 20 14:30:04 crc kubenswrapper[4973]: I0320 14:30:04.321785 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566950-xs97m" Mar 20 14:30:05 crc kubenswrapper[4973]: I0320 14:30:05.136725 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z"] Mar 20 14:30:05 crc kubenswrapper[4973]: I0320 14:30:05.147559 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566905-blc9z"] Mar 20 14:30:05 crc kubenswrapper[4973]: I0320 14:30:05.335721 4973 generic.go:334] "Generic (PLEG): container finished" podID="d513dbd9-be74-4c62-a87a-45071ef62cef" containerID="5a1e95cc1cd2438b61622880ea16619085a8941f44d2860a8dcdd42e2632764a" exitCode=0 Mar 20 14:30:05 crc kubenswrapper[4973]: I0320 14:30:05.335793 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566950-j7xhj" event={"ID":"d513dbd9-be74-4c62-a87a-45071ef62cef","Type":"ContainerDied","Data":"5a1e95cc1cd2438b61622880ea16619085a8941f44d2860a8dcdd42e2632764a"} Mar 20 14:30:05 crc kubenswrapper[4973]: I0320 14:30:05.341491 4973 generic.go:334] "Generic (PLEG): container finished" podID="f247ce3e-bf6b-4df6-ae71-938e7ca98f3b" containerID="5a4286889e59386ede25c73a00cac4eb425cac0f7bc7059232ad51a93fbe4090" exitCode=0 Mar 20 14:30:05 crc kubenswrapper[4973]: I0320 14:30:05.341584 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m558r" event={"ID":"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b","Type":"ContainerDied","Data":"5a4286889e59386ede25c73a00cac4eb425cac0f7bc7059232ad51a93fbe4090"} Mar 20 14:30:05 crc 
kubenswrapper[4973]: I0320 14:30:05.968306 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c64034f-6ded-417d-8b24-6e7f779cabca" path="/var/lib/kubelet/pods/9c64034f-6ded-417d-8b24-6e7f779cabca/volumes" Mar 20 14:30:06 crc kubenswrapper[4973]: I0320 14:30:06.354356 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x5vhs" event={"ID":"af42117e-d394-4dce-9545-798e55957289","Type":"ContainerStarted","Data":"da60518b2436b11c3b81ee81cfefd27ffdc2c7c3b4ade1e9b5d6fa4531a9191f"} Mar 20 14:30:06 crc kubenswrapper[4973]: I0320 14:30:06.992498 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566950-j7xhj" Mar 20 14:30:07 crc kubenswrapper[4973]: I0320 14:30:07.123608 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hql2\" (UniqueName: \"kubernetes.io/projected/d513dbd9-be74-4c62-a87a-45071ef62cef-kube-api-access-9hql2\") pod \"d513dbd9-be74-4c62-a87a-45071ef62cef\" (UID: \"d513dbd9-be74-4c62-a87a-45071ef62cef\") " Mar 20 14:30:07 crc kubenswrapper[4973]: I0320 14:30:07.129481 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d513dbd9-be74-4c62-a87a-45071ef62cef-kube-api-access-9hql2" (OuterVolumeSpecName: "kube-api-access-9hql2") pod "d513dbd9-be74-4c62-a87a-45071ef62cef" (UID: "d513dbd9-be74-4c62-a87a-45071ef62cef"). InnerVolumeSpecName "kube-api-access-9hql2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:30:07 crc kubenswrapper[4973]: I0320 14:30:07.227707 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hql2\" (UniqueName: \"kubernetes.io/projected/d513dbd9-be74-4c62-a87a-45071ef62cef-kube-api-access-9hql2\") on node \"crc\" DevicePath \"\"" Mar 20 14:30:07 crc kubenswrapper[4973]: I0320 14:30:07.365117 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566950-j7xhj" event={"ID":"d513dbd9-be74-4c62-a87a-45071ef62cef","Type":"ContainerDied","Data":"7d6b46a178ca3016a3335f9c2f3b188b5a4c80785dc1e2b47d5238cdd6649dc2"} Mar 20 14:30:07 crc kubenswrapper[4973]: I0320 14:30:07.365172 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d6b46a178ca3016a3335f9c2f3b188b5a4c80785dc1e2b47d5238cdd6649dc2" Mar 20 14:30:07 crc kubenswrapper[4973]: I0320 14:30:07.365144 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566950-j7xhj" Mar 20 14:30:07 crc kubenswrapper[4973]: I0320 14:30:07.366918 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m558r" event={"ID":"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b","Type":"ContainerStarted","Data":"64035c46d03009ab65065eafb545bbc24d4487ece49515419608b4ff2d30bf21"} Mar 20 14:30:08 crc kubenswrapper[4973]: I0320 14:30:08.056678 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566944-qjd9k"] Mar 20 14:30:08 crc kubenswrapper[4973]: I0320 14:30:08.069397 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566944-qjd9k"] Mar 20 14:30:09 crc kubenswrapper[4973]: I0320 14:30:09.388503 4973 generic.go:334] "Generic (PLEG): container finished" podID="f247ce3e-bf6b-4df6-ae71-938e7ca98f3b" containerID="64035c46d03009ab65065eafb545bbc24d4487ece49515419608b4ff2d30bf21" 
exitCode=0 Mar 20 14:30:09 crc kubenswrapper[4973]: I0320 14:30:09.388547 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m558r" event={"ID":"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b","Type":"ContainerDied","Data":"64035c46d03009ab65065eafb545bbc24d4487ece49515419608b4ff2d30bf21"} Mar 20 14:30:09 crc kubenswrapper[4973]: I0320 14:30:09.965484 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73864548-f1e6-4aab-9caa-3d463d39b738" path="/var/lib/kubelet/pods/73864548-f1e6-4aab-9caa-3d463d39b738/volumes" Mar 20 14:30:12 crc kubenswrapper[4973]: I0320 14:30:12.423469 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m558r" event={"ID":"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b","Type":"ContainerStarted","Data":"4033d067e8271fe6a30494d14cb5bf58f60ada6ab27995c43b02e91b141b526e"} Mar 20 14:30:12 crc kubenswrapper[4973]: I0320 14:30:12.638445 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-m558r" podStartSLOduration=4.223431731 podStartE2EDuration="9.63842459s" podCreationTimestamp="2026-03-20 14:30:03 +0000 UTC" firstStartedPulling="2026-03-20 14:30:05.344638735 +0000 UTC m=+4126.088308479" lastFinishedPulling="2026-03-20 14:30:10.759631594 +0000 UTC m=+4131.503301338" observedRunningTime="2026-03-20 14:30:12.631284126 +0000 UTC m=+4133.374953870" watchObservedRunningTime="2026-03-20 14:30:12.63842459 +0000 UTC m=+4133.382094334" Mar 20 14:30:13 crc kubenswrapper[4973]: I0320 14:30:13.438145 4973 generic.go:334] "Generic (PLEG): container finished" podID="af42117e-d394-4dce-9545-798e55957289" containerID="da60518b2436b11c3b81ee81cfefd27ffdc2c7c3b4ade1e9b5d6fa4531a9191f" exitCode=0 Mar 20 14:30:13 crc kubenswrapper[4973]: I0320 14:30:13.438228 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x5vhs" 
event={"ID":"af42117e-d394-4dce-9545-798e55957289","Type":"ContainerDied","Data":"da60518b2436b11c3b81ee81cfefd27ffdc2c7c3b4ade1e9b5d6fa4531a9191f"} Mar 20 14:30:13 crc kubenswrapper[4973]: I0320 14:30:13.452507 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-m558r" Mar 20 14:30:13 crc kubenswrapper[4973]: I0320 14:30:13.452561 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-m558r" Mar 20 14:30:14 crc kubenswrapper[4973]: I0320 14:30:14.453300 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x5vhs" event={"ID":"af42117e-d394-4dce-9545-798e55957289","Type":"ContainerStarted","Data":"4a3a07d42572e6307329c1305d7b4a88e763d2420b899696929029076bab9490"} Mar 20 14:30:14 crc kubenswrapper[4973]: I0320 14:30:14.477828 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-x5vhs" podStartSLOduration=2.832936962 podStartE2EDuration="13.477805813s" podCreationTimestamp="2026-03-20 14:30:01 +0000 UTC" firstStartedPulling="2026-03-20 14:30:03.298321258 +0000 UTC m=+4124.041991012" lastFinishedPulling="2026-03-20 14:30:13.943190119 +0000 UTC m=+4134.686859863" observedRunningTime="2026-03-20 14:30:14.475748167 +0000 UTC m=+4135.219417911" watchObservedRunningTime="2026-03-20 14:30:14.477805813 +0000 UTC m=+4135.221475567" Mar 20 14:30:14 crc kubenswrapper[4973]: I0320 14:30:14.506471 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-m558r" podUID="f247ce3e-bf6b-4df6-ae71-938e7ca98f3b" containerName="registry-server" probeResult="failure" output=< Mar 20 14:30:14 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:30:14 crc kubenswrapper[4973]: > Mar 20 14:30:21 crc kubenswrapper[4973]: I0320 14:30:21.690256 4973 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-x5vhs" Mar 20 14:30:21 crc kubenswrapper[4973]: I0320 14:30:21.691020 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-x5vhs" Mar 20 14:30:23 crc kubenswrapper[4973]: I0320 14:30:23.130971 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x5vhs" podUID="af42117e-d394-4dce-9545-798e55957289" containerName="registry-server" probeResult="failure" output=< Mar 20 14:30:23 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:30:23 crc kubenswrapper[4973]: > Mar 20 14:30:23 crc kubenswrapper[4973]: I0320 14:30:23.503852 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-m558r" Mar 20 14:30:23 crc kubenswrapper[4973]: I0320 14:30:23.560135 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-m558r" Mar 20 14:30:23 crc kubenswrapper[4973]: I0320 14:30:23.741641 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m558r"] Mar 20 14:30:24 crc kubenswrapper[4973]: I0320 14:30:24.585193 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-m558r" podUID="f247ce3e-bf6b-4df6-ae71-938e7ca98f3b" containerName="registry-server" containerID="cri-o://4033d067e8271fe6a30494d14cb5bf58f60ada6ab27995c43b02e91b141b526e" gracePeriod=2 Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.222274 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m558r" Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.392178 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f247ce3e-bf6b-4df6-ae71-938e7ca98f3b-catalog-content\") pod \"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b\" (UID: \"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b\") " Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.392521 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8qjkl\" (UniqueName: \"kubernetes.io/projected/f247ce3e-bf6b-4df6-ae71-938e7ca98f3b-kube-api-access-8qjkl\") pod \"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b\" (UID: \"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b\") " Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.392679 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f247ce3e-bf6b-4df6-ae71-938e7ca98f3b-utilities\") pod \"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b\" (UID: \"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b\") " Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.393254 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f247ce3e-bf6b-4df6-ae71-938e7ca98f3b-utilities" (OuterVolumeSpecName: "utilities") pod "f247ce3e-bf6b-4df6-ae71-938e7ca98f3b" (UID: "f247ce3e-bf6b-4df6-ae71-938e7ca98f3b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.397950 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f247ce3e-bf6b-4df6-ae71-938e7ca98f3b-kube-api-access-8qjkl" (OuterVolumeSpecName: "kube-api-access-8qjkl") pod "f247ce3e-bf6b-4df6-ae71-938e7ca98f3b" (UID: "f247ce3e-bf6b-4df6-ae71-938e7ca98f3b"). InnerVolumeSpecName "kube-api-access-8qjkl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.425172 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f247ce3e-bf6b-4df6-ae71-938e7ca98f3b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f247ce3e-bf6b-4df6-ae71-938e7ca98f3b" (UID: "f247ce3e-bf6b-4df6-ae71-938e7ca98f3b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.495738 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f247ce3e-bf6b-4df6-ae71-938e7ca98f3b-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.495777 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f247ce3e-bf6b-4df6-ae71-938e7ca98f3b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.495792 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8qjkl\" (UniqueName: \"kubernetes.io/projected/f247ce3e-bf6b-4df6-ae71-938e7ca98f3b-kube-api-access-8qjkl\") on node \"crc\" DevicePath \"\"" Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.597783 4973 generic.go:334] "Generic (PLEG): container finished" podID="f247ce3e-bf6b-4df6-ae71-938e7ca98f3b" containerID="4033d067e8271fe6a30494d14cb5bf58f60ada6ab27995c43b02e91b141b526e" exitCode=0 Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.597825 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m558r" event={"ID":"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b","Type":"ContainerDied","Data":"4033d067e8271fe6a30494d14cb5bf58f60ada6ab27995c43b02e91b141b526e"} Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.597849 4973 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-m558r" Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.597886 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-m558r" event={"ID":"f247ce3e-bf6b-4df6-ae71-938e7ca98f3b","Type":"ContainerDied","Data":"27393c595a7911f678cbfde26016bed8c3b6506b33614b51d8cc727008aa22cc"} Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.597914 4973 scope.go:117] "RemoveContainer" containerID="4033d067e8271fe6a30494d14cb5bf58f60ada6ab27995c43b02e91b141b526e" Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.629125 4973 scope.go:117] "RemoveContainer" containerID="64035c46d03009ab65065eafb545bbc24d4487ece49515419608b4ff2d30bf21" Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.631962 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-m558r"] Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.643897 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-m558r"] Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.662190 4973 scope.go:117] "RemoveContainer" containerID="5a4286889e59386ede25c73a00cac4eb425cac0f7bc7059232ad51a93fbe4090" Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.710934 4973 scope.go:117] "RemoveContainer" containerID="4033d067e8271fe6a30494d14cb5bf58f60ada6ab27995c43b02e91b141b526e" Mar 20 14:30:25 crc kubenswrapper[4973]: E0320 14:30:25.713008 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4033d067e8271fe6a30494d14cb5bf58f60ada6ab27995c43b02e91b141b526e\": container with ID starting with 4033d067e8271fe6a30494d14cb5bf58f60ada6ab27995c43b02e91b141b526e not found: ID does not exist" containerID="4033d067e8271fe6a30494d14cb5bf58f60ada6ab27995c43b02e91b141b526e" Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.713075 4973 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4033d067e8271fe6a30494d14cb5bf58f60ada6ab27995c43b02e91b141b526e"} err="failed to get container status \"4033d067e8271fe6a30494d14cb5bf58f60ada6ab27995c43b02e91b141b526e\": rpc error: code = NotFound desc = could not find container \"4033d067e8271fe6a30494d14cb5bf58f60ada6ab27995c43b02e91b141b526e\": container with ID starting with 4033d067e8271fe6a30494d14cb5bf58f60ada6ab27995c43b02e91b141b526e not found: ID does not exist" Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.713117 4973 scope.go:117] "RemoveContainer" containerID="64035c46d03009ab65065eafb545bbc24d4487ece49515419608b4ff2d30bf21" Mar 20 14:30:25 crc kubenswrapper[4973]: E0320 14:30:25.713453 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64035c46d03009ab65065eafb545bbc24d4487ece49515419608b4ff2d30bf21\": container with ID starting with 64035c46d03009ab65065eafb545bbc24d4487ece49515419608b4ff2d30bf21 not found: ID does not exist" containerID="64035c46d03009ab65065eafb545bbc24d4487ece49515419608b4ff2d30bf21" Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.713487 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64035c46d03009ab65065eafb545bbc24d4487ece49515419608b4ff2d30bf21"} err="failed to get container status \"64035c46d03009ab65065eafb545bbc24d4487ece49515419608b4ff2d30bf21\": rpc error: code = NotFound desc = could not find container \"64035c46d03009ab65065eafb545bbc24d4487ece49515419608b4ff2d30bf21\": container with ID starting with 64035c46d03009ab65065eafb545bbc24d4487ece49515419608b4ff2d30bf21 not found: ID does not exist" Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.713509 4973 scope.go:117] "RemoveContainer" containerID="5a4286889e59386ede25c73a00cac4eb425cac0f7bc7059232ad51a93fbe4090" Mar 20 14:30:25 crc kubenswrapper[4973]: E0320 
14:30:25.713751 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a4286889e59386ede25c73a00cac4eb425cac0f7bc7059232ad51a93fbe4090\": container with ID starting with 5a4286889e59386ede25c73a00cac4eb425cac0f7bc7059232ad51a93fbe4090 not found: ID does not exist" containerID="5a4286889e59386ede25c73a00cac4eb425cac0f7bc7059232ad51a93fbe4090" Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.713781 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a4286889e59386ede25c73a00cac4eb425cac0f7bc7059232ad51a93fbe4090"} err="failed to get container status \"5a4286889e59386ede25c73a00cac4eb425cac0f7bc7059232ad51a93fbe4090\": rpc error: code = NotFound desc = could not find container \"5a4286889e59386ede25c73a00cac4eb425cac0f7bc7059232ad51a93fbe4090\": container with ID starting with 5a4286889e59386ede25c73a00cac4eb425cac0f7bc7059232ad51a93fbe4090 not found: ID does not exist" Mar 20 14:30:25 crc kubenswrapper[4973]: I0320 14:30:25.965520 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f247ce3e-bf6b-4df6-ae71-938e7ca98f3b" path="/var/lib/kubelet/pods/f247ce3e-bf6b-4df6-ae71-938e7ca98f3b/volumes" Mar 20 14:30:32 crc kubenswrapper[4973]: I0320 14:30:32.748616 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-x5vhs" podUID="af42117e-d394-4dce-9545-798e55957289" containerName="registry-server" probeResult="failure" output=< Mar 20 14:30:32 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:30:32 crc kubenswrapper[4973]: > Mar 20 14:30:41 crc kubenswrapper[4973]: I0320 14:30:41.755510 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-x5vhs" Mar 20 14:30:41 crc kubenswrapper[4973]: I0320 14:30:41.817971 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-marketplace/redhat-operators-x5vhs" Mar 20 14:30:42 crc kubenswrapper[4973]: I0320 14:30:42.885391 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x5vhs"] Mar 20 14:30:42 crc kubenswrapper[4973]: I0320 14:30:42.885940 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-x5vhs" podUID="af42117e-d394-4dce-9545-798e55957289" containerName="registry-server" containerID="cri-o://4a3a07d42572e6307329c1305d7b4a88e763d2420b899696929029076bab9490" gracePeriod=2 Mar 20 14:30:43 crc kubenswrapper[4973]: I0320 14:30:43.320451 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:30:43 crc kubenswrapper[4973]: I0320 14:30:43.320798 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:30:43 crc kubenswrapper[4973]: I0320 14:30:43.829150 4973 generic.go:334] "Generic (PLEG): container finished" podID="af42117e-d394-4dce-9545-798e55957289" containerID="4a3a07d42572e6307329c1305d7b4a88e763d2420b899696929029076bab9490" exitCode=0 Mar 20 14:30:43 crc kubenswrapper[4973]: I0320 14:30:43.829202 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x5vhs" event={"ID":"af42117e-d394-4dce-9545-798e55957289","Type":"ContainerDied","Data":"4a3a07d42572e6307329c1305d7b4a88e763d2420b899696929029076bab9490"} Mar 20 14:30:44 crc kubenswrapper[4973]: I0320 14:30:44.023035 4973 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-x5vhs" Mar 20 14:30:44 crc kubenswrapper[4973]: I0320 14:30:44.177072 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af42117e-d394-4dce-9545-798e55957289-catalog-content\") pod \"af42117e-d394-4dce-9545-798e55957289\" (UID: \"af42117e-d394-4dce-9545-798e55957289\") " Mar 20 14:30:44 crc kubenswrapper[4973]: I0320 14:30:44.177183 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af42117e-d394-4dce-9545-798e55957289-utilities\") pod \"af42117e-d394-4dce-9545-798e55957289\" (UID: \"af42117e-d394-4dce-9545-798e55957289\") " Mar 20 14:30:44 crc kubenswrapper[4973]: I0320 14:30:44.177275 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlb5j\" (UniqueName: \"kubernetes.io/projected/af42117e-d394-4dce-9545-798e55957289-kube-api-access-mlb5j\") pod \"af42117e-d394-4dce-9545-798e55957289\" (UID: \"af42117e-d394-4dce-9545-798e55957289\") " Mar 20 14:30:44 crc kubenswrapper[4973]: I0320 14:30:44.178241 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af42117e-d394-4dce-9545-798e55957289-utilities" (OuterVolumeSpecName: "utilities") pod "af42117e-d394-4dce-9545-798e55957289" (UID: "af42117e-d394-4dce-9545-798e55957289"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:30:44 crc kubenswrapper[4973]: I0320 14:30:44.187386 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af42117e-d394-4dce-9545-798e55957289-kube-api-access-mlb5j" (OuterVolumeSpecName: "kube-api-access-mlb5j") pod "af42117e-d394-4dce-9545-798e55957289" (UID: "af42117e-d394-4dce-9545-798e55957289"). 
InnerVolumeSpecName "kube-api-access-mlb5j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:30:44 crc kubenswrapper[4973]: I0320 14:30:44.280451 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/af42117e-d394-4dce-9545-798e55957289-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:30:44 crc kubenswrapper[4973]: I0320 14:30:44.280483 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlb5j\" (UniqueName: \"kubernetes.io/projected/af42117e-d394-4dce-9545-798e55957289-kube-api-access-mlb5j\") on node \"crc\" DevicePath \"\"" Mar 20 14:30:44 crc kubenswrapper[4973]: I0320 14:30:44.337770 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/af42117e-d394-4dce-9545-798e55957289-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "af42117e-d394-4dce-9545-798e55957289" (UID: "af42117e-d394-4dce-9545-798e55957289"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:30:44 crc kubenswrapper[4973]: I0320 14:30:44.384083 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/af42117e-d394-4dce-9545-798e55957289-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:30:44 crc kubenswrapper[4973]: I0320 14:30:44.845138 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-x5vhs" event={"ID":"af42117e-d394-4dce-9545-798e55957289","Type":"ContainerDied","Data":"f1ad504e49cc642497586909d5b376283c9e8d85e4b6e0627134eb15294669ac"} Mar 20 14:30:44 crc kubenswrapper[4973]: I0320 14:30:44.845201 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-x5vhs" Mar 20 14:30:44 crc kubenswrapper[4973]: I0320 14:30:44.845235 4973 scope.go:117] "RemoveContainer" containerID="4a3a07d42572e6307329c1305d7b4a88e763d2420b899696929029076bab9490" Mar 20 14:30:44 crc kubenswrapper[4973]: I0320 14:30:44.886068 4973 scope.go:117] "RemoveContainer" containerID="da60518b2436b11c3b81ee81cfefd27ffdc2c7c3b4ade1e9b5d6fa4531a9191f" Mar 20 14:30:44 crc kubenswrapper[4973]: I0320 14:30:44.894004 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-x5vhs"] Mar 20 14:30:44 crc kubenswrapper[4973]: I0320 14:30:44.908911 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-x5vhs"] Mar 20 14:30:44 crc kubenswrapper[4973]: I0320 14:30:44.922628 4973 scope.go:117] "RemoveContainer" containerID="5dcea486ed5ca325a90b553949cab1e71fcd38f24d2bdfc4132e4cd6a30ab755" Mar 20 14:30:45 crc kubenswrapper[4973]: I0320 14:30:45.963243 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af42117e-d394-4dce-9545-798e55957289" path="/var/lib/kubelet/pods/af42117e-d394-4dce-9545-798e55957289/volumes" Mar 20 14:31:03 crc kubenswrapper[4973]: I0320 14:31:03.979170 4973 scope.go:117] "RemoveContainer" containerID="214278786dd60cfbce2505c105d7fbd9130f972bb06bc4b9c2de5e8cec410167" Mar 20 14:31:04 crc kubenswrapper[4973]: I0320 14:31:04.111614 4973 scope.go:117] "RemoveContainer" containerID="78765263a5bf6fb473460c8c3a80c2921bc4fbf67285e3fb3a6a80d8e9a64ed9" Mar 20 14:31:13 crc kubenswrapper[4973]: I0320 14:31:13.320428 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:31:13 crc kubenswrapper[4973]: I0320 14:31:13.321053 4973 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:31:43 crc kubenswrapper[4973]: I0320 14:31:43.320367 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:31:43 crc kubenswrapper[4973]: I0320 14:31:43.320928 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:31:43 crc kubenswrapper[4973]: I0320 14:31:43.320972 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 14:31:43 crc kubenswrapper[4973]: I0320 14:31:43.321869 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bfc4b31266e980584c244bdcd5c59ee40888c1197c32fc1c6fe5dc06b7ab0740"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:31:43 crc kubenswrapper[4973]: I0320 14:31:43.321919 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" 
containerID="cri-o://bfc4b31266e980584c244bdcd5c59ee40888c1197c32fc1c6fe5dc06b7ab0740" gracePeriod=600 Mar 20 14:31:44 crc kubenswrapper[4973]: I0320 14:31:44.557925 4973 generic.go:334] "Generic (PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="bfc4b31266e980584c244bdcd5c59ee40888c1197c32fc1c6fe5dc06b7ab0740" exitCode=0 Mar 20 14:31:44 crc kubenswrapper[4973]: I0320 14:31:44.557999 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"bfc4b31266e980584c244bdcd5c59ee40888c1197c32fc1c6fe5dc06b7ab0740"} Mar 20 14:31:44 crc kubenswrapper[4973]: I0320 14:31:44.558502 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749"} Mar 20 14:31:44 crc kubenswrapper[4973]: I0320 14:31:44.558529 4973 scope.go:117] "RemoveContainer" containerID="d105b09301d9be2a872a6e6f238a34f5fc37b1dbda04f52fc3c79f7780a50309" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.149133 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566952-v74sn"] Mar 20 14:32:00 crc kubenswrapper[4973]: E0320 14:32:00.150465 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af42117e-d394-4dce-9545-798e55957289" containerName="registry-server" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.150483 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="af42117e-d394-4dce-9545-798e55957289" containerName="registry-server" Mar 20 14:32:00 crc kubenswrapper[4973]: E0320 14:32:00.150522 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="210e7550-8743-43ef-99c8-ea8ca59dc71d" containerName="collect-profiles" Mar 20 
14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.150531 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="210e7550-8743-43ef-99c8-ea8ca59dc71d" containerName="collect-profiles" Mar 20 14:32:00 crc kubenswrapper[4973]: E0320 14:32:00.150556 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d513dbd9-be74-4c62-a87a-45071ef62cef" containerName="oc" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.150565 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="d513dbd9-be74-4c62-a87a-45071ef62cef" containerName="oc" Mar 20 14:32:00 crc kubenswrapper[4973]: E0320 14:32:00.150599 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f247ce3e-bf6b-4df6-ae71-938e7ca98f3b" containerName="extract-content" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.150605 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f247ce3e-bf6b-4df6-ae71-938e7ca98f3b" containerName="extract-content" Mar 20 14:32:00 crc kubenswrapper[4973]: E0320 14:32:00.150618 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f247ce3e-bf6b-4df6-ae71-938e7ca98f3b" containerName="extract-utilities" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.150626 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f247ce3e-bf6b-4df6-ae71-938e7ca98f3b" containerName="extract-utilities" Mar 20 14:32:00 crc kubenswrapper[4973]: E0320 14:32:00.150636 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af42117e-d394-4dce-9545-798e55957289" containerName="extract-content" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.150642 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="af42117e-d394-4dce-9545-798e55957289" containerName="extract-content" Mar 20 14:32:00 crc kubenswrapper[4973]: E0320 14:32:00.150658 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af42117e-d394-4dce-9545-798e55957289" containerName="extract-utilities" Mar 20 14:32:00 crc kubenswrapper[4973]: 
I0320 14:32:00.150665 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="af42117e-d394-4dce-9545-798e55957289" containerName="extract-utilities" Mar 20 14:32:00 crc kubenswrapper[4973]: E0320 14:32:00.150677 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f247ce3e-bf6b-4df6-ae71-938e7ca98f3b" containerName="registry-server" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.150682 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f247ce3e-bf6b-4df6-ae71-938e7ca98f3b" containerName="registry-server" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.150931 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="af42117e-d394-4dce-9545-798e55957289" containerName="registry-server" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.150941 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="210e7550-8743-43ef-99c8-ea8ca59dc71d" containerName="collect-profiles" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.150963 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="f247ce3e-bf6b-4df6-ae71-938e7ca98f3b" containerName="registry-server" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.150969 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="d513dbd9-be74-4c62-a87a-45071ef62cef" containerName="oc" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.151967 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566952-v74sn" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.155670 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.155916 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.156056 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.161422 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566952-v74sn"] Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.215313 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjrkj\" (UniqueName: \"kubernetes.io/projected/ffe8cf8f-857d-49e8-ac4f-d827991dadc7-kube-api-access-fjrkj\") pod \"auto-csr-approver-29566952-v74sn\" (UID: \"ffe8cf8f-857d-49e8-ac4f-d827991dadc7\") " pod="openshift-infra/auto-csr-approver-29566952-v74sn" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.318667 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjrkj\" (UniqueName: \"kubernetes.io/projected/ffe8cf8f-857d-49e8-ac4f-d827991dadc7-kube-api-access-fjrkj\") pod \"auto-csr-approver-29566952-v74sn\" (UID: \"ffe8cf8f-857d-49e8-ac4f-d827991dadc7\") " pod="openshift-infra/auto-csr-approver-29566952-v74sn" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.336749 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjrkj\" (UniqueName: \"kubernetes.io/projected/ffe8cf8f-857d-49e8-ac4f-d827991dadc7-kube-api-access-fjrkj\") pod \"auto-csr-approver-29566952-v74sn\" (UID: \"ffe8cf8f-857d-49e8-ac4f-d827991dadc7\") " 
pod="openshift-infra/auto-csr-approver-29566952-v74sn" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.473967 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566952-v74sn" Mar 20 14:32:00 crc kubenswrapper[4973]: I0320 14:32:00.988364 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566952-v74sn"] Mar 20 14:32:01 crc kubenswrapper[4973]: I0320 14:32:01.772108 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566952-v74sn" event={"ID":"ffe8cf8f-857d-49e8-ac4f-d827991dadc7","Type":"ContainerStarted","Data":"5c024558a7e5584bb40cb502242252e72ff47f9d71febd62923ddd32c92ff677"} Mar 20 14:32:02 crc kubenswrapper[4973]: I0320 14:32:02.790550 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566952-v74sn" event={"ID":"ffe8cf8f-857d-49e8-ac4f-d827991dadc7","Type":"ContainerStarted","Data":"0976927a13e07eaff8118bb7c410d30f774b887a279fa9ef040fa689eeb4e94e"} Mar 20 14:32:02 crc kubenswrapper[4973]: I0320 14:32:02.813681 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566952-v74sn" podStartSLOduration=1.701991997 podStartE2EDuration="2.81366151s" podCreationTimestamp="2026-03-20 14:32:00 +0000 UTC" firstStartedPulling="2026-03-20 14:32:01.000600483 +0000 UTC m=+4241.744270247" lastFinishedPulling="2026-03-20 14:32:02.112270016 +0000 UTC m=+4242.855939760" observedRunningTime="2026-03-20 14:32:02.813462655 +0000 UTC m=+4243.557132409" watchObservedRunningTime="2026-03-20 14:32:02.81366151 +0000 UTC m=+4243.557331254" Mar 20 14:32:03 crc kubenswrapper[4973]: I0320 14:32:03.803641 4973 generic.go:334] "Generic (PLEG): container finished" podID="ffe8cf8f-857d-49e8-ac4f-d827991dadc7" containerID="0976927a13e07eaff8118bb7c410d30f774b887a279fa9ef040fa689eeb4e94e" exitCode=0 Mar 20 14:32:03 crc 
kubenswrapper[4973]: I0320 14:32:03.803773 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566952-v74sn" event={"ID":"ffe8cf8f-857d-49e8-ac4f-d827991dadc7","Type":"ContainerDied","Data":"0976927a13e07eaff8118bb7c410d30f774b887a279fa9ef040fa689eeb4e94e"} Mar 20 14:32:05 crc kubenswrapper[4973]: I0320 14:32:05.274273 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566952-v74sn" Mar 20 14:32:05 crc kubenswrapper[4973]: I0320 14:32:05.352701 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjrkj\" (UniqueName: \"kubernetes.io/projected/ffe8cf8f-857d-49e8-ac4f-d827991dadc7-kube-api-access-fjrkj\") pod \"ffe8cf8f-857d-49e8-ac4f-d827991dadc7\" (UID: \"ffe8cf8f-857d-49e8-ac4f-d827991dadc7\") " Mar 20 14:32:05 crc kubenswrapper[4973]: I0320 14:32:05.360280 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ffe8cf8f-857d-49e8-ac4f-d827991dadc7-kube-api-access-fjrkj" (OuterVolumeSpecName: "kube-api-access-fjrkj") pod "ffe8cf8f-857d-49e8-ac4f-d827991dadc7" (UID: "ffe8cf8f-857d-49e8-ac4f-d827991dadc7"). InnerVolumeSpecName "kube-api-access-fjrkj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:32:05 crc kubenswrapper[4973]: I0320 14:32:05.457745 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjrkj\" (UniqueName: \"kubernetes.io/projected/ffe8cf8f-857d-49e8-ac4f-d827991dadc7-kube-api-access-fjrkj\") on node \"crc\" DevicePath \"\"" Mar 20 14:32:05 crc kubenswrapper[4973]: I0320 14:32:05.835887 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566952-v74sn" event={"ID":"ffe8cf8f-857d-49e8-ac4f-d827991dadc7","Type":"ContainerDied","Data":"5c024558a7e5584bb40cb502242252e72ff47f9d71febd62923ddd32c92ff677"} Mar 20 14:32:05 crc kubenswrapper[4973]: I0320 14:32:05.835933 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c024558a7e5584bb40cb502242252e72ff47f9d71febd62923ddd32c92ff677" Mar 20 14:32:05 crc kubenswrapper[4973]: I0320 14:32:05.835988 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566952-v74sn" Mar 20 14:32:05 crc kubenswrapper[4973]: I0320 14:32:05.913820 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566946-smptb"] Mar 20 14:32:05 crc kubenswrapper[4973]: I0320 14:32:05.940063 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566946-smptb"] Mar 20 14:32:05 crc kubenswrapper[4973]: I0320 14:32:05.964482 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2702fc74-45a5-4404-8067-9de35aafcac8" path="/var/lib/kubelet/pods/2702fc74-45a5-4404-8067-9de35aafcac8/volumes" Mar 20 14:33:04 crc kubenswrapper[4973]: I0320 14:33:04.324567 4973 scope.go:117] "RemoveContainer" containerID="4fd369689be79f50de134d5b6613cbb69276843f9c0ecd37c6d3c88f7bc8d22a" Mar 20 14:33:18 crc kubenswrapper[4973]: I0320 14:33:18.168007 4973 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/community-operators-79dxs"] Mar 20 14:33:18 crc kubenswrapper[4973]: E0320 14:33:18.169331 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ffe8cf8f-857d-49e8-ac4f-d827991dadc7" containerName="oc" Mar 20 14:33:18 crc kubenswrapper[4973]: I0320 14:33:18.169453 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="ffe8cf8f-857d-49e8-ac4f-d827991dadc7" containerName="oc" Mar 20 14:33:18 crc kubenswrapper[4973]: I0320 14:33:18.169771 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="ffe8cf8f-857d-49e8-ac4f-d827991dadc7" containerName="oc" Mar 20 14:33:18 crc kubenswrapper[4973]: I0320 14:33:18.171779 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-79dxs" Mar 20 14:33:18 crc kubenswrapper[4973]: I0320 14:33:18.183658 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-79dxs"] Mar 20 14:33:18 crc kubenswrapper[4973]: I0320 14:33:18.215088 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a404422-9438-4838-a3b3-5086fe6459e8-catalog-content\") pod \"community-operators-79dxs\" (UID: \"5a404422-9438-4838-a3b3-5086fe6459e8\") " pod="openshift-marketplace/community-operators-79dxs" Mar 20 14:33:18 crc kubenswrapper[4973]: I0320 14:33:18.215212 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wqwhf\" (UniqueName: \"kubernetes.io/projected/5a404422-9438-4838-a3b3-5086fe6459e8-kube-api-access-wqwhf\") pod \"community-operators-79dxs\" (UID: \"5a404422-9438-4838-a3b3-5086fe6459e8\") " pod="openshift-marketplace/community-operators-79dxs" Mar 20 14:33:18 crc kubenswrapper[4973]: I0320 14:33:18.215245 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a404422-9438-4838-a3b3-5086fe6459e8-utilities\") pod \"community-operators-79dxs\" (UID: \"5a404422-9438-4838-a3b3-5086fe6459e8\") " pod="openshift-marketplace/community-operators-79dxs" Mar 20 14:33:18 crc kubenswrapper[4973]: I0320 14:33:18.317867 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a404422-9438-4838-a3b3-5086fe6459e8-catalog-content\") pod \"community-operators-79dxs\" (UID: \"5a404422-9438-4838-a3b3-5086fe6459e8\") " pod="openshift-marketplace/community-operators-79dxs" Mar 20 14:33:18 crc kubenswrapper[4973]: I0320 14:33:18.318010 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wqwhf\" (UniqueName: \"kubernetes.io/projected/5a404422-9438-4838-a3b3-5086fe6459e8-kube-api-access-wqwhf\") pod \"community-operators-79dxs\" (UID: \"5a404422-9438-4838-a3b3-5086fe6459e8\") " pod="openshift-marketplace/community-operators-79dxs" Mar 20 14:33:18 crc kubenswrapper[4973]: I0320 14:33:18.318046 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a404422-9438-4838-a3b3-5086fe6459e8-utilities\") pod \"community-operators-79dxs\" (UID: \"5a404422-9438-4838-a3b3-5086fe6459e8\") " pod="openshift-marketplace/community-operators-79dxs" Mar 20 14:33:18 crc kubenswrapper[4973]: I0320 14:33:18.318633 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a404422-9438-4838-a3b3-5086fe6459e8-catalog-content\") pod \"community-operators-79dxs\" (UID: \"5a404422-9438-4838-a3b3-5086fe6459e8\") " pod="openshift-marketplace/community-operators-79dxs" Mar 20 14:33:18 crc kubenswrapper[4973]: I0320 14:33:18.321858 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/5a404422-9438-4838-a3b3-5086fe6459e8-utilities\") pod \"community-operators-79dxs\" (UID: \"5a404422-9438-4838-a3b3-5086fe6459e8\") " pod="openshift-marketplace/community-operators-79dxs" Mar 20 14:33:18 crc kubenswrapper[4973]: I0320 14:33:18.351522 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wqwhf\" (UniqueName: \"kubernetes.io/projected/5a404422-9438-4838-a3b3-5086fe6459e8-kube-api-access-wqwhf\") pod \"community-operators-79dxs\" (UID: \"5a404422-9438-4838-a3b3-5086fe6459e8\") " pod="openshift-marketplace/community-operators-79dxs" Mar 20 14:33:18 crc kubenswrapper[4973]: I0320 14:33:18.497084 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-79dxs" Mar 20 14:33:19 crc kubenswrapper[4973]: I0320 14:33:19.148682 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-79dxs"] Mar 20 14:33:19 crc kubenswrapper[4973]: I0320 14:33:19.755977 4973 generic.go:334] "Generic (PLEG): container finished" podID="5a404422-9438-4838-a3b3-5086fe6459e8" containerID="18d938ef101b604362888b79452909dc5a5fddcce6bef273c85b8926f4b30759" exitCode=0 Mar 20 14:33:19 crc kubenswrapper[4973]: I0320 14:33:19.756121 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-79dxs" event={"ID":"5a404422-9438-4838-a3b3-5086fe6459e8","Type":"ContainerDied","Data":"18d938ef101b604362888b79452909dc5a5fddcce6bef273c85b8926f4b30759"} Mar 20 14:33:19 crc kubenswrapper[4973]: I0320 14:33:19.756244 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-79dxs" event={"ID":"5a404422-9438-4838-a3b3-5086fe6459e8","Type":"ContainerStarted","Data":"fd6be580253469150c8db67b0f95c59f0e4e6cde916d0c8b9fed02ef3dfaef6e"} Mar 20 14:33:21 crc kubenswrapper[4973]: I0320 14:33:21.788261 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-79dxs" event={"ID":"5a404422-9438-4838-a3b3-5086fe6459e8","Type":"ContainerStarted","Data":"0da8b40db954db445d66db0bccf61fb5bffaa24015e5d8744d9310a8adb935a4"} Mar 20 14:33:22 crc kubenswrapper[4973]: I0320 14:33:22.808123 4973 generic.go:334] "Generic (PLEG): container finished" podID="5a404422-9438-4838-a3b3-5086fe6459e8" containerID="0da8b40db954db445d66db0bccf61fb5bffaa24015e5d8744d9310a8adb935a4" exitCode=0 Mar 20 14:33:22 crc kubenswrapper[4973]: I0320 14:33:22.808919 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-79dxs" event={"ID":"5a404422-9438-4838-a3b3-5086fe6459e8","Type":"ContainerDied","Data":"0da8b40db954db445d66db0bccf61fb5bffaa24015e5d8744d9310a8adb935a4"} Mar 20 14:33:23 crc kubenswrapper[4973]: I0320 14:33:23.822957 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-79dxs" event={"ID":"5a404422-9438-4838-a3b3-5086fe6459e8","Type":"ContainerStarted","Data":"f6b70832f633d654942f107fec1ff3d8ca6fb28b8847419bd96253d97b9c61e8"} Mar 20 14:33:23 crc kubenswrapper[4973]: I0320 14:33:23.847898 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-79dxs" podStartSLOduration=2.35241624 podStartE2EDuration="5.847876776s" podCreationTimestamp="2026-03-20 14:33:18 +0000 UTC" firstStartedPulling="2026-03-20 14:33:19.757867115 +0000 UTC m=+4320.501536859" lastFinishedPulling="2026-03-20 14:33:23.253327651 +0000 UTC m=+4323.996997395" observedRunningTime="2026-03-20 14:33:23.842207981 +0000 UTC m=+4324.585877735" watchObservedRunningTime="2026-03-20 14:33:23.847876776 +0000 UTC m=+4324.591546520" Mar 20 14:33:28 crc kubenswrapper[4973]: I0320 14:33:28.498179 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-79dxs" Mar 20 14:33:28 crc kubenswrapper[4973]: I0320 14:33:28.498711 4973 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-79dxs" Mar 20 14:33:28 crc kubenswrapper[4973]: I0320 14:33:28.550597 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-79dxs" Mar 20 14:33:28 crc kubenswrapper[4973]: I0320 14:33:28.929876 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-79dxs" Mar 20 14:33:28 crc kubenswrapper[4973]: I0320 14:33:28.982757 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-79dxs"] Mar 20 14:33:30 crc kubenswrapper[4973]: I0320 14:33:30.896496 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-79dxs" podUID="5a404422-9438-4838-a3b3-5086fe6459e8" containerName="registry-server" containerID="cri-o://f6b70832f633d654942f107fec1ff3d8ca6fb28b8847419bd96253d97b9c61e8" gracePeriod=2 Mar 20 14:33:31 crc kubenswrapper[4973]: I0320 14:33:31.469866 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-79dxs" Mar 20 14:33:31 crc kubenswrapper[4973]: I0320 14:33:31.639285 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wqwhf\" (UniqueName: \"kubernetes.io/projected/5a404422-9438-4838-a3b3-5086fe6459e8-kube-api-access-wqwhf\") pod \"5a404422-9438-4838-a3b3-5086fe6459e8\" (UID: \"5a404422-9438-4838-a3b3-5086fe6459e8\") " Mar 20 14:33:31 crc kubenswrapper[4973]: I0320 14:33:31.639993 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a404422-9438-4838-a3b3-5086fe6459e8-catalog-content\") pod \"5a404422-9438-4838-a3b3-5086fe6459e8\" (UID: \"5a404422-9438-4838-a3b3-5086fe6459e8\") " Mar 20 14:33:31 crc kubenswrapper[4973]: I0320 14:33:31.640055 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a404422-9438-4838-a3b3-5086fe6459e8-utilities\") pod \"5a404422-9438-4838-a3b3-5086fe6459e8\" (UID: \"5a404422-9438-4838-a3b3-5086fe6459e8\") " Mar 20 14:33:31 crc kubenswrapper[4973]: I0320 14:33:31.642074 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a404422-9438-4838-a3b3-5086fe6459e8-utilities" (OuterVolumeSpecName: "utilities") pod "5a404422-9438-4838-a3b3-5086fe6459e8" (UID: "5a404422-9438-4838-a3b3-5086fe6459e8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:33:31 crc kubenswrapper[4973]: I0320 14:33:31.651714 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a404422-9438-4838-a3b3-5086fe6459e8-kube-api-access-wqwhf" (OuterVolumeSpecName: "kube-api-access-wqwhf") pod "5a404422-9438-4838-a3b3-5086fe6459e8" (UID: "5a404422-9438-4838-a3b3-5086fe6459e8"). InnerVolumeSpecName "kube-api-access-wqwhf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:33:31 crc kubenswrapper[4973]: I0320 14:33:31.693875 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5a404422-9438-4838-a3b3-5086fe6459e8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5a404422-9438-4838-a3b3-5086fe6459e8" (UID: "5a404422-9438-4838-a3b3-5086fe6459e8"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:33:31 crc kubenswrapper[4973]: I0320 14:33:31.743207 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5a404422-9438-4838-a3b3-5086fe6459e8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:33:31 crc kubenswrapper[4973]: I0320 14:33:31.743248 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5a404422-9438-4838-a3b3-5086fe6459e8-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:33:31 crc kubenswrapper[4973]: I0320 14:33:31.743259 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wqwhf\" (UniqueName: \"kubernetes.io/projected/5a404422-9438-4838-a3b3-5086fe6459e8-kube-api-access-wqwhf\") on node \"crc\" DevicePath \"\"" Mar 20 14:33:31 crc kubenswrapper[4973]: I0320 14:33:31.911539 4973 generic.go:334] "Generic (PLEG): container finished" podID="5a404422-9438-4838-a3b3-5086fe6459e8" containerID="f6b70832f633d654942f107fec1ff3d8ca6fb28b8847419bd96253d97b9c61e8" exitCode=0 Mar 20 14:33:31 crc kubenswrapper[4973]: I0320 14:33:31.911603 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-79dxs" event={"ID":"5a404422-9438-4838-a3b3-5086fe6459e8","Type":"ContainerDied","Data":"f6b70832f633d654942f107fec1ff3d8ca6fb28b8847419bd96253d97b9c61e8"} Mar 20 14:33:31 crc kubenswrapper[4973]: I0320 14:33:31.911607 4973 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-79dxs" Mar 20 14:33:31 crc kubenswrapper[4973]: I0320 14:33:31.911639 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-79dxs" event={"ID":"5a404422-9438-4838-a3b3-5086fe6459e8","Type":"ContainerDied","Data":"fd6be580253469150c8db67b0f95c59f0e4e6cde916d0c8b9fed02ef3dfaef6e"} Mar 20 14:33:31 crc kubenswrapper[4973]: I0320 14:33:31.911660 4973 scope.go:117] "RemoveContainer" containerID="f6b70832f633d654942f107fec1ff3d8ca6fb28b8847419bd96253d97b9c61e8" Mar 20 14:33:31 crc kubenswrapper[4973]: I0320 14:33:31.961907 4973 scope.go:117] "RemoveContainer" containerID="0da8b40db954db445d66db0bccf61fb5bffaa24015e5d8744d9310a8adb935a4" Mar 20 14:33:31 crc kubenswrapper[4973]: I0320 14:33:31.973046 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-79dxs"] Mar 20 14:33:31 crc kubenswrapper[4973]: I0320 14:33:31.979560 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-79dxs"] Mar 20 14:33:32 crc kubenswrapper[4973]: I0320 14:33:32.018789 4973 scope.go:117] "RemoveContainer" containerID="18d938ef101b604362888b79452909dc5a5fddcce6bef273c85b8926f4b30759" Mar 20 14:33:32 crc kubenswrapper[4973]: I0320 14:33:32.066205 4973 scope.go:117] "RemoveContainer" containerID="f6b70832f633d654942f107fec1ff3d8ca6fb28b8847419bd96253d97b9c61e8" Mar 20 14:33:32 crc kubenswrapper[4973]: E0320 14:33:32.066813 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6b70832f633d654942f107fec1ff3d8ca6fb28b8847419bd96253d97b9c61e8\": container with ID starting with f6b70832f633d654942f107fec1ff3d8ca6fb28b8847419bd96253d97b9c61e8 not found: ID does not exist" containerID="f6b70832f633d654942f107fec1ff3d8ca6fb28b8847419bd96253d97b9c61e8" Mar 20 14:33:32 crc kubenswrapper[4973]: I0320 14:33:32.066850 
4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6b70832f633d654942f107fec1ff3d8ca6fb28b8847419bd96253d97b9c61e8"} err="failed to get container status \"f6b70832f633d654942f107fec1ff3d8ca6fb28b8847419bd96253d97b9c61e8\": rpc error: code = NotFound desc = could not find container \"f6b70832f633d654942f107fec1ff3d8ca6fb28b8847419bd96253d97b9c61e8\": container with ID starting with f6b70832f633d654942f107fec1ff3d8ca6fb28b8847419bd96253d97b9c61e8 not found: ID does not exist" Mar 20 14:33:32 crc kubenswrapper[4973]: I0320 14:33:32.066871 4973 scope.go:117] "RemoveContainer" containerID="0da8b40db954db445d66db0bccf61fb5bffaa24015e5d8744d9310a8adb935a4" Mar 20 14:33:32 crc kubenswrapper[4973]: E0320 14:33:32.067271 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0da8b40db954db445d66db0bccf61fb5bffaa24015e5d8744d9310a8adb935a4\": container with ID starting with 0da8b40db954db445d66db0bccf61fb5bffaa24015e5d8744d9310a8adb935a4 not found: ID does not exist" containerID="0da8b40db954db445d66db0bccf61fb5bffaa24015e5d8744d9310a8adb935a4" Mar 20 14:33:32 crc kubenswrapper[4973]: I0320 14:33:32.067317 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0da8b40db954db445d66db0bccf61fb5bffaa24015e5d8744d9310a8adb935a4"} err="failed to get container status \"0da8b40db954db445d66db0bccf61fb5bffaa24015e5d8744d9310a8adb935a4\": rpc error: code = NotFound desc = could not find container \"0da8b40db954db445d66db0bccf61fb5bffaa24015e5d8744d9310a8adb935a4\": container with ID starting with 0da8b40db954db445d66db0bccf61fb5bffaa24015e5d8744d9310a8adb935a4 not found: ID does not exist" Mar 20 14:33:32 crc kubenswrapper[4973]: I0320 14:33:32.067360 4973 scope.go:117] "RemoveContainer" containerID="18d938ef101b604362888b79452909dc5a5fddcce6bef273c85b8926f4b30759" Mar 20 14:33:32 crc kubenswrapper[4973]: E0320 
14:33:32.067667 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"18d938ef101b604362888b79452909dc5a5fddcce6bef273c85b8926f4b30759\": container with ID starting with 18d938ef101b604362888b79452909dc5a5fddcce6bef273c85b8926f4b30759 not found: ID does not exist" containerID="18d938ef101b604362888b79452909dc5a5fddcce6bef273c85b8926f4b30759" Mar 20 14:33:32 crc kubenswrapper[4973]: I0320 14:33:32.067692 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"18d938ef101b604362888b79452909dc5a5fddcce6bef273c85b8926f4b30759"} err="failed to get container status \"18d938ef101b604362888b79452909dc5a5fddcce6bef273c85b8926f4b30759\": rpc error: code = NotFound desc = could not find container \"18d938ef101b604362888b79452909dc5a5fddcce6bef273c85b8926f4b30759\": container with ID starting with 18d938ef101b604362888b79452909dc5a5fddcce6bef273c85b8926f4b30759 not found: ID does not exist" Mar 20 14:33:33 crc kubenswrapper[4973]: I0320 14:33:33.965050 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a404422-9438-4838-a3b3-5086fe6459e8" path="/var/lib/kubelet/pods/5a404422-9438-4838-a3b3-5086fe6459e8/volumes" Mar 20 14:34:00 crc kubenswrapper[4973]: I0320 14:34:00.163577 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566954-gw4dc"] Mar 20 14:34:00 crc kubenswrapper[4973]: E0320 14:34:00.164827 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a404422-9438-4838-a3b3-5086fe6459e8" containerName="extract-content" Mar 20 14:34:00 crc kubenswrapper[4973]: I0320 14:34:00.164845 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a404422-9438-4838-a3b3-5086fe6459e8" containerName="extract-content" Mar 20 14:34:00 crc kubenswrapper[4973]: E0320 14:34:00.164882 4973 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5a404422-9438-4838-a3b3-5086fe6459e8" containerName="extract-utilities" Mar 20 14:34:00 crc kubenswrapper[4973]: I0320 14:34:00.164890 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a404422-9438-4838-a3b3-5086fe6459e8" containerName="extract-utilities" Mar 20 14:34:00 crc kubenswrapper[4973]: E0320 14:34:00.164903 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a404422-9438-4838-a3b3-5086fe6459e8" containerName="registry-server" Mar 20 14:34:00 crc kubenswrapper[4973]: I0320 14:34:00.164909 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a404422-9438-4838-a3b3-5086fe6459e8" containerName="registry-server" Mar 20 14:34:00 crc kubenswrapper[4973]: I0320 14:34:00.165138 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a404422-9438-4838-a3b3-5086fe6459e8" containerName="registry-server" Mar 20 14:34:00 crc kubenswrapper[4973]: I0320 14:34:00.166163 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566954-gw4dc" Mar 20 14:34:00 crc kubenswrapper[4973]: I0320 14:34:00.168610 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:34:00 crc kubenswrapper[4973]: I0320 14:34:00.168841 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:34:00 crc kubenswrapper[4973]: I0320 14:34:00.168962 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:34:00 crc kubenswrapper[4973]: I0320 14:34:00.187370 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566954-gw4dc"] Mar 20 14:34:00 crc kubenswrapper[4973]: I0320 14:34:00.212696 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxbnc\" (UniqueName: 
\"kubernetes.io/projected/b27d1e85-9415-48df-b36b-d508f108cae3-kube-api-access-pxbnc\") pod \"auto-csr-approver-29566954-gw4dc\" (UID: \"b27d1e85-9415-48df-b36b-d508f108cae3\") " pod="openshift-infra/auto-csr-approver-29566954-gw4dc" Mar 20 14:34:00 crc kubenswrapper[4973]: I0320 14:34:00.313891 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxbnc\" (UniqueName: \"kubernetes.io/projected/b27d1e85-9415-48df-b36b-d508f108cae3-kube-api-access-pxbnc\") pod \"auto-csr-approver-29566954-gw4dc\" (UID: \"b27d1e85-9415-48df-b36b-d508f108cae3\") " pod="openshift-infra/auto-csr-approver-29566954-gw4dc" Mar 20 14:34:00 crc kubenswrapper[4973]: I0320 14:34:00.345366 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxbnc\" (UniqueName: \"kubernetes.io/projected/b27d1e85-9415-48df-b36b-d508f108cae3-kube-api-access-pxbnc\") pod \"auto-csr-approver-29566954-gw4dc\" (UID: \"b27d1e85-9415-48df-b36b-d508f108cae3\") " pod="openshift-infra/auto-csr-approver-29566954-gw4dc" Mar 20 14:34:00 crc kubenswrapper[4973]: I0320 14:34:00.495488 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566954-gw4dc" Mar 20 14:34:01 crc kubenswrapper[4973]: I0320 14:34:01.187102 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566954-gw4dc"] Mar 20 14:34:02 crc kubenswrapper[4973]: I0320 14:34:02.231083 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566954-gw4dc" event={"ID":"b27d1e85-9415-48df-b36b-d508f108cae3","Type":"ContainerStarted","Data":"f226e26fb09b5bf829b212e7a205042bfebf94a896ff9e61b742a8c8c9193b55"} Mar 20 14:34:03 crc kubenswrapper[4973]: I0320 14:34:03.285718 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566954-gw4dc" event={"ID":"b27d1e85-9415-48df-b36b-d508f108cae3","Type":"ContainerStarted","Data":"04cbc45e5781779a7aab007341507ac872ecd83d2a8ca07a56c43941e229d75c"} Mar 20 14:34:03 crc kubenswrapper[4973]: I0320 14:34:03.347182 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566954-gw4dc" podStartSLOduration=2.421671181 podStartE2EDuration="3.347158985s" podCreationTimestamp="2026-03-20 14:34:00 +0000 UTC" firstStartedPulling="2026-03-20 14:34:01.691917825 +0000 UTC m=+4362.435587559" lastFinishedPulling="2026-03-20 14:34:02.617405619 +0000 UTC m=+4363.361075363" observedRunningTime="2026-03-20 14:34:03.31246406 +0000 UTC m=+4364.056133814" watchObservedRunningTime="2026-03-20 14:34:03.347158985 +0000 UTC m=+4364.090828729" Mar 20 14:34:04 crc kubenswrapper[4973]: I0320 14:34:04.308411 4973 generic.go:334] "Generic (PLEG): container finished" podID="b27d1e85-9415-48df-b36b-d508f108cae3" containerID="04cbc45e5781779a7aab007341507ac872ecd83d2a8ca07a56c43941e229d75c" exitCode=0 Mar 20 14:34:04 crc kubenswrapper[4973]: I0320 14:34:04.308749 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566954-gw4dc" 
event={"ID":"b27d1e85-9415-48df-b36b-d508f108cae3","Type":"ContainerDied","Data":"04cbc45e5781779a7aab007341507ac872ecd83d2a8ca07a56c43941e229d75c"} Mar 20 14:34:05 crc kubenswrapper[4973]: I0320 14:34:05.925070 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566954-gw4dc" Mar 20 14:34:06 crc kubenswrapper[4973]: I0320 14:34:06.108643 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxbnc\" (UniqueName: \"kubernetes.io/projected/b27d1e85-9415-48df-b36b-d508f108cae3-kube-api-access-pxbnc\") pod \"b27d1e85-9415-48df-b36b-d508f108cae3\" (UID: \"b27d1e85-9415-48df-b36b-d508f108cae3\") " Mar 20 14:34:06 crc kubenswrapper[4973]: I0320 14:34:06.115828 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b27d1e85-9415-48df-b36b-d508f108cae3-kube-api-access-pxbnc" (OuterVolumeSpecName: "kube-api-access-pxbnc") pod "b27d1e85-9415-48df-b36b-d508f108cae3" (UID: "b27d1e85-9415-48df-b36b-d508f108cae3"). InnerVolumeSpecName "kube-api-access-pxbnc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:34:06 crc kubenswrapper[4973]: I0320 14:34:06.212055 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxbnc\" (UniqueName: \"kubernetes.io/projected/b27d1e85-9415-48df-b36b-d508f108cae3-kube-api-access-pxbnc\") on node \"crc\" DevicePath \"\"" Mar 20 14:34:06 crc kubenswrapper[4973]: I0320 14:34:06.328299 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566954-gw4dc" event={"ID":"b27d1e85-9415-48df-b36b-d508f108cae3","Type":"ContainerDied","Data":"f226e26fb09b5bf829b212e7a205042bfebf94a896ff9e61b742a8c8c9193b55"} Mar 20 14:34:06 crc kubenswrapper[4973]: I0320 14:34:06.328771 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f226e26fb09b5bf829b212e7a205042bfebf94a896ff9e61b742a8c8c9193b55" Mar 20 14:34:06 crc kubenswrapper[4973]: I0320 14:34:06.328429 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566954-gw4dc" Mar 20 14:34:06 crc kubenswrapper[4973]: I0320 14:34:06.394991 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566948-8czgl"] Mar 20 14:34:06 crc kubenswrapper[4973]: I0320 14:34:06.410182 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566948-8czgl"] Mar 20 14:34:07 crc kubenswrapper[4973]: I0320 14:34:07.965396 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41055ada-1fe5-480c-8c07-fc6cad616c30" path="/var/lib/kubelet/pods/41055ada-1fe5-480c-8c07-fc6cad616c30/volumes" Mar 20 14:34:13 crc kubenswrapper[4973]: I0320 14:34:13.321186 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 14:34:13 crc kubenswrapper[4973]: I0320 14:34:13.322803 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:34:30 crc kubenswrapper[4973]: E0320 14:34:30.175396 4973 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.75:59536->38.102.83.75:38041: read tcp 38.102.83.75:59536->38.102.83.75:38041: read: connection reset by peer Mar 20 14:34:43 crc kubenswrapper[4973]: I0320 14:34:43.320790 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:34:43 crc kubenswrapper[4973]: I0320 14:34:43.321419 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:35:04 crc kubenswrapper[4973]: I0320 14:35:04.446118 4973 scope.go:117] "RemoveContainer" containerID="64a4ea6b8f1298026a1cfe7081303e78460c774d104ad21e6f026bb19e710c85" Mar 20 14:35:13 crc kubenswrapper[4973]: I0320 14:35:13.320312 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:35:13 crc 
kubenswrapper[4973]: I0320 14:35:13.321007 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:35:13 crc kubenswrapper[4973]: I0320 14:35:13.321054 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 14:35:13 crc kubenswrapper[4973]: I0320 14:35:13.322031 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:35:13 crc kubenswrapper[4973]: I0320 14:35:13.322098 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" containerID="cri-o://dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" gracePeriod=600 Mar 20 14:35:13 crc kubenswrapper[4973]: E0320 14:35:13.454307 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:35:14 crc kubenswrapper[4973]: I0320 14:35:14.073591 4973 generic.go:334] "Generic 
(PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" exitCode=0 Mar 20 14:35:14 crc kubenswrapper[4973]: I0320 14:35:14.073639 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749"} Mar 20 14:35:14 crc kubenswrapper[4973]: I0320 14:35:14.073684 4973 scope.go:117] "RemoveContainer" containerID="bfc4b31266e980584c244bdcd5c59ee40888c1197c32fc1c6fe5dc06b7ab0740" Mar 20 14:35:14 crc kubenswrapper[4973]: I0320 14:35:14.074417 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:35:14 crc kubenswrapper[4973]: E0320 14:35:14.074693 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:35:29 crc kubenswrapper[4973]: I0320 14:35:29.959752 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:35:29 crc kubenswrapper[4973]: E0320 14:35:29.960832 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" 
podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:35:42 crc kubenswrapper[4973]: I0320 14:35:42.952718 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:35:42 crc kubenswrapper[4973]: E0320 14:35:42.956393 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:35:57 crc kubenswrapper[4973]: I0320 14:35:57.950601 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:35:57 crc kubenswrapper[4973]: E0320 14:35:57.952566 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:36:00 crc kubenswrapper[4973]: I0320 14:36:00.149913 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566956-mcmxc"] Mar 20 14:36:00 crc kubenswrapper[4973]: E0320 14:36:00.151092 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b27d1e85-9415-48df-b36b-d508f108cae3" containerName="oc" Mar 20 14:36:00 crc kubenswrapper[4973]: I0320 14:36:00.151108 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="b27d1e85-9415-48df-b36b-d508f108cae3" containerName="oc" Mar 20 14:36:00 crc kubenswrapper[4973]: I0320 14:36:00.151442 4973 
memory_manager.go:354] "RemoveStaleState removing state" podUID="b27d1e85-9415-48df-b36b-d508f108cae3" containerName="oc" Mar 20 14:36:00 crc kubenswrapper[4973]: I0320 14:36:00.152380 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566956-mcmxc" Mar 20 14:36:00 crc kubenswrapper[4973]: I0320 14:36:00.160863 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:36:00 crc kubenswrapper[4973]: I0320 14:36:00.161111 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:36:00 crc kubenswrapper[4973]: I0320 14:36:00.161251 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:36:00 crc kubenswrapper[4973]: I0320 14:36:00.167904 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566956-mcmxc"] Mar 20 14:36:00 crc kubenswrapper[4973]: I0320 14:36:00.208540 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxwdf\" (UniqueName: \"kubernetes.io/projected/6ecae9f9-4539-4009-93e2-af52b0210fa6-kube-api-access-vxwdf\") pod \"auto-csr-approver-29566956-mcmxc\" (UID: \"6ecae9f9-4539-4009-93e2-af52b0210fa6\") " pod="openshift-infra/auto-csr-approver-29566956-mcmxc" Mar 20 14:36:00 crc kubenswrapper[4973]: I0320 14:36:00.311150 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxwdf\" (UniqueName: \"kubernetes.io/projected/6ecae9f9-4539-4009-93e2-af52b0210fa6-kube-api-access-vxwdf\") pod \"auto-csr-approver-29566956-mcmxc\" (UID: \"6ecae9f9-4539-4009-93e2-af52b0210fa6\") " pod="openshift-infra/auto-csr-approver-29566956-mcmxc" Mar 20 14:36:00 crc kubenswrapper[4973]: I0320 14:36:00.338565 4973 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-vxwdf\" (UniqueName: \"kubernetes.io/projected/6ecae9f9-4539-4009-93e2-af52b0210fa6-kube-api-access-vxwdf\") pod \"auto-csr-approver-29566956-mcmxc\" (UID: \"6ecae9f9-4539-4009-93e2-af52b0210fa6\") " pod="openshift-infra/auto-csr-approver-29566956-mcmxc" Mar 20 14:36:00 crc kubenswrapper[4973]: I0320 14:36:00.474232 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566956-mcmxc" Mar 20 14:36:00 crc kubenswrapper[4973]: I0320 14:36:00.941380 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566956-mcmxc"] Mar 20 14:36:00 crc kubenswrapper[4973]: I0320 14:36:00.942327 4973 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:36:01 crc kubenswrapper[4973]: I0320 14:36:01.577619 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566956-mcmxc" event={"ID":"6ecae9f9-4539-4009-93e2-af52b0210fa6","Type":"ContainerStarted","Data":"8dcc05bc29bce0efb80e664b8afc5915a499031705daedb0244982146d6deeae"} Mar 20 14:36:02 crc kubenswrapper[4973]: I0320 14:36:02.623244 4973 generic.go:334] "Generic (PLEG): container finished" podID="6ecae9f9-4539-4009-93e2-af52b0210fa6" containerID="21c35e7d0f2129039671361e2346d42976ea15f246bc448db4c46517d48b040f" exitCode=0 Mar 20 14:36:02 crc kubenswrapper[4973]: I0320 14:36:02.623312 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566956-mcmxc" event={"ID":"6ecae9f9-4539-4009-93e2-af52b0210fa6","Type":"ContainerDied","Data":"21c35e7d0f2129039671361e2346d42976ea15f246bc448db4c46517d48b040f"} Mar 20 14:36:04 crc kubenswrapper[4973]: I0320 14:36:04.142062 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566956-mcmxc" Mar 20 14:36:04 crc kubenswrapper[4973]: I0320 14:36:04.230804 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vxwdf\" (UniqueName: \"kubernetes.io/projected/6ecae9f9-4539-4009-93e2-af52b0210fa6-kube-api-access-vxwdf\") pod \"6ecae9f9-4539-4009-93e2-af52b0210fa6\" (UID: \"6ecae9f9-4539-4009-93e2-af52b0210fa6\") " Mar 20 14:36:04 crc kubenswrapper[4973]: I0320 14:36:04.240046 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ecae9f9-4539-4009-93e2-af52b0210fa6-kube-api-access-vxwdf" (OuterVolumeSpecName: "kube-api-access-vxwdf") pod "6ecae9f9-4539-4009-93e2-af52b0210fa6" (UID: "6ecae9f9-4539-4009-93e2-af52b0210fa6"). InnerVolumeSpecName "kube-api-access-vxwdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:36:04 crc kubenswrapper[4973]: I0320 14:36:04.333541 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vxwdf\" (UniqueName: \"kubernetes.io/projected/6ecae9f9-4539-4009-93e2-af52b0210fa6-kube-api-access-vxwdf\") on node \"crc\" DevicePath \"\"" Mar 20 14:36:04 crc kubenswrapper[4973]: I0320 14:36:04.664394 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566956-mcmxc" event={"ID":"6ecae9f9-4539-4009-93e2-af52b0210fa6","Type":"ContainerDied","Data":"8dcc05bc29bce0efb80e664b8afc5915a499031705daedb0244982146d6deeae"} Mar 20 14:36:04 crc kubenswrapper[4973]: I0320 14:36:04.664456 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8dcc05bc29bce0efb80e664b8afc5915a499031705daedb0244982146d6deeae" Mar 20 14:36:04 crc kubenswrapper[4973]: I0320 14:36:04.664520 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566956-mcmxc" Mar 20 14:36:05 crc kubenswrapper[4973]: I0320 14:36:05.216001 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566950-j7xhj"] Mar 20 14:36:05 crc kubenswrapper[4973]: I0320 14:36:05.231479 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566950-j7xhj"] Mar 20 14:36:05 crc kubenswrapper[4973]: I0320 14:36:05.965197 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d513dbd9-be74-4c62-a87a-45071ef62cef" path="/var/lib/kubelet/pods/d513dbd9-be74-4c62-a87a-45071ef62cef/volumes" Mar 20 14:36:11 crc kubenswrapper[4973]: I0320 14:36:11.951406 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:36:11 crc kubenswrapper[4973]: E0320 14:36:11.952851 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:36:22 crc kubenswrapper[4973]: I0320 14:36:22.951333 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:36:22 crc kubenswrapper[4973]: E0320 14:36:22.952216 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" 
podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:36:33 crc kubenswrapper[4973]: I0320 14:36:33.952864 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:36:33 crc kubenswrapper[4973]: E0320 14:36:33.953565 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:36:44 crc kubenswrapper[4973]: I0320 14:36:44.950408 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:36:44 crc kubenswrapper[4973]: E0320 14:36:44.951244 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:36:55 crc kubenswrapper[4973]: I0320 14:36:55.950777 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:36:55 crc kubenswrapper[4973]: E0320 14:36:55.951672 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:37:04 crc kubenswrapper[4973]: I0320 14:37:04.556798 4973 scope.go:117] "RemoveContainer" containerID="5a1e95cc1cd2438b61622880ea16619085a8941f44d2860a8dcdd42e2632764a" Mar 20 14:37:06 crc kubenswrapper[4973]: I0320 14:37:06.951126 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:37:06 crc kubenswrapper[4973]: E0320 14:37:06.952022 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:37:18 crc kubenswrapper[4973]: I0320 14:37:18.951265 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:37:18 crc kubenswrapper[4973]: E0320 14:37:18.952189 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:37:33 crc kubenswrapper[4973]: I0320 14:37:33.950787 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:37:33 crc kubenswrapper[4973]: E0320 14:37:33.951689 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:37:44 crc kubenswrapper[4973]: I0320 14:37:44.950551 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:37:44 crc kubenswrapper[4973]: E0320 14:37:44.951360 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:37:54 crc kubenswrapper[4973]: I0320 14:37:54.006824 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6sz9b"] Mar 20 14:37:54 crc kubenswrapper[4973]: E0320 14:37:54.008420 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ecae9f9-4539-4009-93e2-af52b0210fa6" containerName="oc" Mar 20 14:37:54 crc kubenswrapper[4973]: I0320 14:37:54.008444 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ecae9f9-4539-4009-93e2-af52b0210fa6" containerName="oc" Mar 20 14:37:54 crc kubenswrapper[4973]: I0320 14:37:54.008758 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ecae9f9-4539-4009-93e2-af52b0210fa6" containerName="oc" Mar 20 14:37:54 crc kubenswrapper[4973]: I0320 14:37:54.010856 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6sz9b" Mar 20 14:37:54 crc kubenswrapper[4973]: I0320 14:37:54.026041 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6sz9b"] Mar 20 14:37:54 crc kubenswrapper[4973]: I0320 14:37:54.100915 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/492b0815-decc-47d8-9b40-56ce1413005a-utilities\") pod \"certified-operators-6sz9b\" (UID: \"492b0815-decc-47d8-9b40-56ce1413005a\") " pod="openshift-marketplace/certified-operators-6sz9b" Mar 20 14:37:54 crc kubenswrapper[4973]: I0320 14:37:54.101688 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwhpj\" (UniqueName: \"kubernetes.io/projected/492b0815-decc-47d8-9b40-56ce1413005a-kube-api-access-xwhpj\") pod \"certified-operators-6sz9b\" (UID: \"492b0815-decc-47d8-9b40-56ce1413005a\") " pod="openshift-marketplace/certified-operators-6sz9b" Mar 20 14:37:54 crc kubenswrapper[4973]: I0320 14:37:54.101795 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/492b0815-decc-47d8-9b40-56ce1413005a-catalog-content\") pod \"certified-operators-6sz9b\" (UID: \"492b0815-decc-47d8-9b40-56ce1413005a\") " pod="openshift-marketplace/certified-operators-6sz9b" Mar 20 14:37:54 crc kubenswrapper[4973]: I0320 14:37:54.203648 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwhpj\" (UniqueName: \"kubernetes.io/projected/492b0815-decc-47d8-9b40-56ce1413005a-kube-api-access-xwhpj\") pod \"certified-operators-6sz9b\" (UID: \"492b0815-decc-47d8-9b40-56ce1413005a\") " pod="openshift-marketplace/certified-operators-6sz9b" Mar 20 14:37:54 crc kubenswrapper[4973]: I0320 14:37:54.203770 4973 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/492b0815-decc-47d8-9b40-56ce1413005a-catalog-content\") pod \"certified-operators-6sz9b\" (UID: \"492b0815-decc-47d8-9b40-56ce1413005a\") " pod="openshift-marketplace/certified-operators-6sz9b" Mar 20 14:37:54 crc kubenswrapper[4973]: I0320 14:37:54.203983 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/492b0815-decc-47d8-9b40-56ce1413005a-utilities\") pod \"certified-operators-6sz9b\" (UID: \"492b0815-decc-47d8-9b40-56ce1413005a\") " pod="openshift-marketplace/certified-operators-6sz9b" Mar 20 14:37:54 crc kubenswrapper[4973]: I0320 14:37:54.204405 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/492b0815-decc-47d8-9b40-56ce1413005a-catalog-content\") pod \"certified-operators-6sz9b\" (UID: \"492b0815-decc-47d8-9b40-56ce1413005a\") " pod="openshift-marketplace/certified-operators-6sz9b" Mar 20 14:37:54 crc kubenswrapper[4973]: I0320 14:37:54.204449 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/492b0815-decc-47d8-9b40-56ce1413005a-utilities\") pod \"certified-operators-6sz9b\" (UID: \"492b0815-decc-47d8-9b40-56ce1413005a\") " pod="openshift-marketplace/certified-operators-6sz9b" Mar 20 14:37:54 crc kubenswrapper[4973]: I0320 14:37:54.225127 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwhpj\" (UniqueName: \"kubernetes.io/projected/492b0815-decc-47d8-9b40-56ce1413005a-kube-api-access-xwhpj\") pod \"certified-operators-6sz9b\" (UID: \"492b0815-decc-47d8-9b40-56ce1413005a\") " pod="openshift-marketplace/certified-operators-6sz9b" Mar 20 14:37:54 crc kubenswrapper[4973]: I0320 14:37:54.338815 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6sz9b" Mar 20 14:37:54 crc kubenswrapper[4973]: I0320 14:37:54.909249 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6sz9b"] Mar 20 14:37:55 crc kubenswrapper[4973]: I0320 14:37:55.783120 4973 generic.go:334] "Generic (PLEG): container finished" podID="492b0815-decc-47d8-9b40-56ce1413005a" containerID="0216ee33b5467811b7ffd19cdedb8be2c24c80cb3445ff485835ed408a61899d" exitCode=0 Mar 20 14:37:55 crc kubenswrapper[4973]: I0320 14:37:55.783579 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6sz9b" event={"ID":"492b0815-decc-47d8-9b40-56ce1413005a","Type":"ContainerDied","Data":"0216ee33b5467811b7ffd19cdedb8be2c24c80cb3445ff485835ed408a61899d"} Mar 20 14:37:55 crc kubenswrapper[4973]: I0320 14:37:55.785173 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6sz9b" event={"ID":"492b0815-decc-47d8-9b40-56ce1413005a","Type":"ContainerStarted","Data":"af08f2478abe6b4c8c38bcc2bfbcc6a446d7efb028c322b4d1aafc2fbf136bd3"} Mar 20 14:37:57 crc kubenswrapper[4973]: I0320 14:37:57.808390 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6sz9b" event={"ID":"492b0815-decc-47d8-9b40-56ce1413005a","Type":"ContainerStarted","Data":"6775e0ad7d150cb14570e982a132524f20dd17a947ae9752bdeed940f3a7a308"} Mar 20 14:37:58 crc kubenswrapper[4973]: I0320 14:37:58.820997 4973 generic.go:334] "Generic (PLEG): container finished" podID="492b0815-decc-47d8-9b40-56ce1413005a" containerID="6775e0ad7d150cb14570e982a132524f20dd17a947ae9752bdeed940f3a7a308" exitCode=0 Mar 20 14:37:58 crc kubenswrapper[4973]: I0320 14:37:58.821037 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6sz9b" 
event={"ID":"492b0815-decc-47d8-9b40-56ce1413005a","Type":"ContainerDied","Data":"6775e0ad7d150cb14570e982a132524f20dd17a947ae9752bdeed940f3a7a308"} Mar 20 14:37:59 crc kubenswrapper[4973]: I0320 14:37:59.833076 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6sz9b" event={"ID":"492b0815-decc-47d8-9b40-56ce1413005a","Type":"ContainerStarted","Data":"b8892a73892597c67767944c0990a0b2bf4c9c161db34cb1a1a4d3dc9e7ba1c7"} Mar 20 14:37:59 crc kubenswrapper[4973]: I0320 14:37:59.854055 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6sz9b" podStartSLOduration=3.385516998 podStartE2EDuration="6.854037219s" podCreationTimestamp="2026-03-20 14:37:53 +0000 UTC" firstStartedPulling="2026-03-20 14:37:55.790689635 +0000 UTC m=+4596.534359379" lastFinishedPulling="2026-03-20 14:37:59.259209856 +0000 UTC m=+4600.002879600" observedRunningTime="2026-03-20 14:37:59.849723281 +0000 UTC m=+4600.593393045" watchObservedRunningTime="2026-03-20 14:37:59.854037219 +0000 UTC m=+4600.597706963" Mar 20 14:37:59 crc kubenswrapper[4973]: I0320 14:37:59.960939 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:37:59 crc kubenswrapper[4973]: E0320 14:37:59.961234 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:38:00 crc kubenswrapper[4973]: I0320 14:38:00.144281 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566958-pszgn"] Mar 20 14:38:00 crc kubenswrapper[4973]: I0320 
14:38:00.146967 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566958-pszgn" Mar 20 14:38:00 crc kubenswrapper[4973]: I0320 14:38:00.149713 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:38:00 crc kubenswrapper[4973]: I0320 14:38:00.149799 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:38:00 crc kubenswrapper[4973]: I0320 14:38:00.150411 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:38:00 crc kubenswrapper[4973]: I0320 14:38:00.156808 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566958-pszgn"] Mar 20 14:38:00 crc kubenswrapper[4973]: I0320 14:38:00.254889 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8j78\" (UniqueName: \"kubernetes.io/projected/d1591f03-5f44-4c7e-8e66-5c33919112f5-kube-api-access-r8j78\") pod \"auto-csr-approver-29566958-pszgn\" (UID: \"d1591f03-5f44-4c7e-8e66-5c33919112f5\") " pod="openshift-infra/auto-csr-approver-29566958-pszgn" Mar 20 14:38:00 crc kubenswrapper[4973]: I0320 14:38:00.356999 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8j78\" (UniqueName: \"kubernetes.io/projected/d1591f03-5f44-4c7e-8e66-5c33919112f5-kube-api-access-r8j78\") pod \"auto-csr-approver-29566958-pszgn\" (UID: \"d1591f03-5f44-4c7e-8e66-5c33919112f5\") " pod="openshift-infra/auto-csr-approver-29566958-pszgn" Mar 20 14:38:00 crc kubenswrapper[4973]: I0320 14:38:00.376767 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8j78\" (UniqueName: \"kubernetes.io/projected/d1591f03-5f44-4c7e-8e66-5c33919112f5-kube-api-access-r8j78\") pod 
\"auto-csr-approver-29566958-pszgn\" (UID: \"d1591f03-5f44-4c7e-8e66-5c33919112f5\") " pod="openshift-infra/auto-csr-approver-29566958-pszgn" Mar 20 14:38:00 crc kubenswrapper[4973]: I0320 14:38:00.470771 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566958-pszgn" Mar 20 14:38:01 crc kubenswrapper[4973]: I0320 14:38:01.001625 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566958-pszgn"] Mar 20 14:38:01 crc kubenswrapper[4973]: I0320 14:38:01.860609 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566958-pszgn" event={"ID":"d1591f03-5f44-4c7e-8e66-5c33919112f5","Type":"ContainerStarted","Data":"460bbda73c42accc32c54474ff4f91d3a160d9737cd8db54f857783bc708e728"} Mar 20 14:38:02 crc kubenswrapper[4973]: I0320 14:38:02.907907 4973 generic.go:334] "Generic (PLEG): container finished" podID="d1591f03-5f44-4c7e-8e66-5c33919112f5" containerID="117efc89b5f0f30b1ac189640ce7b44a1fe082c1847c491a3eafd4dae9876787" exitCode=0 Mar 20 14:38:02 crc kubenswrapper[4973]: I0320 14:38:02.908499 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566958-pszgn" event={"ID":"d1591f03-5f44-4c7e-8e66-5c33919112f5","Type":"ContainerDied","Data":"117efc89b5f0f30b1ac189640ce7b44a1fe082c1847c491a3eafd4dae9876787"} Mar 20 14:38:04 crc kubenswrapper[4973]: I0320 14:38:04.339162 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6sz9b" Mar 20 14:38:04 crc kubenswrapper[4973]: I0320 14:38:04.339603 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6sz9b" Mar 20 14:38:04 crc kubenswrapper[4973]: I0320 14:38:04.378626 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566958-pszgn" Mar 20 14:38:04 crc kubenswrapper[4973]: I0320 14:38:04.394316 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6sz9b" Mar 20 14:38:04 crc kubenswrapper[4973]: I0320 14:38:04.472978 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8j78\" (UniqueName: \"kubernetes.io/projected/d1591f03-5f44-4c7e-8e66-5c33919112f5-kube-api-access-r8j78\") pod \"d1591f03-5f44-4c7e-8e66-5c33919112f5\" (UID: \"d1591f03-5f44-4c7e-8e66-5c33919112f5\") " Mar 20 14:38:04 crc kubenswrapper[4973]: I0320 14:38:04.480701 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1591f03-5f44-4c7e-8e66-5c33919112f5-kube-api-access-r8j78" (OuterVolumeSpecName: "kube-api-access-r8j78") pod "d1591f03-5f44-4c7e-8e66-5c33919112f5" (UID: "d1591f03-5f44-4c7e-8e66-5c33919112f5"). InnerVolumeSpecName "kube-api-access-r8j78". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:38:04 crc kubenswrapper[4973]: I0320 14:38:04.577490 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8j78\" (UniqueName: \"kubernetes.io/projected/d1591f03-5f44-4c7e-8e66-5c33919112f5-kube-api-access-r8j78\") on node \"crc\" DevicePath \"\"" Mar 20 14:38:04 crc kubenswrapper[4973]: I0320 14:38:04.930397 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566958-pszgn" event={"ID":"d1591f03-5f44-4c7e-8e66-5c33919112f5","Type":"ContainerDied","Data":"460bbda73c42accc32c54474ff4f91d3a160d9737cd8db54f857783bc708e728"} Mar 20 14:38:04 crc kubenswrapper[4973]: I0320 14:38:04.930429 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566958-pszgn" Mar 20 14:38:04 crc kubenswrapper[4973]: I0320 14:38:04.930455 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="460bbda73c42accc32c54474ff4f91d3a160d9737cd8db54f857783bc708e728" Mar 20 14:38:04 crc kubenswrapper[4973]: I0320 14:38:04.985567 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6sz9b" Mar 20 14:38:05 crc kubenswrapper[4973]: I0320 14:38:05.041682 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6sz9b"] Mar 20 14:38:05 crc kubenswrapper[4973]: I0320 14:38:05.464470 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566952-v74sn"] Mar 20 14:38:05 crc kubenswrapper[4973]: I0320 14:38:05.474234 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566952-v74sn"] Mar 20 14:38:05 crc kubenswrapper[4973]: I0320 14:38:05.964373 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ffe8cf8f-857d-49e8-ac4f-d827991dadc7" path="/var/lib/kubelet/pods/ffe8cf8f-857d-49e8-ac4f-d827991dadc7/volumes" Mar 20 14:38:06 crc kubenswrapper[4973]: I0320 14:38:06.950539 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6sz9b" podUID="492b0815-decc-47d8-9b40-56ce1413005a" containerName="registry-server" containerID="cri-o://b8892a73892597c67767944c0990a0b2bf4c9c161db34cb1a1a4d3dc9e7ba1c7" gracePeriod=2 Mar 20 14:38:07 crc kubenswrapper[4973]: I0320 14:38:07.472388 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6sz9b" Mar 20 14:38:07 crc kubenswrapper[4973]: I0320 14:38:07.550610 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/492b0815-decc-47d8-9b40-56ce1413005a-catalog-content\") pod \"492b0815-decc-47d8-9b40-56ce1413005a\" (UID: \"492b0815-decc-47d8-9b40-56ce1413005a\") " Mar 20 14:38:07 crc kubenswrapper[4973]: I0320 14:38:07.550807 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwhpj\" (UniqueName: \"kubernetes.io/projected/492b0815-decc-47d8-9b40-56ce1413005a-kube-api-access-xwhpj\") pod \"492b0815-decc-47d8-9b40-56ce1413005a\" (UID: \"492b0815-decc-47d8-9b40-56ce1413005a\") " Mar 20 14:38:07 crc kubenswrapper[4973]: I0320 14:38:07.550845 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/492b0815-decc-47d8-9b40-56ce1413005a-utilities\") pod \"492b0815-decc-47d8-9b40-56ce1413005a\" (UID: \"492b0815-decc-47d8-9b40-56ce1413005a\") " Mar 20 14:38:07 crc kubenswrapper[4973]: I0320 14:38:07.551795 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/492b0815-decc-47d8-9b40-56ce1413005a-utilities" (OuterVolumeSpecName: "utilities") pod "492b0815-decc-47d8-9b40-56ce1413005a" (UID: "492b0815-decc-47d8-9b40-56ce1413005a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:38:07 crc kubenswrapper[4973]: I0320 14:38:07.556820 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/492b0815-decc-47d8-9b40-56ce1413005a-kube-api-access-xwhpj" (OuterVolumeSpecName: "kube-api-access-xwhpj") pod "492b0815-decc-47d8-9b40-56ce1413005a" (UID: "492b0815-decc-47d8-9b40-56ce1413005a"). InnerVolumeSpecName "kube-api-access-xwhpj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:38:07 crc kubenswrapper[4973]: I0320 14:38:07.618808 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/492b0815-decc-47d8-9b40-56ce1413005a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "492b0815-decc-47d8-9b40-56ce1413005a" (UID: "492b0815-decc-47d8-9b40-56ce1413005a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:38:07 crc kubenswrapper[4973]: I0320 14:38:07.654854 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/492b0815-decc-47d8-9b40-56ce1413005a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:38:07 crc kubenswrapper[4973]: I0320 14:38:07.654904 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwhpj\" (UniqueName: \"kubernetes.io/projected/492b0815-decc-47d8-9b40-56ce1413005a-kube-api-access-xwhpj\") on node \"crc\" DevicePath \"\"" Mar 20 14:38:07 crc kubenswrapper[4973]: I0320 14:38:07.654920 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/492b0815-decc-47d8-9b40-56ce1413005a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:38:07 crc kubenswrapper[4973]: I0320 14:38:07.961975 4973 generic.go:334] "Generic (PLEG): container finished" podID="492b0815-decc-47d8-9b40-56ce1413005a" containerID="b8892a73892597c67767944c0990a0b2bf4c9c161db34cb1a1a4d3dc9e7ba1c7" exitCode=0 Mar 20 14:38:07 crc kubenswrapper[4973]: I0320 14:38:07.962175 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6sz9b" Mar 20 14:38:07 crc kubenswrapper[4973]: I0320 14:38:07.966708 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6sz9b" event={"ID":"492b0815-decc-47d8-9b40-56ce1413005a","Type":"ContainerDied","Data":"b8892a73892597c67767944c0990a0b2bf4c9c161db34cb1a1a4d3dc9e7ba1c7"} Mar 20 14:38:07 crc kubenswrapper[4973]: I0320 14:38:07.966765 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6sz9b" event={"ID":"492b0815-decc-47d8-9b40-56ce1413005a","Type":"ContainerDied","Data":"af08f2478abe6b4c8c38bcc2bfbcc6a446d7efb028c322b4d1aafc2fbf136bd3"} Mar 20 14:38:07 crc kubenswrapper[4973]: I0320 14:38:07.966784 4973 scope.go:117] "RemoveContainer" containerID="b8892a73892597c67767944c0990a0b2bf4c9c161db34cb1a1a4d3dc9e7ba1c7" Mar 20 14:38:07 crc kubenswrapper[4973]: I0320 14:38:07.993997 4973 scope.go:117] "RemoveContainer" containerID="6775e0ad7d150cb14570e982a132524f20dd17a947ae9752bdeed940f3a7a308" Mar 20 14:38:08 crc kubenswrapper[4973]: I0320 14:38:08.010908 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6sz9b"] Mar 20 14:38:08 crc kubenswrapper[4973]: I0320 14:38:08.022182 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6sz9b"] Mar 20 14:38:08 crc kubenswrapper[4973]: I0320 14:38:08.411291 4973 scope.go:117] "RemoveContainer" containerID="0216ee33b5467811b7ffd19cdedb8be2c24c80cb3445ff485835ed408a61899d" Mar 20 14:38:08 crc kubenswrapper[4973]: I0320 14:38:08.457852 4973 scope.go:117] "RemoveContainer" containerID="b8892a73892597c67767944c0990a0b2bf4c9c161db34cb1a1a4d3dc9e7ba1c7" Mar 20 14:38:08 crc kubenswrapper[4973]: E0320 14:38:08.458392 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b8892a73892597c67767944c0990a0b2bf4c9c161db34cb1a1a4d3dc9e7ba1c7\": container with ID starting with b8892a73892597c67767944c0990a0b2bf4c9c161db34cb1a1a4d3dc9e7ba1c7 not found: ID does not exist" containerID="b8892a73892597c67767944c0990a0b2bf4c9c161db34cb1a1a4d3dc9e7ba1c7" Mar 20 14:38:08 crc kubenswrapper[4973]: I0320 14:38:08.458446 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8892a73892597c67767944c0990a0b2bf4c9c161db34cb1a1a4d3dc9e7ba1c7"} err="failed to get container status \"b8892a73892597c67767944c0990a0b2bf4c9c161db34cb1a1a4d3dc9e7ba1c7\": rpc error: code = NotFound desc = could not find container \"b8892a73892597c67767944c0990a0b2bf4c9c161db34cb1a1a4d3dc9e7ba1c7\": container with ID starting with b8892a73892597c67767944c0990a0b2bf4c9c161db34cb1a1a4d3dc9e7ba1c7 not found: ID does not exist" Mar 20 14:38:08 crc kubenswrapper[4973]: I0320 14:38:08.458480 4973 scope.go:117] "RemoveContainer" containerID="6775e0ad7d150cb14570e982a132524f20dd17a947ae9752bdeed940f3a7a308" Mar 20 14:38:08 crc kubenswrapper[4973]: E0320 14:38:08.459130 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6775e0ad7d150cb14570e982a132524f20dd17a947ae9752bdeed940f3a7a308\": container with ID starting with 6775e0ad7d150cb14570e982a132524f20dd17a947ae9752bdeed940f3a7a308 not found: ID does not exist" containerID="6775e0ad7d150cb14570e982a132524f20dd17a947ae9752bdeed940f3a7a308" Mar 20 14:38:08 crc kubenswrapper[4973]: I0320 14:38:08.459160 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6775e0ad7d150cb14570e982a132524f20dd17a947ae9752bdeed940f3a7a308"} err="failed to get container status \"6775e0ad7d150cb14570e982a132524f20dd17a947ae9752bdeed940f3a7a308\": rpc error: code = NotFound desc = could not find container \"6775e0ad7d150cb14570e982a132524f20dd17a947ae9752bdeed940f3a7a308\": container with ID 
starting with 6775e0ad7d150cb14570e982a132524f20dd17a947ae9752bdeed940f3a7a308 not found: ID does not exist" Mar 20 14:38:08 crc kubenswrapper[4973]: I0320 14:38:08.459173 4973 scope.go:117] "RemoveContainer" containerID="0216ee33b5467811b7ffd19cdedb8be2c24c80cb3445ff485835ed408a61899d" Mar 20 14:38:08 crc kubenswrapper[4973]: E0320 14:38:08.459665 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0216ee33b5467811b7ffd19cdedb8be2c24c80cb3445ff485835ed408a61899d\": container with ID starting with 0216ee33b5467811b7ffd19cdedb8be2c24c80cb3445ff485835ed408a61899d not found: ID does not exist" containerID="0216ee33b5467811b7ffd19cdedb8be2c24c80cb3445ff485835ed408a61899d" Mar 20 14:38:08 crc kubenswrapper[4973]: I0320 14:38:08.459701 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0216ee33b5467811b7ffd19cdedb8be2c24c80cb3445ff485835ed408a61899d"} err="failed to get container status \"0216ee33b5467811b7ffd19cdedb8be2c24c80cb3445ff485835ed408a61899d\": rpc error: code = NotFound desc = could not find container \"0216ee33b5467811b7ffd19cdedb8be2c24c80cb3445ff485835ed408a61899d\": container with ID starting with 0216ee33b5467811b7ffd19cdedb8be2c24c80cb3445ff485835ed408a61899d not found: ID does not exist" Mar 20 14:38:09 crc kubenswrapper[4973]: I0320 14:38:09.963650 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="492b0815-decc-47d8-9b40-56ce1413005a" path="/var/lib/kubelet/pods/492b0815-decc-47d8-9b40-56ce1413005a/volumes" Mar 20 14:38:11 crc kubenswrapper[4973]: I0320 14:38:11.951060 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:38:11 crc kubenswrapper[4973]: E0320 14:38:11.951908 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:38:22 crc kubenswrapper[4973]: I0320 14:38:22.951114 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:38:22 crc kubenswrapper[4973]: E0320 14:38:22.954690 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:38:36 crc kubenswrapper[4973]: I0320 14:38:36.951716 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:38:36 crc kubenswrapper[4973]: E0320 14:38:36.952605 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:38:47 crc kubenswrapper[4973]: I0320 14:38:47.951673 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:38:47 crc kubenswrapper[4973]: E0320 14:38:47.952598 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:39:01 crc kubenswrapper[4973]: I0320 14:39:01.950945 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:39:01 crc kubenswrapper[4973]: E0320 14:39:01.951900 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:39:04 crc kubenswrapper[4973]: I0320 14:39:04.672093 4973 scope.go:117] "RemoveContainer" containerID="0976927a13e07eaff8118bb7c410d30f774b887a279fa9ef040fa689eeb4e94e" Mar 20 14:39:13 crc kubenswrapper[4973]: I0320 14:39:13.950692 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:39:13 crc kubenswrapper[4973]: E0320 14:39:13.951522 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:39:27 crc kubenswrapper[4973]: I0320 14:39:27.951625 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 
14:39:27 crc kubenswrapper[4973]: E0320 14:39:27.953683 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.463577 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 14:39:30 crc kubenswrapper[4973]: E0320 14:39:30.465751 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="492b0815-decc-47d8-9b40-56ce1413005a" containerName="registry-server" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.465770 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="492b0815-decc-47d8-9b40-56ce1413005a" containerName="registry-server" Mar 20 14:39:30 crc kubenswrapper[4973]: E0320 14:39:30.465874 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="492b0815-decc-47d8-9b40-56ce1413005a" containerName="extract-content" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.465883 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="492b0815-decc-47d8-9b40-56ce1413005a" containerName="extract-content" Mar 20 14:39:30 crc kubenswrapper[4973]: E0320 14:39:30.465892 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1591f03-5f44-4c7e-8e66-5c33919112f5" containerName="oc" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.465899 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1591f03-5f44-4c7e-8e66-5c33919112f5" containerName="oc" Mar 20 14:39:30 crc kubenswrapper[4973]: E0320 14:39:30.465909 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="492b0815-decc-47d8-9b40-56ce1413005a" 
containerName="extract-utilities" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.465914 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="492b0815-decc-47d8-9b40-56ce1413005a" containerName="extract-utilities" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.466180 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="492b0815-decc-47d8-9b40-56ce1413005a" containerName="registry-server" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.466205 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1591f03-5f44-4c7e-8e66-5c33919112f5" containerName="oc" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.467333 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.469801 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.469842 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.470178 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dfv6w" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.470188 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.480913 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.556170 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5edf151-174a-4c18-b733-318653db1c6e-ssh-key\") pod \"tempest-tests-tempest\" (UID: 
\"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.556219 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b5edf151-174a-4c18-b733-318653db1c6e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.556269 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b5edf151-174a-4c18-b733-318653db1c6e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.556289 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcqr9\" (UniqueName: \"kubernetes.io/projected/b5edf151-174a-4c18-b733-318653db1c6e-kube-api-access-bcqr9\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.556384 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b5edf151-174a-4c18-b733-318653db1c6e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.556588 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b5edf151-174a-4c18-b733-318653db1c6e-openstack-config-secret\") pod 
\"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.556615 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.556719 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b5edf151-174a-4c18-b733-318653db1c6e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.556768 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5edf151-174a-4c18-b733-318653db1c6e-config-data\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.660566 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b5edf151-174a-4c18-b733-318653db1c6e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.660708 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b5edf151-174a-4c18-b733-318653db1c6e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: 
\"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.660734 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.660848 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b5edf151-174a-4c18-b733-318653db1c6e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.660928 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5edf151-174a-4c18-b733-318653db1c6e-config-data\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.661095 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5edf151-174a-4c18-b733-318653db1c6e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.661135 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b5edf151-174a-4c18-b733-318653db1c6e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 
crc kubenswrapper[4973]: I0320 14:39:30.661162 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b5edf151-174a-4c18-b733-318653db1c6e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.661186 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcqr9\" (UniqueName: \"kubernetes.io/projected/b5edf151-174a-4c18-b733-318653db1c6e-kube-api-access-bcqr9\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.661558 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b5edf151-174a-4c18-b733-318653db1c6e-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.662133 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b5edf151-174a-4c18-b733-318653db1c6e-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.662429 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b5edf151-174a-4c18-b733-318653db1c6e-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.662542 4973 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5edf151-174a-4c18-b733-318653db1c6e-config-data\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.662852 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.667470 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b5edf151-174a-4c18-b733-318653db1c6e-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.668166 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5edf151-174a-4c18-b733-318653db1c6e-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.675299 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b5edf151-174a-4c18-b733-318653db1c6e-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.679268 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcqr9\" (UniqueName: 
\"kubernetes.io/projected/b5edf151-174a-4c18-b733-318653db1c6e-kube-api-access-bcqr9\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.700295 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"tempest-tests-tempest\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") " pod="openstack/tempest-tests-tempest" Mar 20 14:39:30 crc kubenswrapper[4973]: I0320 14:39:30.797808 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 20 14:39:31 crc kubenswrapper[4973]: I0320 14:39:31.414042 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 20 14:39:31 crc kubenswrapper[4973]: I0320 14:39:31.713183 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b5edf151-174a-4c18-b733-318653db1c6e","Type":"ContainerStarted","Data":"776579fb9892ea1c70855e496750a6c3a5bbd050e2bb9a829bd74ed9dc0c74e8"} Mar 20 14:39:38 crc kubenswrapper[4973]: I0320 14:39:38.099176 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-cgmjd" podUID="75fc4720-ae9c-4ae5-8e4c-7c9a800f5478" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:39:38 crc kubenswrapper[4973]: I0320 14:39:38.950983 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:39:38 crc kubenswrapper[4973]: E0320 14:39:38.951494 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:39:50 crc kubenswrapper[4973]: I0320 14:39:50.951630 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:39:50 crc kubenswrapper[4973]: E0320 14:39:50.952556 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:40:00 crc kubenswrapper[4973]: I0320 14:40:00.146587 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566960-qqpz4"] Mar 20 14:40:00 crc kubenswrapper[4973]: I0320 14:40:00.149096 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566960-qqpz4" Mar 20 14:40:00 crc kubenswrapper[4973]: I0320 14:40:00.151472 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:40:00 crc kubenswrapper[4973]: I0320 14:40:00.151808 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:40:00 crc kubenswrapper[4973]: I0320 14:40:00.152189 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:40:00 crc kubenswrapper[4973]: I0320 14:40:00.161199 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566960-qqpz4"] Mar 20 14:40:00 crc kubenswrapper[4973]: I0320 14:40:00.259399 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nq7qc\" (UniqueName: \"kubernetes.io/projected/d315a1e7-d368-418e-b447-2d03147a9b7d-kube-api-access-nq7qc\") pod \"auto-csr-approver-29566960-qqpz4\" (UID: \"d315a1e7-d368-418e-b447-2d03147a9b7d\") " pod="openshift-infra/auto-csr-approver-29566960-qqpz4" Mar 20 14:40:00 crc kubenswrapper[4973]: I0320 14:40:00.362052 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nq7qc\" (UniqueName: \"kubernetes.io/projected/d315a1e7-d368-418e-b447-2d03147a9b7d-kube-api-access-nq7qc\") pod \"auto-csr-approver-29566960-qqpz4\" (UID: \"d315a1e7-d368-418e-b447-2d03147a9b7d\") " pod="openshift-infra/auto-csr-approver-29566960-qqpz4" Mar 20 14:40:00 crc kubenswrapper[4973]: I0320 14:40:00.417607 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nq7qc\" (UniqueName: \"kubernetes.io/projected/d315a1e7-d368-418e-b447-2d03147a9b7d-kube-api-access-nq7qc\") pod \"auto-csr-approver-29566960-qqpz4\" (UID: \"d315a1e7-d368-418e-b447-2d03147a9b7d\") " 
pod="openshift-infra/auto-csr-approver-29566960-qqpz4" Mar 20 14:40:00 crc kubenswrapper[4973]: I0320 14:40:00.486776 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566960-qqpz4" Mar 20 14:40:04 crc kubenswrapper[4973]: I0320 14:40:04.951098 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:40:04 crc kubenswrapper[4973]: E0320 14:40:04.951861 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:40:10 crc kubenswrapper[4973]: E0320 14:40:10.226015 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 20 14:40:10 crc kubenswrapper[4973]: E0320 14:40:10.231564 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bcqr9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:n
il,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(b5edf151-174a-4c18-b733-318653db1c6e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 14:40:10 crc kubenswrapper[4973]: E0320 14:40:10.232975 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" podUID="b5edf151-174a-4c18-b733-318653db1c6e" Mar 20 14:40:10 crc kubenswrapper[4973]: I0320 14:40:10.682179 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566960-qqpz4"] Mar 20 14:40:11 crc kubenswrapper[4973]: I0320 14:40:11.167367 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566960-qqpz4" 
event={"ID":"d315a1e7-d368-418e-b447-2d03147a9b7d","Type":"ContainerStarted","Data":"6569e4cffc42d319d244a7635d36e224ea055271ac19bdb12a24e555d37ace99"} Mar 20 14:40:11 crc kubenswrapper[4973]: E0320 14:40:11.169654 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="b5edf151-174a-4c18-b733-318653db1c6e" Mar 20 14:40:13 crc kubenswrapper[4973]: I0320 14:40:13.195130 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566960-qqpz4" event={"ID":"d315a1e7-d368-418e-b447-2d03147a9b7d","Type":"ContainerStarted","Data":"f0970ba010c2c95e8c50c06dbb38f7c37e24bedea49d93413f1568c0e2a5fc7c"} Mar 20 14:40:13 crc kubenswrapper[4973]: I0320 14:40:13.218184 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566960-qqpz4" podStartSLOduration=12.22526409 podStartE2EDuration="13.218166031s" podCreationTimestamp="2026-03-20 14:40:00 +0000 UTC" firstStartedPulling="2026-03-20 14:40:10.685180415 +0000 UTC m=+4731.428850159" lastFinishedPulling="2026-03-20 14:40:11.678082356 +0000 UTC m=+4732.421752100" observedRunningTime="2026-03-20 14:40:13.208083856 +0000 UTC m=+4733.951753600" watchObservedRunningTime="2026-03-20 14:40:13.218166031 +0000 UTC m=+4733.961835775" Mar 20 14:40:14 crc kubenswrapper[4973]: I0320 14:40:14.206259 4973 generic.go:334] "Generic (PLEG): container finished" podID="d315a1e7-d368-418e-b447-2d03147a9b7d" containerID="f0970ba010c2c95e8c50c06dbb38f7c37e24bedea49d93413f1568c0e2a5fc7c" exitCode=0 Mar 20 14:40:14 crc kubenswrapper[4973]: I0320 14:40:14.206355 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566960-qqpz4" 
event={"ID":"d315a1e7-d368-418e-b447-2d03147a9b7d","Type":"ContainerDied","Data":"f0970ba010c2c95e8c50c06dbb38f7c37e24bedea49d93413f1568c0e2a5fc7c"} Mar 20 14:40:15 crc kubenswrapper[4973]: I0320 14:40:15.806257 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566960-qqpz4" Mar 20 14:40:15 crc kubenswrapper[4973]: I0320 14:40:15.873373 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nq7qc\" (UniqueName: \"kubernetes.io/projected/d315a1e7-d368-418e-b447-2d03147a9b7d-kube-api-access-nq7qc\") pod \"d315a1e7-d368-418e-b447-2d03147a9b7d\" (UID: \"d315a1e7-d368-418e-b447-2d03147a9b7d\") " Mar 20 14:40:15 crc kubenswrapper[4973]: I0320 14:40:15.880285 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d315a1e7-d368-418e-b447-2d03147a9b7d-kube-api-access-nq7qc" (OuterVolumeSpecName: "kube-api-access-nq7qc") pod "d315a1e7-d368-418e-b447-2d03147a9b7d" (UID: "d315a1e7-d368-418e-b447-2d03147a9b7d"). InnerVolumeSpecName "kube-api-access-nq7qc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:40:15 crc kubenswrapper[4973]: I0320 14:40:15.976681 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nq7qc\" (UniqueName: \"kubernetes.io/projected/d315a1e7-d368-418e-b447-2d03147a9b7d-kube-api-access-nq7qc\") on node \"crc\" DevicePath \"\"" Mar 20 14:40:16 crc kubenswrapper[4973]: I0320 14:40:16.226390 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566960-qqpz4" event={"ID":"d315a1e7-d368-418e-b447-2d03147a9b7d","Type":"ContainerDied","Data":"6569e4cffc42d319d244a7635d36e224ea055271ac19bdb12a24e555d37ace99"} Mar 20 14:40:16 crc kubenswrapper[4973]: I0320 14:40:16.226435 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6569e4cffc42d319d244a7635d36e224ea055271ac19bdb12a24e555d37ace99" Mar 20 14:40:16 crc kubenswrapper[4973]: I0320 14:40:16.226443 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566960-qqpz4" Mar 20 14:40:16 crc kubenswrapper[4973]: I0320 14:40:16.464422 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566954-gw4dc"] Mar 20 14:40:16 crc kubenswrapper[4973]: I0320 14:40:16.474786 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566954-gw4dc"] Mar 20 14:40:17 crc kubenswrapper[4973]: I0320 14:40:17.961726 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749" Mar 20 14:40:17 crc kubenswrapper[4973]: I0320 14:40:17.987302 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b27d1e85-9415-48df-b36b-d508f108cae3" path="/var/lib/kubelet/pods/b27d1e85-9415-48df-b36b-d508f108cae3/volumes" Mar 20 14:40:18 crc kubenswrapper[4973]: I0320 14:40:18.254092 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"59cb6d1eb0dc10deddad9ed131c9312329e2744cabbd12799cb25f9f0fdea90e"} Mar 20 14:40:23 crc kubenswrapper[4973]: I0320 14:40:23.476673 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.623131 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bvrtk"] Mar 20 14:40:24 crc kubenswrapper[4973]: E0320 14:40:24.624319 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d315a1e7-d368-418e-b447-2d03147a9b7d" containerName="oc" Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.624359 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="d315a1e7-d368-418e-b447-2d03147a9b7d" containerName="oc" Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.624681 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="d315a1e7-d368-418e-b447-2d03147a9b7d" containerName="oc" Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.626812 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bvrtk" Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.645275 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvrtk"] Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.799902 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8h8n\" (UniqueName: \"kubernetes.io/projected/1dcfb4a8-951e-4354-9668-4dd3386cec76-kube-api-access-j8h8n\") pod \"redhat-marketplace-bvrtk\" (UID: \"1dcfb4a8-951e-4354-9668-4dd3386cec76\") " pod="openshift-marketplace/redhat-marketplace-bvrtk" Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.802992 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcfb4a8-951e-4354-9668-4dd3386cec76-utilities\") pod \"redhat-marketplace-bvrtk\" (UID: \"1dcfb4a8-951e-4354-9668-4dd3386cec76\") " pod="openshift-marketplace/redhat-marketplace-bvrtk" Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.803435 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcfb4a8-951e-4354-9668-4dd3386cec76-catalog-content\") pod \"redhat-marketplace-bvrtk\" (UID: \"1dcfb4a8-951e-4354-9668-4dd3386cec76\") " pod="openshift-marketplace/redhat-marketplace-bvrtk" Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.822815 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jf9n9"] Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.825406 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jf9n9" Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.843104 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jf9n9"] Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.906558 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cph2f\" (UniqueName: \"kubernetes.io/projected/686a6ce1-7178-44f3-b302-37dac8298893-kube-api-access-cph2f\") pod \"redhat-operators-jf9n9\" (UID: \"686a6ce1-7178-44f3-b302-37dac8298893\") " pod="openshift-marketplace/redhat-operators-jf9n9" Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.906626 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/686a6ce1-7178-44f3-b302-37dac8298893-utilities\") pod \"redhat-operators-jf9n9\" (UID: \"686a6ce1-7178-44f3-b302-37dac8298893\") " pod="openshift-marketplace/redhat-operators-jf9n9" Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.906785 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8h8n\" (UniqueName: \"kubernetes.io/projected/1dcfb4a8-951e-4354-9668-4dd3386cec76-kube-api-access-j8h8n\") pod \"redhat-marketplace-bvrtk\" (UID: \"1dcfb4a8-951e-4354-9668-4dd3386cec76\") " pod="openshift-marketplace/redhat-marketplace-bvrtk" Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.906853 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/686a6ce1-7178-44f3-b302-37dac8298893-catalog-content\") pod \"redhat-operators-jf9n9\" (UID: \"686a6ce1-7178-44f3-b302-37dac8298893\") " pod="openshift-marketplace/redhat-operators-jf9n9" Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.906946 4973 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcfb4a8-951e-4354-9668-4dd3386cec76-utilities\") pod \"redhat-marketplace-bvrtk\" (UID: \"1dcfb4a8-951e-4354-9668-4dd3386cec76\") " pod="openshift-marketplace/redhat-marketplace-bvrtk" Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.907153 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcfb4a8-951e-4354-9668-4dd3386cec76-catalog-content\") pod \"redhat-marketplace-bvrtk\" (UID: \"1dcfb4a8-951e-4354-9668-4dd3386cec76\") " pod="openshift-marketplace/redhat-marketplace-bvrtk" Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.907780 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcfb4a8-951e-4354-9668-4dd3386cec76-utilities\") pod \"redhat-marketplace-bvrtk\" (UID: \"1dcfb4a8-951e-4354-9668-4dd3386cec76\") " pod="openshift-marketplace/redhat-marketplace-bvrtk" Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.910169 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcfb4a8-951e-4354-9668-4dd3386cec76-catalog-content\") pod \"redhat-marketplace-bvrtk\" (UID: \"1dcfb4a8-951e-4354-9668-4dd3386cec76\") " pod="openshift-marketplace/redhat-marketplace-bvrtk" Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.931024 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8h8n\" (UniqueName: \"kubernetes.io/projected/1dcfb4a8-951e-4354-9668-4dd3386cec76-kube-api-access-j8h8n\") pod \"redhat-marketplace-bvrtk\" (UID: \"1dcfb4a8-951e-4354-9668-4dd3386cec76\") " pod="openshift-marketplace/redhat-marketplace-bvrtk" Mar 20 14:40:24 crc kubenswrapper[4973]: I0320 14:40:24.975218 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bvrtk" Mar 20 14:40:25 crc kubenswrapper[4973]: I0320 14:40:25.009871 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cph2f\" (UniqueName: \"kubernetes.io/projected/686a6ce1-7178-44f3-b302-37dac8298893-kube-api-access-cph2f\") pod \"redhat-operators-jf9n9\" (UID: \"686a6ce1-7178-44f3-b302-37dac8298893\") " pod="openshift-marketplace/redhat-operators-jf9n9" Mar 20 14:40:25 crc kubenswrapper[4973]: I0320 14:40:25.009947 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/686a6ce1-7178-44f3-b302-37dac8298893-utilities\") pod \"redhat-operators-jf9n9\" (UID: \"686a6ce1-7178-44f3-b302-37dac8298893\") " pod="openshift-marketplace/redhat-operators-jf9n9" Mar 20 14:40:25 crc kubenswrapper[4973]: I0320 14:40:25.010040 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/686a6ce1-7178-44f3-b302-37dac8298893-catalog-content\") pod \"redhat-operators-jf9n9\" (UID: \"686a6ce1-7178-44f3-b302-37dac8298893\") " pod="openshift-marketplace/redhat-operators-jf9n9" Mar 20 14:40:25 crc kubenswrapper[4973]: I0320 14:40:25.010576 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/686a6ce1-7178-44f3-b302-37dac8298893-catalog-content\") pod \"redhat-operators-jf9n9\" (UID: \"686a6ce1-7178-44f3-b302-37dac8298893\") " pod="openshift-marketplace/redhat-operators-jf9n9" Mar 20 14:40:25 crc kubenswrapper[4973]: I0320 14:40:25.011944 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/686a6ce1-7178-44f3-b302-37dac8298893-utilities\") pod \"redhat-operators-jf9n9\" (UID: \"686a6ce1-7178-44f3-b302-37dac8298893\") " 
pod="openshift-marketplace/redhat-operators-jf9n9" Mar 20 14:40:25 crc kubenswrapper[4973]: I0320 14:40:25.047876 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cph2f\" (UniqueName: \"kubernetes.io/projected/686a6ce1-7178-44f3-b302-37dac8298893-kube-api-access-cph2f\") pod \"redhat-operators-jf9n9\" (UID: \"686a6ce1-7178-44f3-b302-37dac8298893\") " pod="openshift-marketplace/redhat-operators-jf9n9" Mar 20 14:40:25 crc kubenswrapper[4973]: I0320 14:40:25.146898 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jf9n9" Mar 20 14:40:25 crc kubenswrapper[4973]: I0320 14:40:25.645087 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvrtk"] Mar 20 14:40:25 crc kubenswrapper[4973]: W0320 14:40:25.653023 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dcfb4a8_951e_4354_9668_4dd3386cec76.slice/crio-a6ac1cecabe707e1d4b01ad37dfcb693155a39dc648c72df4dd3f531170a63ef WatchSource:0}: Error finding container a6ac1cecabe707e1d4b01ad37dfcb693155a39dc648c72df4dd3f531170a63ef: Status 404 returned error can't find the container with id a6ac1cecabe707e1d4b01ad37dfcb693155a39dc648c72df4dd3f531170a63ef Mar 20 14:40:25 crc kubenswrapper[4973]: W0320 14:40:25.856617 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod686a6ce1_7178_44f3_b302_37dac8298893.slice/crio-4c91a0381bec599758fce448dec05b8f05e13c8f9fdb76a9f9a51caa0ae41022 WatchSource:0}: Error finding container 4c91a0381bec599758fce448dec05b8f05e13c8f9fdb76a9f9a51caa0ae41022: Status 404 returned error can't find the container with id 4c91a0381bec599758fce448dec05b8f05e13c8f9fdb76a9f9a51caa0ae41022 Mar 20 14:40:25 crc kubenswrapper[4973]: I0320 14:40:25.858006 4973 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/redhat-operators-jf9n9"] Mar 20 14:40:26 crc kubenswrapper[4973]: I0320 14:40:26.363163 4973 generic.go:334] "Generic (PLEG): container finished" podID="1dcfb4a8-951e-4354-9668-4dd3386cec76" containerID="3931070d1367568585a9eac6c878597c6e48400de0a99e235ac80806c1bebf7c" exitCode=0 Mar 20 14:40:26 crc kubenswrapper[4973]: I0320 14:40:26.363256 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvrtk" event={"ID":"1dcfb4a8-951e-4354-9668-4dd3386cec76","Type":"ContainerDied","Data":"3931070d1367568585a9eac6c878597c6e48400de0a99e235ac80806c1bebf7c"} Mar 20 14:40:26 crc kubenswrapper[4973]: I0320 14:40:26.363292 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvrtk" event={"ID":"1dcfb4a8-951e-4354-9668-4dd3386cec76","Type":"ContainerStarted","Data":"a6ac1cecabe707e1d4b01ad37dfcb693155a39dc648c72df4dd3f531170a63ef"} Mar 20 14:40:26 crc kubenswrapper[4973]: I0320 14:40:26.365723 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b5edf151-174a-4c18-b733-318653db1c6e","Type":"ContainerStarted","Data":"fe8b8eb04a04362d2f5f7d4c168a3ae188ac63e780f21c34b9cabddcd2073d2a"} Mar 20 14:40:26 crc kubenswrapper[4973]: I0320 14:40:26.368022 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jf9n9" event={"ID":"686a6ce1-7178-44f3-b302-37dac8298893","Type":"ContainerStarted","Data":"4c91a0381bec599758fce448dec05b8f05e13c8f9fdb76a9f9a51caa0ae41022"} Mar 20 14:40:26 crc kubenswrapper[4973]: I0320 14:40:26.422185 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=5.369640664 podStartE2EDuration="57.422166617s" podCreationTimestamp="2026-03-20 14:39:29 +0000 UTC" firstStartedPulling="2026-03-20 14:39:31.42119594 +0000 UTC m=+4692.164865684" 
lastFinishedPulling="2026-03-20 14:40:23.473721893 +0000 UTC m=+4744.217391637" observedRunningTime="2026-03-20 14:40:26.410264093 +0000 UTC m=+4747.153933837" watchObservedRunningTime="2026-03-20 14:40:26.422166617 +0000 UTC m=+4747.165836361" Mar 20 14:40:27 crc kubenswrapper[4973]: I0320 14:40:27.381261 4973 generic.go:334] "Generic (PLEG): container finished" podID="686a6ce1-7178-44f3-b302-37dac8298893" containerID="dc585d0d22a1956905b38c92586364a6e333b00480bde73baef0050e4fb2afc1" exitCode=0 Mar 20 14:40:27 crc kubenswrapper[4973]: I0320 14:40:27.381458 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jf9n9" event={"ID":"686a6ce1-7178-44f3-b302-37dac8298893","Type":"ContainerDied","Data":"dc585d0d22a1956905b38c92586364a6e333b00480bde73baef0050e4fb2afc1"} Mar 20 14:40:27 crc kubenswrapper[4973]: I0320 14:40:27.384849 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvrtk" event={"ID":"1dcfb4a8-951e-4354-9668-4dd3386cec76","Type":"ContainerStarted","Data":"324fb6bf4afdbd7e2fe5e31bdc01d86b42d0805cd67be86e3ce38c0d2bc91408"} Mar 20 14:40:29 crc kubenswrapper[4973]: I0320 14:40:29.408326 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jf9n9" event={"ID":"686a6ce1-7178-44f3-b302-37dac8298893","Type":"ContainerStarted","Data":"e9e8ddae97a91531c827f7ee6d60248350d47a7dc2274fbb594db1b6b3061088"} Mar 20 14:40:29 crc kubenswrapper[4973]: I0320 14:40:29.424122 4973 generic.go:334] "Generic (PLEG): container finished" podID="1dcfb4a8-951e-4354-9668-4dd3386cec76" containerID="324fb6bf4afdbd7e2fe5e31bdc01d86b42d0805cd67be86e3ce38c0d2bc91408" exitCode=0 Mar 20 14:40:29 crc kubenswrapper[4973]: I0320 14:40:29.424190 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvrtk" 
event={"ID":"1dcfb4a8-951e-4354-9668-4dd3386cec76","Type":"ContainerDied","Data":"324fb6bf4afdbd7e2fe5e31bdc01d86b42d0805cd67be86e3ce38c0d2bc91408"} Mar 20 14:40:30 crc kubenswrapper[4973]: I0320 14:40:30.443111 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvrtk" event={"ID":"1dcfb4a8-951e-4354-9668-4dd3386cec76","Type":"ContainerStarted","Data":"6f8dbfaa13262987ec33d3372485ed2b2fc1a1b4f5ad9e36445908f2e7c7ac10"} Mar 20 14:40:30 crc kubenswrapper[4973]: I0320 14:40:30.467623 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bvrtk" podStartSLOduration=2.890406643 podStartE2EDuration="6.467604574s" podCreationTimestamp="2026-03-20 14:40:24 +0000 UTC" firstStartedPulling="2026-03-20 14:40:26.365014751 +0000 UTC m=+4747.108684495" lastFinishedPulling="2026-03-20 14:40:29.942212682 +0000 UTC m=+4750.685882426" observedRunningTime="2026-03-20 14:40:30.46047867 +0000 UTC m=+4751.204148414" watchObservedRunningTime="2026-03-20 14:40:30.467604574 +0000 UTC m=+4751.211274318" Mar 20 14:40:34 crc kubenswrapper[4973]: I0320 14:40:34.976008 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bvrtk" Mar 20 14:40:34 crc kubenswrapper[4973]: I0320 14:40:34.976670 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bvrtk" Mar 20 14:40:35 crc kubenswrapper[4973]: I0320 14:40:35.491984 4973 generic.go:334] "Generic (PLEG): container finished" podID="686a6ce1-7178-44f3-b302-37dac8298893" containerID="e9e8ddae97a91531c827f7ee6d60248350d47a7dc2274fbb594db1b6b3061088" exitCode=0 Mar 20 14:40:35 crc kubenswrapper[4973]: I0320 14:40:35.492070 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jf9n9" 
event={"ID":"686a6ce1-7178-44f3-b302-37dac8298893","Type":"ContainerDied","Data":"e9e8ddae97a91531c827f7ee6d60248350d47a7dc2274fbb594db1b6b3061088"} Mar 20 14:40:36 crc kubenswrapper[4973]: I0320 14:40:36.031446 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-bvrtk" podUID="1dcfb4a8-951e-4354-9668-4dd3386cec76" containerName="registry-server" probeResult="failure" output=< Mar 20 14:40:36 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:40:36 crc kubenswrapper[4973]: > Mar 20 14:40:36 crc kubenswrapper[4973]: I0320 14:40:36.505622 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jf9n9" event={"ID":"686a6ce1-7178-44f3-b302-37dac8298893","Type":"ContainerStarted","Data":"f73122d8bf92463f4e12f0a2d98f4cd530a3341831913192ac587e7b08c2a756"} Mar 20 14:40:36 crc kubenswrapper[4973]: I0320 14:40:36.528594 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jf9n9" podStartSLOduration=3.987654304 podStartE2EDuration="12.528573629s" podCreationTimestamp="2026-03-20 14:40:24 +0000 UTC" firstStartedPulling="2026-03-20 14:40:27.382900671 +0000 UTC m=+4748.126570415" lastFinishedPulling="2026-03-20 14:40:35.923819996 +0000 UTC m=+4756.667489740" observedRunningTime="2026-03-20 14:40:36.521576568 +0000 UTC m=+4757.265246312" watchObservedRunningTime="2026-03-20 14:40:36.528573629 +0000 UTC m=+4757.272243373" Mar 20 14:40:45 crc kubenswrapper[4973]: I0320 14:40:45.025014 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bvrtk" Mar 20 14:40:45 crc kubenswrapper[4973]: I0320 14:40:45.077903 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bvrtk" Mar 20 14:40:45 crc kubenswrapper[4973]: I0320 14:40:45.147934 4973 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jf9n9" Mar 20 14:40:45 crc kubenswrapper[4973]: I0320 14:40:45.148277 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jf9n9" Mar 20 14:40:45 crc kubenswrapper[4973]: I0320 14:40:45.398595 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvrtk"] Mar 20 14:40:46 crc kubenswrapper[4973]: I0320 14:40:46.203085 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jf9n9" podUID="686a6ce1-7178-44f3-b302-37dac8298893" containerName="registry-server" probeResult="failure" output=< Mar 20 14:40:46 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:40:46 crc kubenswrapper[4973]: > Mar 20 14:40:46 crc kubenswrapper[4973]: I0320 14:40:46.607656 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-bvrtk" podUID="1dcfb4a8-951e-4354-9668-4dd3386cec76" containerName="registry-server" containerID="cri-o://6f8dbfaa13262987ec33d3372485ed2b2fc1a1b4f5ad9e36445908f2e7c7ac10" gracePeriod=2 Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.265766 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bvrtk" Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.438450 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcfb4a8-951e-4354-9668-4dd3386cec76-catalog-content\") pod \"1dcfb4a8-951e-4354-9668-4dd3386cec76\" (UID: \"1dcfb4a8-951e-4354-9668-4dd3386cec76\") " Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.438616 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcfb4a8-951e-4354-9668-4dd3386cec76-utilities\") pod \"1dcfb4a8-951e-4354-9668-4dd3386cec76\" (UID: \"1dcfb4a8-951e-4354-9668-4dd3386cec76\") " Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.438705 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8h8n\" (UniqueName: \"kubernetes.io/projected/1dcfb4a8-951e-4354-9668-4dd3386cec76-kube-api-access-j8h8n\") pod \"1dcfb4a8-951e-4354-9668-4dd3386cec76\" (UID: \"1dcfb4a8-951e-4354-9668-4dd3386cec76\") " Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.439249 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dcfb4a8-951e-4354-9668-4dd3386cec76-utilities" (OuterVolumeSpecName: "utilities") pod "1dcfb4a8-951e-4354-9668-4dd3386cec76" (UID: "1dcfb4a8-951e-4354-9668-4dd3386cec76"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.447717 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1dcfb4a8-951e-4354-9668-4dd3386cec76-kube-api-access-j8h8n" (OuterVolumeSpecName: "kube-api-access-j8h8n") pod "1dcfb4a8-951e-4354-9668-4dd3386cec76" (UID: "1dcfb4a8-951e-4354-9668-4dd3386cec76"). InnerVolumeSpecName "kube-api-access-j8h8n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.465800 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1dcfb4a8-951e-4354-9668-4dd3386cec76-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1dcfb4a8-951e-4354-9668-4dd3386cec76" (UID: "1dcfb4a8-951e-4354-9668-4dd3386cec76"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.541821 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1dcfb4a8-951e-4354-9668-4dd3386cec76-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.541863 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j8h8n\" (UniqueName: \"kubernetes.io/projected/1dcfb4a8-951e-4354-9668-4dd3386cec76-kube-api-access-j8h8n\") on node \"crc\" DevicePath \"\"" Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.541879 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1dcfb4a8-951e-4354-9668-4dd3386cec76-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.618848 4973 generic.go:334] "Generic (PLEG): container finished" podID="1dcfb4a8-951e-4354-9668-4dd3386cec76" containerID="6f8dbfaa13262987ec33d3372485ed2b2fc1a1b4f5ad9e36445908f2e7c7ac10" exitCode=0 Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.619058 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bvrtk" event={"ID":"1dcfb4a8-951e-4354-9668-4dd3386cec76","Type":"ContainerDied","Data":"6f8dbfaa13262987ec33d3372485ed2b2fc1a1b4f5ad9e36445908f2e7c7ac10"} Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.619246 4973 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-bvrtk" event={"ID":"1dcfb4a8-951e-4354-9668-4dd3386cec76","Type":"ContainerDied","Data":"a6ac1cecabe707e1d4b01ad37dfcb693155a39dc648c72df4dd3f531170a63ef"} Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.619125 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bvrtk" Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.619321 4973 scope.go:117] "RemoveContainer" containerID="6f8dbfaa13262987ec33d3372485ed2b2fc1a1b4f5ad9e36445908f2e7c7ac10" Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.643572 4973 scope.go:117] "RemoveContainer" containerID="324fb6bf4afdbd7e2fe5e31bdc01d86b42d0805cd67be86e3ce38c0d2bc91408" Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.666952 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvrtk"] Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.678979 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-bvrtk"] Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.688616 4973 scope.go:117] "RemoveContainer" containerID="3931070d1367568585a9eac6c878597c6e48400de0a99e235ac80806c1bebf7c" Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.734942 4973 scope.go:117] "RemoveContainer" containerID="6f8dbfaa13262987ec33d3372485ed2b2fc1a1b4f5ad9e36445908f2e7c7ac10" Mar 20 14:40:47 crc kubenswrapper[4973]: E0320 14:40:47.735710 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f8dbfaa13262987ec33d3372485ed2b2fc1a1b4f5ad9e36445908f2e7c7ac10\": container with ID starting with 6f8dbfaa13262987ec33d3372485ed2b2fc1a1b4f5ad9e36445908f2e7c7ac10 not found: ID does not exist" containerID="6f8dbfaa13262987ec33d3372485ed2b2fc1a1b4f5ad9e36445908f2e7c7ac10" Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.735769 4973 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f8dbfaa13262987ec33d3372485ed2b2fc1a1b4f5ad9e36445908f2e7c7ac10"} err="failed to get container status \"6f8dbfaa13262987ec33d3372485ed2b2fc1a1b4f5ad9e36445908f2e7c7ac10\": rpc error: code = NotFound desc = could not find container \"6f8dbfaa13262987ec33d3372485ed2b2fc1a1b4f5ad9e36445908f2e7c7ac10\": container with ID starting with 6f8dbfaa13262987ec33d3372485ed2b2fc1a1b4f5ad9e36445908f2e7c7ac10 not found: ID does not exist" Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.735804 4973 scope.go:117] "RemoveContainer" containerID="324fb6bf4afdbd7e2fe5e31bdc01d86b42d0805cd67be86e3ce38c0d2bc91408" Mar 20 14:40:47 crc kubenswrapper[4973]: E0320 14:40:47.736312 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"324fb6bf4afdbd7e2fe5e31bdc01d86b42d0805cd67be86e3ce38c0d2bc91408\": container with ID starting with 324fb6bf4afdbd7e2fe5e31bdc01d86b42d0805cd67be86e3ce38c0d2bc91408 not found: ID does not exist" containerID="324fb6bf4afdbd7e2fe5e31bdc01d86b42d0805cd67be86e3ce38c0d2bc91408" Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.736340 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"324fb6bf4afdbd7e2fe5e31bdc01d86b42d0805cd67be86e3ce38c0d2bc91408"} err="failed to get container status \"324fb6bf4afdbd7e2fe5e31bdc01d86b42d0805cd67be86e3ce38c0d2bc91408\": rpc error: code = NotFound desc = could not find container \"324fb6bf4afdbd7e2fe5e31bdc01d86b42d0805cd67be86e3ce38c0d2bc91408\": container with ID starting with 324fb6bf4afdbd7e2fe5e31bdc01d86b42d0805cd67be86e3ce38c0d2bc91408 not found: ID does not exist" Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.736374 4973 scope.go:117] "RemoveContainer" containerID="3931070d1367568585a9eac6c878597c6e48400de0a99e235ac80806c1bebf7c" Mar 20 14:40:47 crc kubenswrapper[4973]: E0320 
14:40:47.736894 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3931070d1367568585a9eac6c878597c6e48400de0a99e235ac80806c1bebf7c\": container with ID starting with 3931070d1367568585a9eac6c878597c6e48400de0a99e235ac80806c1bebf7c not found: ID does not exist" containerID="3931070d1367568585a9eac6c878597c6e48400de0a99e235ac80806c1bebf7c" Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.736921 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3931070d1367568585a9eac6c878597c6e48400de0a99e235ac80806c1bebf7c"} err="failed to get container status \"3931070d1367568585a9eac6c878597c6e48400de0a99e235ac80806c1bebf7c\": rpc error: code = NotFound desc = could not find container \"3931070d1367568585a9eac6c878597c6e48400de0a99e235ac80806c1bebf7c\": container with ID starting with 3931070d1367568585a9eac6c878597c6e48400de0a99e235ac80806c1bebf7c not found: ID does not exist" Mar 20 14:40:47 crc kubenswrapper[4973]: I0320 14:40:47.963736 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1dcfb4a8-951e-4354-9668-4dd3386cec76" path="/var/lib/kubelet/pods/1dcfb4a8-951e-4354-9668-4dd3386cec76/volumes" Mar 20 14:40:56 crc kubenswrapper[4973]: I0320 14:40:56.197881 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jf9n9" podUID="686a6ce1-7178-44f3-b302-37dac8298893" containerName="registry-server" probeResult="failure" output=< Mar 20 14:40:56 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:40:56 crc kubenswrapper[4973]: > Mar 20 14:41:04 crc kubenswrapper[4973]: I0320 14:41:04.850760 4973 scope.go:117] "RemoveContainer" containerID="04cbc45e5781779a7aab007341507ac872ecd83d2a8ca07a56c43941e229d75c" Mar 20 14:41:06 crc kubenswrapper[4973]: I0320 14:41:06.213532 4973 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-jf9n9" podUID="686a6ce1-7178-44f3-b302-37dac8298893" containerName="registry-server" probeResult="failure" output=< Mar 20 14:41:06 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:41:06 crc kubenswrapper[4973]: > Mar 20 14:41:16 crc kubenswrapper[4973]: I0320 14:41:16.240434 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jf9n9" podUID="686a6ce1-7178-44f3-b302-37dac8298893" containerName="registry-server" probeResult="failure" output=< Mar 20 14:41:16 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:41:16 crc kubenswrapper[4973]: > Mar 20 14:41:26 crc kubenswrapper[4973]: I0320 14:41:26.278489 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jf9n9" podUID="686a6ce1-7178-44f3-b302-37dac8298893" containerName="registry-server" probeResult="failure" output=< Mar 20 14:41:26 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:41:26 crc kubenswrapper[4973]: > Mar 20 14:41:36 crc kubenswrapper[4973]: I0320 14:41:36.213514 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-jf9n9" podUID="686a6ce1-7178-44f3-b302-37dac8298893" containerName="registry-server" probeResult="failure" output=< Mar 20 14:41:36 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:41:36 crc kubenswrapper[4973]: > Mar 20 14:41:45 crc kubenswrapper[4973]: I0320 14:41:45.394155 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jf9n9" Mar 20 14:41:45 crc kubenswrapper[4973]: I0320 14:41:45.453221 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jf9n9" Mar 20 14:41:46 crc kubenswrapper[4973]: I0320 14:41:46.443104 4973 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jf9n9"] Mar 20 14:41:47 crc kubenswrapper[4973]: I0320 14:41:47.307534 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jf9n9" podUID="686a6ce1-7178-44f3-b302-37dac8298893" containerName="registry-server" containerID="cri-o://f73122d8bf92463f4e12f0a2d98f4cd530a3341831913192ac587e7b08c2a756" gracePeriod=2 Mar 20 14:41:48 crc kubenswrapper[4973]: I0320 14:41:48.313792 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jf9n9" event={"ID":"686a6ce1-7178-44f3-b302-37dac8298893","Type":"ContainerDied","Data":"f73122d8bf92463f4e12f0a2d98f4cd530a3341831913192ac587e7b08c2a756"} Mar 20 14:41:48 crc kubenswrapper[4973]: I0320 14:41:48.313700 4973 generic.go:334] "Generic (PLEG): container finished" podID="686a6ce1-7178-44f3-b302-37dac8298893" containerID="f73122d8bf92463f4e12f0a2d98f4cd530a3341831913192ac587e7b08c2a756" exitCode=0 Mar 20 14:41:48 crc kubenswrapper[4973]: I0320 14:41:48.873409 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jf9n9" Mar 20 14:41:49 crc kubenswrapper[4973]: I0320 14:41:49.051991 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/686a6ce1-7178-44f3-b302-37dac8298893-utilities\") pod \"686a6ce1-7178-44f3-b302-37dac8298893\" (UID: \"686a6ce1-7178-44f3-b302-37dac8298893\") " Mar 20 14:41:49 crc kubenswrapper[4973]: I0320 14:41:49.052156 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/686a6ce1-7178-44f3-b302-37dac8298893-catalog-content\") pod \"686a6ce1-7178-44f3-b302-37dac8298893\" (UID: \"686a6ce1-7178-44f3-b302-37dac8298893\") " Mar 20 14:41:49 crc kubenswrapper[4973]: I0320 14:41:49.052190 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cph2f\" (UniqueName: \"kubernetes.io/projected/686a6ce1-7178-44f3-b302-37dac8298893-kube-api-access-cph2f\") pod \"686a6ce1-7178-44f3-b302-37dac8298893\" (UID: \"686a6ce1-7178-44f3-b302-37dac8298893\") " Mar 20 14:41:49 crc kubenswrapper[4973]: I0320 14:41:49.061796 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/686a6ce1-7178-44f3-b302-37dac8298893-utilities" (OuterVolumeSpecName: "utilities") pod "686a6ce1-7178-44f3-b302-37dac8298893" (UID: "686a6ce1-7178-44f3-b302-37dac8298893"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:41:49 crc kubenswrapper[4973]: I0320 14:41:49.098472 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/686a6ce1-7178-44f3-b302-37dac8298893-kube-api-access-cph2f" (OuterVolumeSpecName: "kube-api-access-cph2f") pod "686a6ce1-7178-44f3-b302-37dac8298893" (UID: "686a6ce1-7178-44f3-b302-37dac8298893"). InnerVolumeSpecName "kube-api-access-cph2f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:41:49 crc kubenswrapper[4973]: I0320 14:41:49.156593 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/686a6ce1-7178-44f3-b302-37dac8298893-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:41:49 crc kubenswrapper[4973]: I0320 14:41:49.156635 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cph2f\" (UniqueName: \"kubernetes.io/projected/686a6ce1-7178-44f3-b302-37dac8298893-kube-api-access-cph2f\") on node \"crc\" DevicePath \"\"" Mar 20 14:41:49 crc kubenswrapper[4973]: I0320 14:41:49.250500 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/686a6ce1-7178-44f3-b302-37dac8298893-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "686a6ce1-7178-44f3-b302-37dac8298893" (UID: "686a6ce1-7178-44f3-b302-37dac8298893"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:41:49 crc kubenswrapper[4973]: I0320 14:41:49.259481 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/686a6ce1-7178-44f3-b302-37dac8298893-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:41:49 crc kubenswrapper[4973]: I0320 14:41:49.328988 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jf9n9" event={"ID":"686a6ce1-7178-44f3-b302-37dac8298893","Type":"ContainerDied","Data":"4c91a0381bec599758fce448dec05b8f05e13c8f9fdb76a9f9a51caa0ae41022"} Mar 20 14:41:49 crc kubenswrapper[4973]: I0320 14:41:49.329082 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jf9n9" Mar 20 14:41:49 crc kubenswrapper[4973]: I0320 14:41:49.333292 4973 scope.go:117] "RemoveContainer" containerID="f73122d8bf92463f4e12f0a2d98f4cd530a3341831913192ac587e7b08c2a756" Mar 20 14:41:49 crc kubenswrapper[4973]: I0320 14:41:49.368500 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jf9n9"] Mar 20 14:41:49 crc kubenswrapper[4973]: I0320 14:41:49.380856 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jf9n9"] Mar 20 14:41:49 crc kubenswrapper[4973]: I0320 14:41:49.382036 4973 scope.go:117] "RemoveContainer" containerID="e9e8ddae97a91531c827f7ee6d60248350d47a7dc2274fbb594db1b6b3061088" Mar 20 14:41:49 crc kubenswrapper[4973]: I0320 14:41:49.410876 4973 scope.go:117] "RemoveContainer" containerID="dc585d0d22a1956905b38c92586364a6e333b00480bde73baef0050e4fb2afc1" Mar 20 14:41:49 crc kubenswrapper[4973]: I0320 14:41:49.967848 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="686a6ce1-7178-44f3-b302-37dac8298893" path="/var/lib/kubelet/pods/686a6ce1-7178-44f3-b302-37dac8298893/volumes" Mar 20 14:41:57 crc kubenswrapper[4973]: I0320 14:41:57.241451 4973 patch_prober.go:28] interesting pod/monitoring-plugin-5f8fb459cf-n7bzv container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.88:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:41:57 crc kubenswrapper[4973]: I0320 14:41:57.245127 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-5f8fb459cf-n7bzv" podUID="a8d216d7-808c-4312-a314-729e369ee963" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.88:9443/health\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" Mar 20 14:41:57 crc kubenswrapper[4973]: I0320 14:41:57.367592 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" podUID="3c89c7dd-500b-4bd5-a30e-273c2a485728" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:41:57 crc kubenswrapper[4973]: I0320 14:41:57.408559 4973 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-z2rch container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:41:57 crc kubenswrapper[4973]: I0320 14:41:57.408617 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" podUID="925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.16:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:41:57 crc kubenswrapper[4973]: I0320 14:41:57.408562 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" podUID="3c89c7dd-500b-4bd5-a30e-273c2a485728" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:41:57 crc kubenswrapper[4973]: I0320 14:41:57.408634 4973 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-z2rch container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure 
output="Get \"https://10.217.0.16:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:41:57 crc kubenswrapper[4973]: I0320 14:41:57.408753 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" podUID="925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.16:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:41:57 crc kubenswrapper[4973]: I0320 14:41:57.785440 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="27fed14c-9051-4d46-80d5-badf224805a9" containerName="galera" probeResult="failure" output="command timed out" Mar 20 14:41:57 crc kubenswrapper[4973]: I0320 14:41:57.785713 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="27fed14c-9051-4d46-80d5-badf224805a9" containerName="galera" probeResult="failure" output="command timed out" Mar 20 14:41:57 crc kubenswrapper[4973]: I0320 14:41:57.934523 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-cgmjd" podUID="75fc4720-ae9c-4ae5-8e4c-7c9a800f5478" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:41:59 crc kubenswrapper[4973]: I0320 14:41:59.852418 4973 patch_prober.go:28] interesting pod/oauth-openshift-567cd76c58-zvtsd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.71:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:41:59 crc kubenswrapper[4973]: I0320 14:41:59.852527 4973 
patch_prober.go:28] interesting pod/oauth-openshift-567cd76c58-zvtsd container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.71:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:41:59 crc kubenswrapper[4973]: I0320 14:41:59.853038 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" podUID="fcfe2a5b-dd17-425e-8024-655606a1c470" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.71:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:41:59 crc kubenswrapper[4973]: I0320 14:41:59.853096 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" podUID="fcfe2a5b-dd17-425e-8024-655606a1c470" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.71:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:00 crc kubenswrapper[4973]: I0320 14:42:00.944594 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566962-2sqjf"] Mar 20 14:42:00 crc kubenswrapper[4973]: E0320 14:42:00.947010 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcfb4a8-951e-4354-9668-4dd3386cec76" containerName="extract-content" Mar 20 14:42:00 crc kubenswrapper[4973]: I0320 14:42:00.947031 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcfb4a8-951e-4354-9668-4dd3386cec76" containerName="extract-content" Mar 20 14:42:00 crc kubenswrapper[4973]: E0320 14:42:00.947400 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcfb4a8-951e-4354-9668-4dd3386cec76" containerName="extract-utilities" Mar 20 
14:42:00 crc kubenswrapper[4973]: I0320 14:42:00.947431 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcfb4a8-951e-4354-9668-4dd3386cec76" containerName="extract-utilities" Mar 20 14:42:00 crc kubenswrapper[4973]: E0320 14:42:00.947441 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1dcfb4a8-951e-4354-9668-4dd3386cec76" containerName="registry-server" Mar 20 14:42:00 crc kubenswrapper[4973]: I0320 14:42:00.947447 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="1dcfb4a8-951e-4354-9668-4dd3386cec76" containerName="registry-server" Mar 20 14:42:00 crc kubenswrapper[4973]: E0320 14:42:00.947466 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686a6ce1-7178-44f3-b302-37dac8298893" containerName="extract-content" Mar 20 14:42:00 crc kubenswrapper[4973]: I0320 14:42:00.947472 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="686a6ce1-7178-44f3-b302-37dac8298893" containerName="extract-content" Mar 20 14:42:00 crc kubenswrapper[4973]: E0320 14:42:00.947491 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686a6ce1-7178-44f3-b302-37dac8298893" containerName="registry-server" Mar 20 14:42:00 crc kubenswrapper[4973]: I0320 14:42:00.947496 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="686a6ce1-7178-44f3-b302-37dac8298893" containerName="registry-server" Mar 20 14:42:00 crc kubenswrapper[4973]: E0320 14:42:00.947525 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="686a6ce1-7178-44f3-b302-37dac8298893" containerName="extract-utilities" Mar 20 14:42:00 crc kubenswrapper[4973]: I0320 14:42:00.947531 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="686a6ce1-7178-44f3-b302-37dac8298893" containerName="extract-utilities" Mar 20 14:42:00 crc kubenswrapper[4973]: I0320 14:42:00.948117 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="1dcfb4a8-951e-4354-9668-4dd3386cec76" containerName="registry-server" Mar 20 14:42:00 
crc kubenswrapper[4973]: I0320 14:42:00.948134 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="686a6ce1-7178-44f3-b302-37dac8298893" containerName="registry-server" Mar 20 14:42:00 crc kubenswrapper[4973]: I0320 14:42:00.955163 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566962-2sqjf" Mar 20 14:42:01 crc kubenswrapper[4973]: I0320 14:42:01.001053 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:42:01 crc kubenswrapper[4973]: I0320 14:42:01.001059 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:42:01 crc kubenswrapper[4973]: I0320 14:42:01.001060 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:42:01 crc kubenswrapper[4973]: I0320 14:42:01.052957 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpgzk\" (UniqueName: \"kubernetes.io/projected/46a36df3-9f0c-4e6a-b959-5542da9c7534-kube-api-access-gpgzk\") pod \"auto-csr-approver-29566962-2sqjf\" (UID: \"46a36df3-9f0c-4e6a-b959-5542da9c7534\") " pod="openshift-infra/auto-csr-approver-29566962-2sqjf" Mar 20 14:42:01 crc kubenswrapper[4973]: I0320 14:42:01.155707 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpgzk\" (UniqueName: \"kubernetes.io/projected/46a36df3-9f0c-4e6a-b959-5542da9c7534-kube-api-access-gpgzk\") pod \"auto-csr-approver-29566962-2sqjf\" (UID: \"46a36df3-9f0c-4e6a-b959-5542da9c7534\") " pod="openshift-infra/auto-csr-approver-29566962-2sqjf" Mar 20 14:42:01 crc kubenswrapper[4973]: I0320 14:42:01.230995 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566962-2sqjf"] Mar 20 14:42:01 crc kubenswrapper[4973]: I0320 
14:42:01.249632 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpgzk\" (UniqueName: \"kubernetes.io/projected/46a36df3-9f0c-4e6a-b959-5542da9c7534-kube-api-access-gpgzk\") pod \"auto-csr-approver-29566962-2sqjf\" (UID: \"46a36df3-9f0c-4e6a-b959-5542da9c7534\") " pod="openshift-infra/auto-csr-approver-29566962-2sqjf" Mar 20 14:42:01 crc kubenswrapper[4973]: I0320 14:42:01.339546 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566962-2sqjf" Mar 20 14:42:02 crc kubenswrapper[4973]: I0320 14:42:02.984109 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566962-2sqjf"] Mar 20 14:42:03 crc kubenswrapper[4973]: I0320 14:42:03.017008 4973 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:42:03 crc kubenswrapper[4973]: I0320 14:42:03.521288 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566962-2sqjf" event={"ID":"46a36df3-9f0c-4e6a-b959-5542da9c7534","Type":"ContainerStarted","Data":"b670a7367358d5bcfe0c87ee2f7299bd07ba0ced252654812117a2d8b3d687e3"} Mar 20 14:42:05 crc kubenswrapper[4973]: I0320 14:42:05.544597 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566962-2sqjf" event={"ID":"46a36df3-9f0c-4e6a-b959-5542da9c7534","Type":"ContainerStarted","Data":"cd991d7527d172dd7c86e149b200189738b1b0a10c0503f34f08c3e440cfb9ee"} Mar 20 14:42:06 crc kubenswrapper[4973]: I0320 14:42:06.930945 4973 patch_prober.go:28] interesting pod/metrics-server-55f4d8dbbb-bmckj container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.87:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:06 crc kubenswrapper[4973]: I0320 14:42:06.930954 4973 patch_prober.go:28] 
interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:06 crc kubenswrapper[4973]: I0320 14:42:06.932074 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:06 crc kubenswrapper[4973]: I0320 14:42:06.932163 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" podUID="57c75085-cf04-49d6-8b97-902e00c0efd5" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.87:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:06 crc kubenswrapper[4973]: I0320 14:42:06.931029 4973 patch_prober.go:28] interesting pod/metrics-server-55f4d8dbbb-bmckj container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.87:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:06 crc kubenswrapper[4973]: I0320 14:42:06.932527 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" podUID="57c75085-cf04-49d6-8b97-902e00c0efd5" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.87:10250/livez\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.034516 4973 patch_prober.go:28] interesting pod/console-operator-58897d9998-4bk2w container/console-operator 
namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.034884 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" podUID="5f512921-f02c-464b-af06-d65fb95f0071" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.034744 4973 patch_prober.go:28] interesting pod/console-operator-58897d9998-4bk2w container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.035005 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" podUID="5f512921-f02c-464b-af06-d65fb95f0071" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.083102 4973 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-4ddcl container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.083110 4973 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-4ddcl container/catalog-operator 
namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.083415 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" podUID="6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.083375 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" podUID="6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.148384 4973 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-kz2ff container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.148464 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff" podUID="5e47b273-3bc0-4f32-8bdb-aa283db4d8a1" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers)" Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.148392 4973 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-kz2ff container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.38:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.148569 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-kz2ff" podUID="5e47b273-3bc0-4f32-8bdb-aa283db4d8a1" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.38:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.236551 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" podUID="d74cf88e-0824-45f2-92ff-3798ad77f943" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.236606 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" podUID="d74cf88e-0824-45f2-92ff-3798ad77f943" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.237709 4973 patch_prober.go:28] interesting pod/monitoring-plugin-5f8fb459cf-n7bzv container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.88:9443/health\": net/http: request canceled while 
waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.237791 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-5f8fb459cf-n7bzv" podUID="a8d216d7-808c-4312-a314-729e369ee963" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.88:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.314998 4973 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.315067 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.366532 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" podUID="3c89c7dd-500b-4bd5-a30e-273c2a485728" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.376928 4973 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-z2rch container/packageserver namespace/openshift-operator-lifecycle-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.16:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.376992 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" podUID="925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.16:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.376995 4973 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-z2rch container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.16:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.377071 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" podUID="925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.16:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.784390 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="27fed14c-9051-4d46-80d5-badf224805a9" containerName="galera" probeResult="failure" output="command timed out" Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.785390 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="27fed14c-9051-4d46-80d5-badf224805a9" containerName="galera" 
probeResult="failure" output="command timed out" Mar 20 14:42:07 crc kubenswrapper[4973]: I0320 14:42:07.933543 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-cgmjd" podUID="75fc4720-ae9c-4ae5-8e4c-7c9a800f5478" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:08 crc kubenswrapper[4973]: I0320 14:42:08.184564 4973 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-nhgtl container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.19:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:08 crc kubenswrapper[4973]: I0320 14:42:08.184624 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" podUID="0e53f263-96c0-4390-b28e-ca37e867101b" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.19:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:08 crc kubenswrapper[4973]: I0320 14:42:08.209805 4973 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-nhgtl container/operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.19:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:08 crc kubenswrapper[4973]: I0320 14:42:08.209930 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" podUID="0e53f263-96c0-4390-b28e-ca37e867101b" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.19:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:08 crc kubenswrapper[4973]: I0320 
14:42:08.542550 4973 patch_prober.go:28] interesting pod/perses-operator-9b89954cc-wfdgp container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.26:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:08 crc kubenswrapper[4973]: I0320 14:42:08.542616 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-9b89954cc-wfdgp" podUID="a19fcda0-339c-4f0e-9f54-5a2f76c934c5" containerName="perses-operator" probeResult="failure" output="Get \"http://10.217.0.26:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:08 crc kubenswrapper[4973]: I0320 14:42:08.784889 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="4bdbb8eb-c36d-43f0-a705-3b3e59128b7f" containerName="galera" probeResult="failure" output="command timed out" Mar 20 14:42:08 crc kubenswrapper[4973]: I0320 14:42:08.784971 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="4bdbb8eb-c36d-43f0-a705-3b3e59128b7f" containerName="galera" probeResult="failure" output="command timed out" Mar 20 14:42:09 crc kubenswrapper[4973]: I0320 14:42:09.852807 4973 patch_prober.go:28] interesting pod/oauth-openshift-567cd76c58-zvtsd container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.71:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:09 crc kubenswrapper[4973]: I0320 14:42:09.853144 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" podUID="fcfe2a5b-dd17-425e-8024-655606a1c470" containerName="oauth-openshift" probeResult="failure" output="Get 
\"https://10.217.0.71:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:09 crc kubenswrapper[4973]: I0320 14:42:09.852923 4973 patch_prober.go:28] interesting pod/oauth-openshift-567cd76c58-zvtsd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.71:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:09 crc kubenswrapper[4973]: I0320 14:42:09.853276 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" podUID="fcfe2a5b-dd17-425e-8024-655606a1c470" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.71:6443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:11 crc kubenswrapper[4973]: I0320 14:42:11.069038 4973 patch_prober.go:28] interesting pod/logging-loki-gateway-fbc7bc644-sw7l9 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:11 crc kubenswrapper[4973]: I0320 14:42:11.070092 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" podUID="d4e8002e-56ed-40a4-a768-9fd6a44d891c" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:11 crc kubenswrapper[4973]: I0320 14:42:11.077819 4973 patch_prober.go:28] interesting pod/logging-loki-gateway-fbc7bc644-ltv8r container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": 
net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:11 crc kubenswrapper[4973]: I0320 14:42:11.077889 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" podUID="9ff158ae-7281-4d5c-95cf-ff14e136c414" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:11 crc kubenswrapper[4973]: I0320 14:42:11.416014 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-h5hz4" podUID="b450ebc6-3181-4e0d-b546-b10ac89e0481" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:11 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:42:11 crc kubenswrapper[4973]: > Mar 20 14:42:11 crc kubenswrapper[4973]: I0320 14:42:11.416176 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-h5hz4" podUID="b450ebc6-3181-4e0d-b546-b10ac89e0481" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:11 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:42:11 crc kubenswrapper[4973]: > Mar 20 14:42:11 crc kubenswrapper[4973]: I0320 14:42:11.788890 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="1cde80e1-f72d-4080-a86b-5968a8904333" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 20 14:42:12 crc kubenswrapper[4973]: I0320 14:42:12.198964 4973 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:12 crc 
kubenswrapper[4973]: I0320 14:42:12.200426 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:12 crc kubenswrapper[4973]: I0320 14:42:12.216216 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-fpwkr" podUID="57731a76-c496-43ff-afea-a5685864a2f3" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:12 crc kubenswrapper[4973]: timeout: health rpc did not complete within 1s Mar 20 14:42:12 crc kubenswrapper[4973]: > Mar 20 14:42:12 crc kubenswrapper[4973]: I0320 14:42:12.216218 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-fpwkr" podUID="57731a76-c496-43ff-afea-a5685864a2f3" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:12 crc kubenswrapper[4973]: timeout: health rpc did not complete within 1s Mar 20 14:42:12 crc kubenswrapper[4973]: > Mar 20 14:42:12 crc kubenswrapper[4973]: I0320 14:42:12.406552 4973 patch_prober.go:28] interesting pod/loki-operator-controller-manager-6996757d8d-46qmw container/manager namespace/openshift-operators-redhat: Readiness probe status=failure output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:12 crc kubenswrapper[4973]: I0320 14:42:12.406615 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators-redhat/loki-operator-controller-manager-6996757d8d-46qmw" podUID="3bf2a551-4944-4096-99f4-03effa26dde8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.49:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while 
awaiting headers)" Mar 20 14:42:12 crc kubenswrapper[4973]: I0320 14:42:12.533580 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-4qwzv" podUID="6f5a8b02-59f4-427d-b91d-e7cacaa1ba23" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.40:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:12 crc kubenswrapper[4973]: I0320 14:42:12.533590 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="cert-manager/cert-manager-webhook-687f57d79b-4qwzv" podUID="6f5a8b02-59f4-427d-b91d-e7cacaa1ba23" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.40:6080/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:12 crc kubenswrapper[4973]: I0320 14:42:12.860545 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" podUID="835537e8-dced-4516-a7b9-168d9bb6b687" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:13 crc kubenswrapper[4973]: I0320 14:42:13.115051 4973 trace.go:236] Trace[1471522276]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-cell1-server-0" (20-Mar-2026 14:42:10.818) (total time: 2286ms): Mar 20 14:42:13 crc kubenswrapper[4973]: Trace[1471522276]: [2.286848454s] [2.286848454s] END Mar 20 14:42:13 crc kubenswrapper[4973]: I0320 14:42:13.603603 4973 patch_prober.go:28] interesting pod/thanos-querier-55bcd89946-hkjh8 container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.85:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:13 crc kubenswrapper[4973]: I0320 
14:42:13.603664 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" podUID="db81edf4-da9e-422b-b1cc-7842cf2c1183" containerName="kube-rbac-proxy-web" probeResult="failure" output="Get \"https://10.217.0.85:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:13 crc kubenswrapper[4973]: I0320 14:42:13.894551 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-init-6f7459b8bf-nf8rm" podUID="bad13d41-c3be-4f23-b40f-f621e669ef5b" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.103:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:14 crc kubenswrapper[4973]: I0320 14:42:14.402533 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-w7228" podUID="9d106cd3-cadb-4cc7-b237-f05294c67dcd" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.105:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:14 crc kubenswrapper[4973]: I0320 14:42:14.443558 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-pzl6t" podUID="dbb02721-66ce-44b6-bffe-59851197efa8" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.106:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:14 crc kubenswrapper[4973]: I0320 14:42:14.511650 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-rm7q7" podUID="7260cd47-ce83-44db-951d-757908bf5953" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.109:8081/readyz\": context deadline exceeded (Client.Timeout exceeded 
while awaiting headers)" Mar 20 14:42:14 crc kubenswrapper[4973]: I0320 14:42:14.877556 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-jn8rq" podUID="c0abcba1-e57c-4d90-a8cb-61989da15e87" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.113:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:14 crc kubenswrapper[4973]: I0320 14:42:14.959642 4973 patch_prober.go:28] interesting pod/controller-manager-6f85f9899-lkzwb container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:14 crc kubenswrapper[4973]: I0320 14:42:14.960184 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" podUID="e04b3fba-1427-496a-b880-f61867a2c3ac" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:15 crc kubenswrapper[4973]: I0320 14:42:15.000644 4973 patch_prober.go:28] interesting pod/route-controller-manager-b56c97db-xrm2h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:15 crc kubenswrapper[4973]: I0320 14:42:15.001003 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" podUID="451656d5-3bd4-402b-98b9-202b3ac829e1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": 
net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:15 crc kubenswrapper[4973]: I0320 14:42:15.042551 4973 patch_prober.go:28] interesting pod/route-controller-manager-b56c97db-xrm2h container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:15 crc kubenswrapper[4973]: I0320 14:42:15.042647 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" podUID="451656d5-3bd4-402b-98b9-202b3ac829e1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:15 crc kubenswrapper[4973]: I0320 14:42:15.125628 4973 patch_prober.go:28] interesting pod/controller-manager-6f85f9899-lkzwb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:15 crc kubenswrapper[4973]: I0320 14:42:15.125620 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-crjg2" podUID="97aab498-21c1-476f-a64b-a526745fc64a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.115:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:15 crc kubenswrapper[4973]: I0320 14:42:15.125675 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" podUID="e04b3fba-1427-496a-b880-f61867a2c3ac" containerName="controller-manager" probeResult="failure" 
output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:15 crc kubenswrapper[4973]: I0320 14:42:15.174668 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p6vx2" podUID="d5a271f2-b17d-487d-a61b-00bd17841392" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.117:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:15 crc kubenswrapper[4973]: I0320 14:42:15.174968 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/manila-operator-controller-manager-55f864c847-gs46s" podUID="530d31a0-48a0-4d06-9b03-c9c205312bdc" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.114:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:15 crc kubenswrapper[4973]: I0320 14:42:15.212164 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-x4r2t" podUID="39227253-9885-4ba2-a216-c04066dc7c84" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:15 crc kubenswrapper[4973]: I0320 14:42:15.599605 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2zn7v" podUID="991cca2c-022f-4c90-a1ba-287191fc2d49" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:15 crc kubenswrapper[4973]: I0320 14:42:15.652117 4973 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-b5mdp container/loki-distributor namespace/openshift-logging: Readiness probe 
status=failure output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:15 crc kubenswrapper[4973]: I0320 14:42:15.652172 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" podUID="870a7fc7-0aac-45df-857e-dba72c60f80a" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:15 crc kubenswrapper[4973]: I0320 14:42:15.860728 4973 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-zjf9g container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:15 crc kubenswrapper[4973]: I0320 14:42:15.860818 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" podUID="727aa0fc-f2ea-4183-a168-24918669937b" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:15 crc kubenswrapper[4973]: I0320 14:42:15.895688 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xlqnk" podUID="aa34abe0-30d3-4d49-9f20-c15990a91a36" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.124:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:15 crc kubenswrapper[4973]: I0320 14:42:15.899722 4973 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-ssp8t container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get 
\"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": context deadline exceeded" start-of-body= Mar 20 14:42:15 crc kubenswrapper[4973]: I0320 14:42:15.899791 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" podUID="d08579c3-9bb3-4b06-a613-0b81a2d7fb44" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": context deadline exceeded" Mar 20 14:42:16 crc kubenswrapper[4973]: I0320 14:42:16.109171 4973 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-kfft6 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.72:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:16 crc kubenswrapper[4973]: I0320 14:42:16.109243 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" podUID="2843ad35-cfc0-4922-8b96-cebb15694c99" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.72:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:16 crc kubenswrapper[4973]: I0320 14:42:16.109396 4973 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-kfft6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.72:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:16 crc kubenswrapper[4973]: I0320 14:42:16.109439 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" podUID="2843ad35-cfc0-4922-8b96-cebb15694c99" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.72:8080/healthz\": 
context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:16 crc kubenswrapper[4973]: I0320 14:42:16.789156 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="1cde80e1-f72d-4080-a86b-5968a8904333" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 20 14:42:16 crc kubenswrapper[4973]: I0320 14:42:16.846554 4973 patch_prober.go:28] interesting pod/metrics-server-55f4d8dbbb-bmckj container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.87:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:16 crc kubenswrapper[4973]: I0320 14:42:16.846620 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" podUID="57c75085-cf04-49d6-8b97-902e00c0efd5" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.87:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:16 crc kubenswrapper[4973]: I0320 14:42:16.933520 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:16 crc kubenswrapper[4973]: I0320 14:42:16.933585 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:16 crc kubenswrapper[4973]: I0320 
14:42:16.974561 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:16 crc kubenswrapper[4973]: I0320 14:42:16.974637 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.056523 4973 patch_prober.go:28] interesting pod/console-operator-58897d9998-4bk2w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.056579 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" podUID="5f512921-f02c-464b-af06-d65fb95f0071" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.056769 4973 patch_prober.go:28] interesting pod/console-operator-58897d9998-4bk2w container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.056787 4973 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-console-operator/console-operator-58897d9998-4bk2w" podUID="5f512921-f02c-464b-af06-d65fb95f0071" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.097537 4973 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-c4gcj container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.097620 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" podUID="322233fd-b71f-4ef5-931f-58e98326386a" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.234527 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" podUID="d74cf88e-0824-45f2-92ff-3798ad77f943" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.234541 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" podUID="d74cf88e-0824-45f2-92ff-3798ad77f943" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 
14:42:17.238572 4973 patch_prober.go:28] interesting pod/monitoring-plugin-5f8fb459cf-n7bzv container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.88:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.238648 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-5f8fb459cf-n7bzv" podUID="a8d216d7-808c-4312-a314-729e369ee963" containerName="monitoring-plugin" probeResult="failure" output="Get \"https://10.217.0.88:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.241935 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-5f8fb459cf-n7bzv" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.314797 4973 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.314854 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.409498 4973 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" podUID="3c89c7dd-500b-4bd5-a30e-273c2a485728" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.409629 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.409607 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" podUID="3c89c7dd-500b-4bd5-a30e-273c2a485728" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.409794 4973 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-z2rch container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.16:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.409823 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" podUID="925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.16:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.409860 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.410047 4973 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-z2rch container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.410099 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" podUID="925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.16:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.410252 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.415254 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="packageserver" containerStatusID={"Type":"cri-o","ID":"6526dc6a5249aa733a59e5eaf020bcb049f6ee208c7b5a098a4d4625af1109f0"} pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" containerMessage="Container packageserver failed liveness probe, will be restarted" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.417151 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" podUID="925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e" containerName="packageserver" containerID="cri-o://6526dc6a5249aa733a59e5eaf020bcb049f6ee208c7b5a098a4d4625af1109f0" gracePeriod=30 Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 
14:42:17.783552 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-galera-0" podUID="27fed14c-9051-4d46-80d5-badf224805a9" containerName="galera" probeResult="failure" output="command timed out" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.783630 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.784739 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="27fed14c-9051-4d46-80d5-badf224805a9" containerName="galera" probeResult="failure" output="command timed out" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.784855 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.785250 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="galera" containerStatusID={"Type":"cri-o","ID":"896517b7ebc700d1ed501fe4fd56b18472ac17a48bafa5e5f2cc8d7a6847cfc7"} pod="openstack/openstack-galera-0" containerMessage="Container galera failed liveness probe, will be restarted" Mar 20 14:42:17 crc kubenswrapper[4973]: I0320 14:42:17.802060 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5f8fb459cf-n7bzv" Mar 20 14:42:18 crc kubenswrapper[4973]: I0320 14:42:18.018796 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjhq2" podUID="1221336e-652c-45b4-bd66-43e96cf2c643" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:18 crc kubenswrapper[4973]: I0320 14:42:18.018799 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-cgmjd" 
podUID="75fc4720-ae9c-4ae5-8e4c-7c9a800f5478" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:18 crc kubenswrapper[4973]: I0320 14:42:18.019078 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-cgmjd" podUID="75fc4720-ae9c-4ae5-8e4c-7c9a800f5478" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:18 crc kubenswrapper[4973]: I0320 14:42:18.019261 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-cgmjd" podUID="75fc4720-ae9c-4ae5-8e4c-7c9a800f5478" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:18 crc kubenswrapper[4973]: I0320 14:42:18.019382 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/frr-k8s-cgmjd" Mar 20 14:42:18 crc kubenswrapper[4973]: I0320 14:42:18.022135 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="frr" containerStatusID={"Type":"cri-o","ID":"3da8094c7961b1d5508014f85a570e7a748f3540826fec30ffabc2a2bad449ed"} pod="metallb-system/frr-k8s-cgmjd" containerMessage="Container frr failed liveness probe, will be restarted" Mar 20 14:42:18 crc kubenswrapper[4973]: I0320 14:42:18.022255 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/frr-k8s-cgmjd" podUID="75fc4720-ae9c-4ae5-8e4c-7c9a800f5478" containerName="frr" containerID="cri-o://3da8094c7961b1d5508014f85a570e7a748f3540826fec30ffabc2a2bad449ed" gracePeriod=2 Mar 20 14:42:18 crc kubenswrapper[4973]: I0320 14:42:18.098522 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" Mar 20 14:42:18 crc kubenswrapper[4973]: I0320 14:42:18.216244 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-kwvd2" podUID="f580709c-eab2-41f5-96b4-2e32cf02cdcb" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:18 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:42:18 crc kubenswrapper[4973]: > Mar 20 14:42:18 crc kubenswrapper[4973]: I0320 14:42:18.217890 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-kwvd2" podUID="f580709c-eab2-41f5-96b4-2e32cf02cdcb" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:18 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:42:18 crc kubenswrapper[4973]: > Mar 20 14:42:18 crc kubenswrapper[4973]: I0320 14:42:18.727173 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgmjd" event={"ID":"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478","Type":"ContainerDied","Data":"3da8094c7961b1d5508014f85a570e7a748f3540826fec30ffabc2a2bad449ed"} Mar 20 14:42:18 crc kubenswrapper[4973]: I0320 14:42:18.727755 4973 generic.go:334] "Generic (PLEG): container finished" podID="75fc4720-ae9c-4ae5-8e4c-7c9a800f5478" containerID="3da8094c7961b1d5508014f85a570e7a748f3540826fec30ffabc2a2bad449ed" exitCode=143 Mar 20 14:42:19 crc kubenswrapper[4973]: I0320 14:42:19.533875 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566962-2sqjf" podStartSLOduration=18.555545399 podStartE2EDuration="19.529456502s" podCreationTimestamp="2026-03-20 14:42:00 +0000 UTC" firstStartedPulling="2026-03-20 14:42:03.012211141 +0000 UTC m=+4843.755880885" lastFinishedPulling="2026-03-20 14:42:03.986122244 +0000 UTC m=+4844.729791988" 
observedRunningTime="2026-03-20 14:42:05.56301677 +0000 UTC m=+4846.306686514" watchObservedRunningTime="2026-03-20 14:42:19.529456502 +0000 UTC m=+4860.273126246" Mar 20 14:42:19 crc kubenswrapper[4973]: I0320 14:42:19.740083 4973 generic.go:334] "Generic (PLEG): container finished" podID="925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e" containerID="6526dc6a5249aa733a59e5eaf020bcb049f6ee208c7b5a098a4d4625af1109f0" exitCode=0 Mar 20 14:42:19 crc kubenswrapper[4973]: I0320 14:42:19.740199 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" event={"ID":"925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e","Type":"ContainerDied","Data":"6526dc6a5249aa733a59e5eaf020bcb049f6ee208c7b5a098a4d4625af1109f0"} Mar 20 14:42:19 crc kubenswrapper[4973]: I0320 14:42:19.745245 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-cgmjd" event={"ID":"75fc4720-ae9c-4ae5-8e4c-7c9a800f5478","Type":"ContainerStarted","Data":"3b82410cac570b5f778a421a6b157466c4f4e46d50fffffe19e82e8a8797dcb9"} Mar 20 14:42:19 crc kubenswrapper[4973]: I0320 14:42:19.751216 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="27fed14c-9051-4d46-80d5-badf224805a9" containerName="galera" containerID="cri-o://896517b7ebc700d1ed501fe4fd56b18472ac17a48bafa5e5f2cc8d7a6847cfc7" gracePeriod=29 Mar 20 14:42:20 crc kubenswrapper[4973]: I0320 14:42:20.759673 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" event={"ID":"925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e","Type":"ContainerStarted","Data":"f27d57ff12b91d06c7a4dc3b02ed12bc2519eb0389f9c74f566d772b32faab46"} Mar 20 14:42:20 crc kubenswrapper[4973]: I0320 14:42:20.760358 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 14:42:20 crc kubenswrapper[4973]: 
I0320 14:42:20.760810 4973 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-z2rch container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:5443/healthz\": dial tcp 10.217.0.16:5443: connect: connection refused" start-of-body= Mar 20 14:42:20 crc kubenswrapper[4973]: I0320 14:42:20.760853 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" podUID="925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.16:5443/healthz\": dial tcp 10.217.0.16:5443: connect: connection refused" Mar 20 14:42:21 crc kubenswrapper[4973]: I0320 14:42:21.069278 4973 patch_prober.go:28] interesting pod/logging-loki-gateway-fbc7bc644-sw7l9 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:21 crc kubenswrapper[4973]: I0320 14:42:21.069693 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" podUID="d4e8002e-56ed-40a4-a768-9fd6a44d891c" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:21 crc kubenswrapper[4973]: I0320 14:42:21.070053 4973 patch_prober.go:28] interesting pod/logging-loki-gateway-fbc7bc644-sw7l9 container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:21 crc kubenswrapper[4973]: I0320 14:42:21.070087 4973 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" podUID="d4e8002e-56ed-40a4-a768-9fd6a44d891c" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:21 crc kubenswrapper[4973]: I0320 14:42:21.078442 4973 patch_prober.go:28] interesting pod/logging-loki-gateway-fbc7bc644-ltv8r container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:21 crc kubenswrapper[4973]: I0320 14:42:21.078463 4973 patch_prober.go:28] interesting pod/logging-loki-gateway-fbc7bc644-ltv8r container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": context deadline exceeded" start-of-body= Mar 20 14:42:21 crc kubenswrapper[4973]: I0320 14:42:21.078509 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" podUID="9ff158ae-7281-4d5c-95cf-ff14e136c414" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:21 crc kubenswrapper[4973]: I0320 14:42:21.078561 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" podUID="9ff158ae-7281-4d5c-95cf-ff14e136c414" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": context deadline exceeded" Mar 20 14:42:21 crc kubenswrapper[4973]: I0320 14:42:21.215811 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-h5hz4" podUID="b450ebc6-3181-4e0d-b546-b10ac89e0481" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:21 crc 
kubenswrapper[4973]: timeout: health rpc did not complete within 1s Mar 20 14:42:21 crc kubenswrapper[4973]: > Mar 20 14:42:21 crc kubenswrapper[4973]: I0320 14:42:21.216132 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-h5hz4" podUID="b450ebc6-3181-4e0d-b546-b10ac89e0481" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:21 crc kubenswrapper[4973]: timeout: health rpc did not complete within 1s Mar 20 14:42:21 crc kubenswrapper[4973]: > Mar 20 14:42:21 crc kubenswrapper[4973]: I0320 14:42:21.216173 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-fpwkr" podUID="57731a76-c496-43ff-afea-a5685864a2f3" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:21 crc kubenswrapper[4973]: timeout: health rpc did not complete within 1s Mar 20 14:42:21 crc kubenswrapper[4973]: > Mar 20 14:42:21 crc kubenswrapper[4973]: I0320 14:42:21.222356 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-fpwkr" podUID="57731a76-c496-43ff-afea-a5685864a2f3" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:21 crc kubenswrapper[4973]: timeout: health rpc did not complete within 1s Mar 20 14:42:21 crc kubenswrapper[4973]: > Mar 20 14:42:21 crc kubenswrapper[4973]: I0320 14:42:21.770163 4973 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-z2rch container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:5443/healthz\": dial tcp 10.217.0.16:5443: connect: connection refused" start-of-body= Mar 20 14:42:21 crc kubenswrapper[4973]: I0320 14:42:21.770223 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" podUID="925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e" containerName="packageserver" probeResult="failure" 
output="Get \"https://10.217.0.16:5443/healthz\": dial tcp 10.217.0.16:5443: connect: connection refused" Mar 20 14:42:21 crc kubenswrapper[4973]: I0320 14:42:21.902560 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-cgmjd" Mar 20 14:42:22 crc kubenswrapper[4973]: I0320 14:42:22.198741 4973 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:22 crc kubenswrapper[4973]: I0320 14:42:22.198816 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:22 crc kubenswrapper[4973]: I0320 14:42:22.899599 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" podUID="835537e8-dced-4516-a7b9-168d9bb6b687" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:22 crc kubenswrapper[4973]: I0320 14:42:22.899852 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-controller-manager-78865ff6b4-wlv2f" podUID="835537e8-dced-4516-a7b9-168d9bb6b687" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.126:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:22 crc kubenswrapper[4973]: I0320 14:42:22.940538 4973 prober.go:107] "Probe failed" 
probeType="Startup" pod="metallb-system/frr-k8s-cgmjd" podUID="75fc4720-ae9c-4ae5-8e4c-7c9a800f5478" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:23 crc kubenswrapper[4973]: I0320 14:42:23.109889 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="95ca3c33-8a98-4ce8-8cb7-06c855d090ac" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 14:42:23 crc kubenswrapper[4973]: I0320 14:42:23.133078 4973 patch_prober.go:28] interesting pod/console-7b5fcf6d5d-lqlbt container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:23 crc kubenswrapper[4973]: I0320 14:42:23.133163 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-7b5fcf6d5d-lqlbt" podUID="2e60f0d1-0367-49ca-9c60-e1b6ade5b5f7" containerName="console" probeResult="failure" output="Get \"https://10.217.0.141:8443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:23 crc kubenswrapper[4973]: I0320 14:42:23.606789 4973 patch_prober.go:28] interesting pod/thanos-querier-55bcd89946-hkjh8 container/kube-rbac-proxy-web namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.85:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:23 crc kubenswrapper[4973]: I0320 14:42:23.607420 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/thanos-querier-55bcd89946-hkjh8" podUID="db81edf4-da9e-422b-b1cc-7842cf2c1183" containerName="kube-rbac-proxy-web" 
probeResult="failure" output="Get \"https://10.217.0.85:9091/-/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.030550 4973 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-4v8kv container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.030629 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4v8kv" podUID="b3ad60e7-f0fc-4ae0-bf47-a92997c32a08" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.030674 4973 patch_prober.go:28] interesting pod/prometheus-operator-admission-webhook-f54c54754-4v8kv container/prometheus-operator-admission-webhook namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.030754 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-operator-admission-webhook-f54c54754-4v8kv" podUID="b3ad60e7-f0fc-4ae0-bf47-a92997c32a08" containerName="prometheus-operator-admission-webhook" probeResult="failure" output="Get \"https://10.217.0.78:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 
14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.542558 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2sflb" podUID="96f5f9d5-bb8c-497c-bfbb-8fd46342ce69" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.542627 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-2sflb" podUID="96f5f9d5-bb8c-497c-bfbb-8fd46342ce69" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.107:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.684527 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zmqsw" podUID="84e8fd4a-d562-4e92-adfc-479867cf9d3a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.684766 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-zmqsw" podUID="84e8fd4a-d562-4e92-adfc-479867cf9d3a" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.110:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.700101 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="95ca3c33-8a98-4ce8-8cb7-06c855d090ac" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 14:42:24 crc 
kubenswrapper[4973]: I0320 14:42:24.776327 4973 patch_prober.go:28] interesting pod/nmstate-webhook-5f558f5558-9c8nv container/nmstate-webhook namespace/openshift-nmstate: Readiness probe status=failure output="Get \"https://10.217.0.86:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.776407 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-nmstate/nmstate-webhook-5f558f5558-9c8nv" podUID="2437569f-a833-4666-a051-db0d4818cc5f" containerName="nmstate-webhook" probeResult="failure" output="Get \"https://10.217.0.86:9443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.783636 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/prometheus-k8s-0" podUID="1623a587-c5f5-49eb-bc1d-960e4a0faf81" containerName="prometheus" probeResult="failure" output="command timed out" Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.784233 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/prometheus-k8s-0" podUID="1623a587-c5f5-49eb-bc1d-960e4a0faf81" containerName="prometheus" probeResult="failure" output="command timed out" Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.883627 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qxlrn" podUID="13c558f8-2e66-49c3-b184-7fdbbf4ff6b1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.883993 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-qxlrn" 
podUID="13c558f8-2e66-49c3-b184-7fdbbf4ff6b1" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.112:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.884013 4973 patch_prober.go:28] interesting pod/controller-manager-6f85f9899-lkzwb container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.884040 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" podUID="e04b3fba-1427-496a-b880-f61867a2c3ac" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.884054 4973 patch_prober.go:28] interesting pod/controller-manager-6f85f9899-lkzwb container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.884075 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-6f85f9899-lkzwb" podUID="e04b3fba-1427-496a-b880-f61867a2c3ac" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.67:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.884268 
4973 patch_prober.go:28] interesting pod/route-controller-manager-b56c97db-xrm2h container/route-controller-manager namespace/openshift-route-controller-manager: Liveness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.884332 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" podUID="451656d5-3bd4-402b-98b9-202b3ac829e1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.884392 4973 patch_prober.go:28] interesting pod/route-controller-manager-b56c97db-xrm2h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:24 crc kubenswrapper[4973]: I0320 14:42:24.884440 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-b56c97db-xrm2h" podUID="451656d5-3bd4-402b-98b9-202b3ac829e1" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.68:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:25 crc kubenswrapper[4973]: I0320 14:42:25.202639 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-x4r2t" podUID="39227253-9885-4ba2-a216-c04066dc7c84" containerName="manager" 
probeResult="failure" output="Get \"http://10.217.0.118:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:25 crc kubenswrapper[4973]: I0320 14:42:25.202708 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-x4r2t" podUID="39227253-9885-4ba2-a216-c04066dc7c84" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.118:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:25 crc kubenswrapper[4973]: I0320 14:42:25.532542 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9dv29" podUID="106eb66b-ca71-49b1-a80e-699f34ac9df9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:25 crc kubenswrapper[4973]: I0320 14:42:25.573606 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/placement-operator-controller-manager-5784578c99-9dv29" podUID="106eb66b-ca71-49b1-a80e-699f34ac9df9" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.121:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:25 crc kubenswrapper[4973]: I0320 14:42:25.655561 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/ovn-operator-controller-manager-884679f54-2zn7v" podUID="991cca2c-022f-4c90-a1ba-287191fc2d49" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:25 crc kubenswrapper[4973]: I0320 14:42:25.655602 4973 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack-operators/ovn-operator-controller-manager-884679f54-2zn7v" podUID="991cca2c-022f-4c90-a1ba-287191fc2d49" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.120:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:25 crc kubenswrapper[4973]: I0320 14:42:25.738628 4973 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-2zrvx container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:25 crc kubenswrapper[4973]: I0320 14:42:25.738690 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx" podUID="2dce229d-701a-4a70-9c44-5f99d4c6fe79" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:25 crc kubenswrapper[4973]: I0320 14:42:25.738628 4973 patch_prober.go:28] interesting pod/logging-loki-distributor-9c6b6d984-b5mdp container/loki-distributor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.52:3101/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:25 crc kubenswrapper[4973]: I0320 14:42:25.738653 4973 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-2zrvx container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:25 crc kubenswrapper[4973]: I0320 14:42:25.738793 4973 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-logging/logging-loki-distributor-9c6b6d984-b5mdp" podUID="870a7fc7-0aac-45df-857e-dba72c60f80a" containerName="loki-distributor" probeResult="failure" output="Get \"https://10.217.0.52:3101/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:25 crc kubenswrapper[4973]: I0320 14:42:25.739358 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2zrvx" podUID="2dce229d-701a-4a70-9c44-5f99d4c6fe79" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:25 crc kubenswrapper[4973]: I0320 14:42:25.895427 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-nh7dd" podUID="ecf17bc8-3c8a-4791-a205-2bdc718ec15f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:25 crc kubenswrapper[4973]: I0320 14:42:25.895510 4973 patch_prober.go:28] interesting pod/logging-loki-querier-6dcbdf8bb8-zjf9g container/loki-querier namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:25 crc kubenswrapper[4973]: I0320 14:42:25.895560 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/telemetry-operator-controller-manager-fbb6f4f4f-nh7dd" podUID="ecf17bc8-3c8a-4791-a205-2bdc718ec15f" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.123:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:25 crc kubenswrapper[4973]: I0320 
14:42:25.895580 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-querier-6dcbdf8bb8-zjf9g" podUID="727aa0fc-f2ea-4183-a168-24918669937b" containerName="loki-querier" probeResult="failure" output="Get \"https://10.217.0.53:3101/ready\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:25 crc kubenswrapper[4973]: I0320 14:42:25.900118 4973 patch_prober.go:28] interesting pod/logging-loki-query-frontend-ff66c4dc9-ssp8t container/loki-query-frontend namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:25 crc kubenswrapper[4973]: I0320 14:42:25.900173 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-query-frontend-ff66c4dc9-ssp8t" podUID="d08579c3-9bb3-4b06-a613-0b81a2d7fb44" containerName="loki-query-frontend" probeResult="failure" output="Get \"https://10.217.0.54:3101/loki/api/v1/status/buildinfo\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.032533 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kzrpr" podUID="5a41f6b9-9f79-454a-af8a-c0ad746f1d42" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.125:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.032827 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-kzrpr" podUID="5a41f6b9-9f79-454a-af8a-c0ad746f1d42" containerName="manager" probeResult="failure" 
output="Get \"http://10.217.0.125:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.069111 4973 patch_prober.go:28] interesting pod/logging-loki-gateway-fbc7bc644-sw7l9 container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.069190 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" podUID="d4e8002e-56ed-40a4-a768-9fd6a44d891c" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.55:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.069413 4973 patch_prober.go:28] interesting pod/logging-loki-gateway-fbc7bc644-sw7l9 container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.069568 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-fbc7bc644-sw7l9" podUID="d4e8002e-56ed-40a4-a768-9fd6a44d891c" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.55:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.076760 4973 patch_prober.go:28] interesting pod/logging-loki-gateway-fbc7bc644-ltv8r container/opa namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 
14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.076775 4973 patch_prober.go:28] interesting pod/logging-loki-gateway-fbc7bc644-ltv8r container/gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.076885 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" podUID="9ff158ae-7281-4d5c-95cf-ff14e136c414" containerName="gateway" probeResult="failure" output="Get \"https://10.217.0.56:8081/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.076812 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-gateway-fbc7bc644-ltv8r" podUID="9ff158ae-7281-4d5c-95cf-ff14e136c414" containerName="opa" probeResult="failure" output="Get \"https://10.217.0.56:8083/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.147648 4973 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-kfft6 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.72:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.147717 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" podUID="2843ad35-cfc0-4922-8b96-cebb15694c99" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.72:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.147887 
4973 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-kfft6 container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.72:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.147912 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-kfft6" podUID="2843ad35-cfc0-4922-8b96-cebb15694c99" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.72:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.376379 4973 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-z2rch container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:5443/healthz\": dial tcp 10.217.0.16:5443: connect: connection refused" start-of-body= Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.376448 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" podUID="925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.16:5443/healthz\": dial tcp 10.217.0.16:5443: connect: connection refused" Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.376394 4973 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-z2rch container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.16:5443/healthz\": dial tcp 10.217.0.16:5443: connect: connection refused" start-of-body= Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.376554 4973 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" podUID="925857fc-d2ca-4a3e-ba15-1ed6f07f6c4e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.16:5443/healthz\": dial tcp 10.217.0.16:5443: connect: connection refused" Mar 20 14:42:26 crc kubenswrapper[4973]: E0320 14:42:26.454424 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="896517b7ebc700d1ed501fe4fd56b18472ac17a48bafa5e5f2cc8d7a6847cfc7" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 20 14:42:26 crc kubenswrapper[4973]: E0320 14:42:26.455891 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="896517b7ebc700d1ed501fe4fd56b18472ac17a48bafa5e5f2cc8d7a6847cfc7" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 20 14:42:26 crc kubenswrapper[4973]: E0320 14:42:26.457336 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="896517b7ebc700d1ed501fe4fd56b18472ac17a48bafa5e5f2cc8d7a6847cfc7" cmd=["/bin/bash","/var/lib/operator-scripts/mysql_probe.sh","readiness"] Mar 20 14:42:26 crc kubenswrapper[4973]: E0320 14:42:26.457391 4973 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/openstack-galera-0" podUID="27fed14c-9051-4d46-80d5-badf224805a9" containerName="galera" Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.788846 4973 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openstack/ceilometer-0" podUID="1cde80e1-f72d-4080-a86b-5968a8904333" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.796410 4973 patch_prober.go:28] interesting pod/logging-loki-ingester-0 container/loki-ingester namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.796468 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-ingester-0" podUID="96545de2-ed06-4e4e-9102-37e56cbd6cdb" containerName="loki-ingester" probeResult="failure" output="Get \"https://10.217.0.57:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.845279 4973 patch_prober.go:28] interesting pod/metrics-server-55f4d8dbbb-bmckj container/metrics-server namespace/openshift-monitoring: Liveness probe status=failure output="Get \"https://10.217.0.87:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.845279 4973 patch_prober.go:28] interesting pod/metrics-server-55f4d8dbbb-bmckj container/metrics-server namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.87:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.846060 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" podUID="57c75085-cf04-49d6-8b97-902e00c0efd5" containerName="metrics-server" probeResult="failure" output="Get 
\"https://10.217.0.87:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.845966 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" podUID="57c75085-cf04-49d6-8b97-902e00c0efd5" containerName="metrics-server" probeResult="failure" output="Get \"https://10.217.0.87:10250/livez\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.846204 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.850542 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="metrics-server" containerStatusID={"Type":"cri-o","ID":"da6003725dac38353c12d436c1cdfb7e598b608c8da0d7de51706cb392e36cf6"} pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" containerMessage="Container metrics-server failed liveness probe, will be restarted" Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.851668 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" podUID="57c75085-cf04-49d6-8b97-902e00c0efd5" containerName="metrics-server" containerID="cri-o://da6003725dac38353c12d436c1cdfb7e598b608c8da0d7de51706cb392e36cf6" gracePeriod=170 Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.944829 4973 patch_prober.go:28] interesting pod/logging-loki-compactor-0 container/loki-compactor namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.944894 4973 prober.go:107] 
"Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-compactor-0" podUID="fef9a921-6e35-4927-87b4-2741b40a3ab8" containerName="loki-compactor" probeResult="failure" output="Get \"https://10.217.0.58:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.956518 4973 patch_prober.go:28] interesting pod/logging-loki-index-gateway-0 container/loki-index-gateway namespace/openshift-logging: Readiness probe status=failure output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:26 crc kubenswrapper[4973]: I0320 14:42:26.956695 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-logging/logging-loki-index-gateway-0" podUID="28abf398-d023-4625-b8f5-42db7c452df8" containerName="loki-index-gateway" probeResult="failure" output="Get \"https://10.217.0.60:3101/ready\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.017589 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/metallb-operator-controller-manager-7f9db6bfb5-b8w47" podUID="487335bd-36f4-42e2-87e1-5acef7226919" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.95:8080/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.018131 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Liveness probe status=failure output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.018161 4973 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.018190 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.018276 4973 patch_prober.go:28] interesting pod/console-operator-58897d9998-4bk2w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.018301 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Readiness probe status=failure output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.018331 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" podUID="5f512921-f02c-464b-af06-d65fb95f0071" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.018365 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting 
headers)" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.018481 4973 patch_prober.go:28] interesting pod/console-operator-58897d9998-4bk2w container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.018527 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" podUID="5f512921-f02c-464b-af06-d65fb95f0071" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.019504 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="router" containerStatusID={"Type":"cri-o","ID":"b824383e01c8df94aaa5809814495fba866ff51ce71903d2604814a774acd055"} pod="openshift-ingress/router-default-5444994796-wgtgl" containerMessage="Container router failed liveness probe, will be restarted" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.019549 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" containerID="cri-o://b824383e01c8df94aaa5809814495fba866ff51ce71903d2604814a774acd055" gracePeriod=10 Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.019653 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.019679 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" Mar 20 14:42:27 crc 
kubenswrapper[4973]: I0320 14:42:27.021094 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console-operator" containerStatusID={"Type":"cri-o","ID":"ecebfd6bfdf0e1f243a5a5712948ef8dac51bc241b1c6d43c082d02fe786e5db"} pod="openshift-console-operator/console-operator-58897d9998-4bk2w" containerMessage="Container console-operator failed liveness probe, will be restarted" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.021135 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" podUID="5f512921-f02c-464b-af06-d65fb95f0071" containerName="console-operator" containerID="cri-o://ecebfd6bfdf0e1f243a5a5712948ef8dac51bc241b1c6d43c082d02fe786e5db" gracePeriod=30 Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.040564 4973 patch_prober.go:28] interesting pod/authentication-operator-69f744f599-c4gcj container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.040639 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-69f744f599-c4gcj" podUID="322233fd-b71f-4ef5-931f-58e98326386a" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.217.0.14:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.082588 4973 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-4ddcl container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": context deadline exceeded" 
start-of-body= Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.082616 4973 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-4ddcl container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.082667 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" podUID="6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": context deadline exceeded" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.082716 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-4ddcl" podUID="6de410a6-e2d0-4c6d-8a38-d8d5c49a30e5" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.236614 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" podUID="d74cf88e-0824-45f2-92ff-3798ad77f943" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.236720 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.236906 4973 prober.go:107] "Probe failed" probeType="Readiness" 
pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" podUID="d74cf88e-0824-45f2-92ff-3798ad77f943" containerName="webhook-server" probeResult="failure" output="Get \"http://10.217.0.96:7472/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.237020 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.238237 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="webhook-server" containerStatusID={"Type":"cri-o","ID":"06762ebb55337251c2176c98fc908dfe85ebf019d46b4dc4000dbf43275e0070"} pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" containerMessage="Container webhook-server failed liveness probe, will be restarted" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.238286 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" podUID="d74cf88e-0824-45f2-92ff-3798ad77f943" containerName="webhook-server" containerID="cri-o://06762ebb55337251c2176c98fc908dfe85ebf019d46b4dc4000dbf43275e0070" gracePeriod=2 Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.241837 4973 patch_prober.go:28] interesting pod/monitoring-plugin-5f8fb459cf-n7bzv container/monitoring-plugin namespace/openshift-monitoring: Readiness probe status=failure output="Get \"https://10.217.0.88:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.241870 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-monitoring/monitoring-plugin-5f8fb459cf-n7bzv" podUID="a8d216d7-808c-4312-a314-729e369ee963" containerName="monitoring-plugin" probeResult="failure" output="Get 
\"https://10.217.0.88:9443/health\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.326510 4973 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-bw9k5 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.326542 4973 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.326511 4973 patch_prober.go:28] interesting pod/package-server-manager-789f6589d5-bw9k5 container/package-server-manager namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.326566 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bw9k5" podUID="05c127a2-f6b5-4d71-8646-e29396ea7971" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.326588 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" 
containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.326611 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-bw9k5" podUID="05c127a2-f6b5-4d71-8646-e29396ea7971" containerName="package-server-manager" probeResult="failure" output="Get \"http://10.217.0.37:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.326634 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.331804 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-scheduler" containerStatusID={"Type":"cri-o","ID":"2599b537b19ffc88d8144a6613b7812f3ab5e582d3ae0706db4919c06465c1e1"} pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" containerMessage="Container kube-scheduler failed liveness probe, will be restarted" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.331926 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" containerID="cri-o://2599b537b19ffc88d8144a6613b7812f3ab5e582d3ae0706db4919c06465c1e1" gracePeriod=30 Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.367551 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fjztv" podUID="3c89c7dd-500b-4bd5-a30e-273c2a485728" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.119:8081/readyz\": context deadline 
exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.420051 4973 patch_prober.go:28] interesting pod/console-operator-58897d9998-4bk2w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": EOF" start-of-body= Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.420160 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" podUID="5f512921-f02c-464b-af06-d65fb95f0071" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": EOF" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.451372 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="cert-manager/cert-manager-webhook-687f57d79b-4qwzv" podUID="6f5a8b02-59f4-427d-b91d-e7cacaa1ba23" containerName="cert-manager-webhook" probeResult="failure" output="Get \"http://10.217.0.40:6080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.511534 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/cinder-scheduler-0" podUID="95ca3c33-8a98-4ce8-8cb7-06c855d090ac" containerName="cinder-scheduler" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.511610 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.514822 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cinder-scheduler" containerStatusID={"Type":"cri-o","ID":"c4112903279386936de252a9123dcc64166a267f9121a2ed5268d620fab9ca7d"} pod="openstack/cinder-scheduler-0" containerMessage="Container cinder-scheduler failed liveness probe, will be restarted" Mar 20 14:42:27 crc 
kubenswrapper[4973]: I0320 14:42:27.514990 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="95ca3c33-8a98-4ce8-8cb7-06c855d090ac" containerName="cinder-scheduler" containerID="cri-o://c4112903279386936de252a9123dcc64166a267f9121a2ed5268d620fab9ca7d" gracePeriod=30 Mar 20 14:42:27 crc kubenswrapper[4973]: E0320 14:42:27.712394 4973 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f512921_f02c_464b_af06_d65fb95f0071.slice/crio-ecebfd6bfdf0e1f243a5a5712948ef8dac51bc241b1c6d43c082d02fe786e5db.scope\": RecentStats: unable to find data in memory cache]" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.957613 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/prometheus-metric-storage-0" podUID="b51c1ea9-b42f-47a5-8f74-164a29b2d036" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.182:9090/-/ready\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:27 crc kubenswrapper[4973]: I0320 14:42:27.957846 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/prometheus-metric-storage-0" podUID="b51c1ea9-b42f-47a5-8f74-164a29b2d036" containerName="prometheus" probeResult="failure" output="Get \"https://10.217.0.182:9090/-/healthy\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:28 crc kubenswrapper[4973]: I0320 14:42:28.026467 4973 trace.go:236] Trace[400360610]: "Calculate volume metrics of persistence for pod openstack/rabbitmq-server-1" (20-Mar-2026 14:42:24.478) (total time: 3542ms): Mar 20 14:42:28 crc kubenswrapper[4973]: Trace[400360610]: [3.542694141s] [3.542694141s] END Mar 20 14:42:28 crc kubenswrapper[4973]: I0320 14:42:28.097702 4973 prober.go:107] "Probe failed" probeType="Startup" pod="metallb-system/frr-k8s-cgmjd" 
podUID="75fc4720-ae9c-4ae5-8e4c-7c9a800f5478" containerName="frr" probeResult="failure" output="Get \"http://127.0.0.1:7573/livez\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:28 crc kubenswrapper[4973]: I0320 14:42:28.097748 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjhq2" podUID="1221336e-652c-45b4-bd66-43e96cf2c643" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:28 crc kubenswrapper[4973]: I0320 14:42:28.098953 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-cgmjd" podUID="75fc4720-ae9c-4ae5-8e4c-7c9a800f5478" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:28 crc kubenswrapper[4973]: I0320 14:42:28.099000 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="metallb-system/frr-k8s-cgmjd" podUID="75fc4720-ae9c-4ae5-8e4c-7c9a800f5478" containerName="controller" probeResult="failure" output="Get \"http://127.0.0.1:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:28 crc kubenswrapper[4973]: I0320 14:42:28.099042 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-rjhq2" podUID="1221336e-652c-45b4-bd66-43e96cf2c643" containerName="frr-k8s-webhook-server" probeResult="failure" output="Get \"http://10.217.0.98:7572/metrics\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:28 crc kubenswrapper[4973]: I0320 14:42:28.185910 4973 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-nhgtl container/operator namespace/openshift-operators: Readiness probe 
status=failure output="Get \"http://10.217.0.19:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:28 crc kubenswrapper[4973]: I0320 14:42:28.185942 4973 patch_prober.go:28] interesting pod/observability-operator-6dd7dd855f-nhgtl container/operator namespace/openshift-operators: Liveness probe status=failure output="Get \"http://10.217.0.19:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:28 crc kubenswrapper[4973]: I0320 14:42:28.186014 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" podUID="0e53f263-96c0-4390-b28e-ca37e867101b" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.19:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:28 crc kubenswrapper[4973]: I0320 14:42:28.186022 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operators/observability-operator-6dd7dd855f-nhgtl" podUID="0e53f263-96c0-4390-b28e-ca37e867101b" containerName="operator" probeResult="failure" output="Get \"http://10.217.0.19:8081/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:28 crc kubenswrapper[4973]: I0320 14:42:28.549546 4973 patch_prober.go:28] interesting pod/perses-operator-9b89954cc-wfdgp container/perses-operator namespace/openshift-operators: Readiness probe status=failure output="Get \"http://10.217.0.26:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:28 crc kubenswrapper[4973]: I0320 14:42:28.550076 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operators/perses-operator-9b89954cc-wfdgp" podUID="a19fcda0-339c-4f0e-9f54-5a2f76c934c5" containerName="perses-operator" probeResult="failure" output="Get 
\"http://10.217.0.26:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:28 crc kubenswrapper[4973]: I0320 14:42:28.784321 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/openstack-cell1-galera-0" podUID="4bdbb8eb-c36d-43f0-a705-3b3e59128b7f" containerName="galera" probeResult="failure" output="command timed out" Mar 20 14:42:28 crc kubenswrapper[4973]: I0320 14:42:28.786676 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/openstack-cell1-galera-0" podUID="4bdbb8eb-c36d-43f0-a705-3b3e59128b7f" containerName="galera" probeResult="failure" output="command timed out" Mar 20 14:42:28 crc kubenswrapper[4973]: I0320 14:42:28.867074 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-4bk2w_5f512921-f02c-464b-af06-d65fb95f0071/console-operator/0.log" Mar 20 14:42:28 crc kubenswrapper[4973]: I0320 14:42:28.867253 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" event={"ID":"5f512921-f02c-464b-af06-d65fb95f0071","Type":"ContainerDied","Data":"ecebfd6bfdf0e1f243a5a5712948ef8dac51bc241b1c6d43c082d02fe786e5db"} Mar 20 14:42:28 crc kubenswrapper[4973]: I0320 14:42:28.867923 4973 generic.go:334] "Generic (PLEG): container finished" podID="5f512921-f02c-464b-af06-d65fb95f0071" containerID="ecebfd6bfdf0e1f243a5a5712948ef8dac51bc241b1c6d43c082d02fe786e5db" exitCode=1 Mar 20 14:42:29 crc kubenswrapper[4973]: I0320 14:42:29.618484 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/openstack-operator-index-kwvd2" podUID="f580709c-eab2-41f5-96b4-2e32cf02cdcb" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:29 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:42:29 crc kubenswrapper[4973]: > Mar 20 14:42:29 crc kubenswrapper[4973]: I0320 14:42:29.623296 4973 
prober.go:107] "Probe failed" probeType="Liveness" pod="openstack-operators/openstack-operator-index-kwvd2" podUID="f580709c-eab2-41f5-96b4-2e32cf02cdcb" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:29 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:42:29 crc kubenswrapper[4973]: > Mar 20 14:42:29 crc kubenswrapper[4973]: I0320 14:42:29.881002 4973 generic.go:334] "Generic (PLEG): container finished" podID="d74cf88e-0824-45f2-92ff-3798ad77f943" containerID="06762ebb55337251c2176c98fc908dfe85ebf019d46b4dc4000dbf43275e0070" exitCode=0 Mar 20 14:42:29 crc kubenswrapper[4973]: I0320 14:42:29.881035 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" event={"ID":"d74cf88e-0824-45f2-92ff-3798ad77f943","Type":"ContainerDied","Data":"06762ebb55337251c2176c98fc908dfe85ebf019d46b4dc4000dbf43275e0070"} Mar 20 14:42:29 crc kubenswrapper[4973]: I0320 14:42:29.885489 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-58897d9998-4bk2w_5f512921-f02c-464b-af06-d65fb95f0071/console-operator/0.log" Mar 20 14:42:29 crc kubenswrapper[4973]: I0320 14:42:29.885712 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" event={"ID":"5f512921-f02c-464b-af06-d65fb95f0071","Type":"ContainerStarted","Data":"822a6a09becbc9d3fad11c009ece14d14672d2cca5f3915ce2094a786f715d21"} Mar 20 14:42:29 crc kubenswrapper[4973]: I0320 14:42:29.885965 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" Mar 20 14:42:29 crc kubenswrapper[4973]: I0320 14:42:29.888674 4973 generic.go:334] "Generic (PLEG): container finished" podID="27fed14c-9051-4d46-80d5-badf224805a9" containerID="896517b7ebc700d1ed501fe4fd56b18472ac17a48bafa5e5f2cc8d7a6847cfc7" exitCode=0 Mar 20 
14:42:29 crc kubenswrapper[4973]: I0320 14:42:29.888785 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"27fed14c-9051-4d46-80d5-badf224805a9","Type":"ContainerDied","Data":"896517b7ebc700d1ed501fe4fd56b18472ac17a48bafa5e5f2cc8d7a6847cfc7"} Mar 20 14:42:29 crc kubenswrapper[4973]: I0320 14:42:29.896310 4973 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="2599b537b19ffc88d8144a6613b7812f3ab5e582d3ae0706db4919c06465c1e1" exitCode=0 Mar 20 14:42:29 crc kubenswrapper[4973]: I0320 14:42:29.896379 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"2599b537b19ffc88d8144a6613b7812f3ab5e582d3ae0706db4919c06465c1e1"} Mar 20 14:42:29 crc kubenswrapper[4973]: I0320 14:42:29.937528 4973 patch_prober.go:28] interesting pod/oauth-openshift-567cd76c58-zvtsd container/oauth-openshift namespace/openshift-authentication: Liveness probe status=failure output="Get \"https://10.217.0.71:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:29 crc kubenswrapper[4973]: I0320 14:42:29.937554 4973 patch_prober.go:28] interesting pod/oauth-openshift-567cd76c58-zvtsd container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.71:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 14:42:29 crc kubenswrapper[4973]: I0320 14:42:29.937585 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" podUID="fcfe2a5b-dd17-425e-8024-655606a1c470" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.71:6443/healthz\": net/http: request canceled (Client.Timeout exceeded 
while awaiting headers)" Mar 20 14:42:29 crc kubenswrapper[4973]: I0320 14:42:29.937609 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-567cd76c58-zvtsd" podUID="fcfe2a5b-dd17-425e-8024-655606a1c470" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.71:6443/healthz\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:29 crc kubenswrapper[4973]: I0320 14:42:29.937921 4973 patch_prober.go:28] interesting pod/console-operator-58897d9998-4bk2w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 20 14:42:29 crc kubenswrapper[4973]: I0320 14:42:29.937948 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" podUID="5f512921-f02c-464b-af06-d65fb95f0071" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 20 14:42:30 crc kubenswrapper[4973]: I0320 14:42:30.912312 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" event={"ID":"d74cf88e-0824-45f2-92ff-3798ad77f943","Type":"ContainerStarted","Data":"c2c895b1f1c504799570b243936652d6db250c47d276b74386a2637d0e8c71dd"} Mar 20 14:42:30 crc kubenswrapper[4973]: I0320 14:42:30.913458 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" Mar 20 14:42:30 crc kubenswrapper[4973]: I0320 14:42:30.918268 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"c459f4db3c9671e390b5432b5f77c4a0df711f55f69c3e4611cad16ad1436ea6"} Mar 20 14:42:30 crc kubenswrapper[4973]: I0320 14:42:30.918391 4973 patch_prober.go:28] interesting pod/console-operator-58897d9998-4bk2w container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Mar 20 14:42:30 crc kubenswrapper[4973]: I0320 14:42:30.918427 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" podUID="5f512921-f02c-464b-af06-d65fb95f0071" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Mar 20 14:42:30 crc kubenswrapper[4973]: I0320 14:42:30.919496 4973 patch_prober.go:28] interesting pod/openshift-kube-scheduler-crc container/kube-scheduler namespace/openshift-kube-scheduler: Readiness probe status=failure output="Get \"https://192.168.126.11:10259/healthz\": dial tcp 192.168.126.11:10259: connect: connection refused" start-of-body= Mar 20 14:42:30 crc kubenswrapper[4973]: I0320 14:42:30.919565 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podUID="3dcd261975c3d6b9a6ad6367fd4facd3" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.126.11:10259/healthz\": dial tcp 192.168.126.11:10259: connect: connection refused" Mar 20 14:42:30 crc kubenswrapper[4973]: I0320 14:42:30.921088 4973 status_manager.go:317] "Container readiness changed for unknown container" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" containerID="cri-o://2599b537b19ffc88d8144a6613b7812f3ab5e582d3ae0706db4919c06465c1e1" Mar 20 14:42:30 crc kubenswrapper[4973]: I0320 14:42:30.921412 4973 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 14:42:31 crc kubenswrapper[4973]: I0320 14:42:31.199530 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 14:42:31 crc kubenswrapper[4973]: I0320 14:42:31.517632 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/redhat-operators-h5hz4" podUID="b450ebc6-3181-4e0d-b546-b10ac89e0481" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:31 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:42:31 crc kubenswrapper[4973]: > Mar 20 14:42:31 crc kubenswrapper[4973]: I0320 14:42:31.518086 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/redhat-operators-h5hz4" Mar 20 14:42:31 crc kubenswrapper[4973]: I0320 14:42:31.520762 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" containerStatusID={"Type":"cri-o","ID":"f57e0f4515afd97e275b7d92199045d344693ab84816b0462c87b831b7e9bda1"} pod="openshift-marketplace/redhat-operators-h5hz4" containerMessage="Container registry-server failed liveness probe, will be restarted" Mar 20 14:42:31 crc kubenswrapper[4973]: I0320 14:42:31.520819 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-h5hz4" podUID="b450ebc6-3181-4e0d-b546-b10ac89e0481" containerName="registry-server" containerID="cri-o://f57e0f4515afd97e275b7d92199045d344693ab84816b0462c87b831b7e9bda1" gracePeriod=30 Mar 20 14:42:31 crc kubenswrapper[4973]: I0320 14:42:31.607151 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-fpwkr" podUID="57731a76-c496-43ff-afea-a5685864a2f3" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:31 crc 
kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:42:31 crc kubenswrapper[4973]: > Mar 20 14:42:31 crc kubenswrapper[4973]: I0320 14:42:31.607324 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fpwkr" Mar 20 14:42:31 crc kubenswrapper[4973]: I0320 14:42:31.612608 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-fpwkr" podUID="57731a76-c496-43ff-afea-a5685864a2f3" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:31 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:42:31 crc kubenswrapper[4973]: > Mar 20 14:42:31 crc kubenswrapper[4973]: I0320 14:42:31.612685 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/community-operators-fpwkr" Mar 20 14:42:31 crc kubenswrapper[4973]: I0320 14:42:31.618822 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/redhat-operators-h5hz4" podUID="b450ebc6-3181-4e0d-b546-b10ac89e0481" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:31 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:42:31 crc kubenswrapper[4973]: > Mar 20 14:42:31 crc kubenswrapper[4973]: I0320 14:42:31.618936 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h5hz4" Mar 20 14:42:31 crc kubenswrapper[4973]: I0320 14:42:31.785643 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="1cde80e1-f72d-4080-a86b-5968a8904333" containerName="ceilometer-notification-agent" probeResult="failure" output="command timed out" Mar 20 14:42:31 crc kubenswrapper[4973]: I0320 14:42:31.936428 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="registry-server" 
containerStatusID={"Type":"cri-o","ID":"07df23c3b96eff2dda412f9198cc400c4978dd51d11200b75b8c76345a68a2c0"} pod="openshift-marketplace/community-operators-fpwkr" containerMessage="Container registry-server failed liveness probe, will be restarted" Mar 20 14:42:31 crc kubenswrapper[4973]: I0320 14:42:31.936493 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fpwkr" podUID="57731a76-c496-43ff-afea-a5685864a2f3" containerName="registry-server" containerID="cri-o://07df23c3b96eff2dda412f9198cc400c4978dd51d11200b75b8c76345a68a2c0" gracePeriod=30 Mar 20 14:42:31 crc kubenswrapper[4973]: I0320 14:42:31.940627 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"27fed14c-9051-4d46-80d5-badf224805a9","Type":"ContainerStarted","Data":"255fad69a896fa491dda354b57284344f3953f11a263ce7a32ce85c5c12b5cd7"} Mar 20 14:42:32 crc kubenswrapper[4973]: I0320 14:42:32.205061 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fpwkr" Mar 20 14:42:32 crc kubenswrapper[4973]: E0320 14:42:32.214570 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07df23c3b96eff2dda412f9198cc400c4978dd51d11200b75b8c76345a68a2c0" cmd=["grpc_health_probe","-addr=:50051"] Mar 20 14:42:32 crc kubenswrapper[4973]: E0320 14:42:32.216064 4973 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="07df23c3b96eff2dda412f9198cc400c4978dd51d11200b75b8c76345a68a2c0" cmd=["grpc_health_probe","-addr=:50051"] Mar 20 14:42:32 crc kubenswrapper[4973]: E0320 14:42:32.219631 4973 log.go:32] "ExecSync cmd from runtime service failed" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 07df23c3b96eff2dda412f9198cc400c4978dd51d11200b75b8c76345a68a2c0 is running failed: container process not found" containerID="07df23c3b96eff2dda412f9198cc400c4978dd51d11200b75b8c76345a68a2c0" cmd=["grpc_health_probe","-addr=:50051"] Mar 20 14:42:32 crc kubenswrapper[4973]: E0320 14:42:32.219693 4973 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 07df23c3b96eff2dda412f9198cc400c4978dd51d11200b75b8c76345a68a2c0 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/community-operators-fpwkr" podUID="57731a76-c496-43ff-afea-a5685864a2f3" containerName="registry-server" Mar 20 14:42:32 crc kubenswrapper[4973]: I0320 14:42:32.418450 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-cgmjd" Mar 20 14:42:32 crc kubenswrapper[4973]: I0320 14:42:32.785192 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="1cde80e1-f72d-4080-a86b-5968a8904333" containerName="ceilometer-central-agent" probeResult="failure" output="command timed out" Mar 20 14:42:32 crc kubenswrapper[4973]: I0320 14:42:32.956842 4973 generic.go:334] "Generic (PLEG): container finished" podID="b450ebc6-3181-4e0d-b546-b10ac89e0481" containerID="f57e0f4515afd97e275b7d92199045d344693ab84816b0462c87b831b7e9bda1" exitCode=0 Mar 20 14:42:32 crc kubenswrapper[4973]: I0320 14:42:32.956946 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5hz4" event={"ID":"b450ebc6-3181-4e0d-b546-b10ac89e0481","Type":"ContainerDied","Data":"f57e0f4515afd97e275b7d92199045d344693ab84816b0462c87b831b7e9bda1"} Mar 20 14:42:32 crc kubenswrapper[4973]: I0320 14:42:32.960420 4973 generic.go:334] "Generic (PLEG): container finished" podID="57731a76-c496-43ff-afea-a5685864a2f3" 
containerID="07df23c3b96eff2dda412f9198cc400c4978dd51d11200b75b8c76345a68a2c0" exitCode=0 Mar 20 14:42:32 crc kubenswrapper[4973]: I0320 14:42:32.960505 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fpwkr" event={"ID":"57731a76-c496-43ff-afea-a5685864a2f3","Type":"ContainerDied","Data":"07df23c3b96eff2dda412f9198cc400c4978dd51d11200b75b8c76345a68a2c0"} Mar 20 14:42:32 crc kubenswrapper[4973]: I0320 14:42:32.962124 4973 generic.go:334] "Generic (PLEG): container finished" podID="95ca3c33-8a98-4ce8-8cb7-06c855d090ac" containerID="c4112903279386936de252a9123dcc64166a267f9121a2ed5268d620fab9ca7d" exitCode=0 Mar 20 14:42:32 crc kubenswrapper[4973]: I0320 14:42:32.963384 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"95ca3c33-8a98-4ce8-8cb7-06c855d090ac","Type":"ContainerDied","Data":"c4112903279386936de252a9123dcc64166a267f9121a2ed5268d620fab9ca7d"} Mar 20 14:42:33 crc kubenswrapper[4973]: I0320 14:42:33.982593 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fpwkr" event={"ID":"57731a76-c496-43ff-afea-a5685864a2f3","Type":"ContainerStarted","Data":"d09dd5833f5b06e76c16fdf3d18908838721e424293043bcef8a78897bfde4f8"} Mar 20 14:42:33 crc kubenswrapper[4973]: I0320 14:42:33.986585 4973 generic.go:334] "Generic (PLEG): container finished" podID="46a36df3-9f0c-4e6a-b959-5542da9c7534" containerID="cd991d7527d172dd7c86e149b200189738b1b0a10c0503f34f08c3e440cfb9ee" exitCode=0 Mar 20 14:42:33 crc kubenswrapper[4973]: I0320 14:42:33.986631 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566962-2sqjf" event={"ID":"46a36df3-9f0c-4e6a-b959-5542da9c7534","Type":"ContainerDied","Data":"cd991d7527d172dd7c86e149b200189738b1b0a10c0503f34f08c3e440cfb9ee"} Mar 20 14:42:34 crc kubenswrapper[4973]: I0320 14:42:34.525589 4973 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-z4dzx" podUID="3233d229-1d2f-4c90-b76a-f27ca914f0ad" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.108:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 14:42:35 crc kubenswrapper[4973]: I0320 14:42:35.995671 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-4bk2w" Mar 20 14:42:36 crc kubenswrapper[4973]: I0320 14:42:36.022813 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-h5hz4" event={"ID":"b450ebc6-3181-4e0d-b546-b10ac89e0481","Type":"ContainerStarted","Data":"a4e8f9de74d930f9ce58943aca8dd7aba3a37ab137055a00e5f4d883938cb010"} Mar 20 14:42:36 crc kubenswrapper[4973]: I0320 14:42:36.106085 4973 patch_prober.go:28] interesting pod/router-default-5444994796-wgtgl container/router namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]backend-http ok Mar 20 14:42:36 crc kubenswrapper[4973]: [+]has-synced ok Mar 20 14:42:36 crc kubenswrapper[4973]: [-]process-running failed: reason withheld Mar 20 14:42:36 crc kubenswrapper[4973]: healthz check failed Mar 20 14:42:36 crc kubenswrapper[4973]: I0320 14:42:36.106146 4973 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-ingress/router-default-5444994796-wgtgl" podUID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 14:42:36 crc kubenswrapper[4973]: I0320 14:42:36.106223 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 14:42:36 crc kubenswrapper[4973]: I0320 14:42:36.402261 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-z2rch" Mar 20 14:42:36 crc kubenswrapper[4973]: I0320 14:42:36.441718 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 20 14:42:36 crc kubenswrapper[4973]: I0320 14:42:36.443182 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 20 14:42:37 crc kubenswrapper[4973]: I0320 14:42:37.458500 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566962-2sqjf" Mar 20 14:42:37 crc kubenswrapper[4973]: I0320 14:42:37.601426 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpgzk\" (UniqueName: \"kubernetes.io/projected/46a36df3-9f0c-4e6a-b959-5542da9c7534-kube-api-access-gpgzk\") pod \"46a36df3-9f0c-4e6a-b959-5542da9c7534\" (UID: \"46a36df3-9f0c-4e6a-b959-5542da9c7534\") " Mar 20 14:42:37 crc kubenswrapper[4973]: I0320 14:42:37.634089 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46a36df3-9f0c-4e6a-b959-5542da9c7534-kube-api-access-gpgzk" (OuterVolumeSpecName: "kube-api-access-gpgzk") pod "46a36df3-9f0c-4e6a-b959-5542da9c7534" (UID: "46a36df3-9f0c-4e6a-b959-5542da9c7534"). InnerVolumeSpecName "kube-api-access-gpgzk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:42:37 crc kubenswrapper[4973]: I0320 14:42:37.710307 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpgzk\" (UniqueName: \"kubernetes.io/projected/46a36df3-9f0c-4e6a-b959-5542da9c7534-kube-api-access-gpgzk\") on node \"crc\" DevicePath \"\"" Mar 20 14:42:38 crc kubenswrapper[4973]: I0320 14:42:38.089998 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"95ca3c33-8a98-4ce8-8cb7-06c855d090ac","Type":"ContainerStarted","Data":"7808b427b9ec061f39859ce39a70ddd973d9358f2cf38af3ea918fbee8b4656a"} Mar 20 14:42:38 crc kubenswrapper[4973]: I0320 14:42:38.098920 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566962-2sqjf" event={"ID":"46a36df3-9f0c-4e6a-b959-5542da9c7534","Type":"ContainerDied","Data":"b670a7367358d5bcfe0c87ee2f7299bd07ba0ced252654812117a2d8b3d687e3"} Mar 20 14:42:38 crc kubenswrapper[4973]: I0320 14:42:38.099073 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566962-2sqjf" Mar 20 14:42:38 crc kubenswrapper[4973]: I0320 14:42:38.100254 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b670a7367358d5bcfe0c87ee2f7299bd07ba0ced252654812117a2d8b3d687e3" Mar 20 14:42:38 crc kubenswrapper[4973]: I0320 14:42:38.112225 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-5444994796-wgtgl_0e880f23-4bef-4e96-bf00-c94dc4551c5a/router/0.log" Mar 20 14:42:38 crc kubenswrapper[4973]: I0320 14:42:38.112296 4973 generic.go:334] "Generic (PLEG): container finished" podID="0e880f23-4bef-4e96-bf00-c94dc4551c5a" containerID="b824383e01c8df94aaa5809814495fba866ff51ce71903d2604814a774acd055" exitCode=137 Mar 20 14:42:38 crc kubenswrapper[4973]: I0320 14:42:38.112364 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wgtgl" event={"ID":"0e880f23-4bef-4e96-bf00-c94dc4551c5a","Type":"ContainerDied","Data":"b824383e01c8df94aaa5809814495fba866ff51ce71903d2604814a774acd055"} Mar 20 14:42:38 crc kubenswrapper[4973]: I0320 14:42:38.112399 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-wgtgl" event={"ID":"0e880f23-4bef-4e96-bf00-c94dc4551c5a","Type":"ContainerStarted","Data":"a826cfa43f70449e9afd0b1ceb4c5852a6ff23db4ad516bf4c289b7859bb55f6"} Mar 20 14:42:38 crc kubenswrapper[4973]: I0320 14:42:38.638001 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566956-mcmxc"] Mar 20 14:42:38 crc kubenswrapper[4973]: I0320 14:42:38.653263 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566956-mcmxc"] Mar 20 14:42:38 crc kubenswrapper[4973]: I0320 14:42:38.760410 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-h5hz4" Mar 20 14:42:38 crc 
kubenswrapper[4973]: I0320 14:42:38.761686 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-h5hz4" Mar 20 14:42:38 crc kubenswrapper[4973]: I0320 14:42:38.896616 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 14:42:38 crc kubenswrapper[4973]: I0320 14:42:38.902470 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 14:42:39 crc kubenswrapper[4973]: I0320 14:42:39.097261 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 20 14:42:39 crc kubenswrapper[4973]: I0320 14:42:39.121964 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 14:42:39 crc kubenswrapper[4973]: I0320 14:42:39.131114 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-wgtgl" Mar 20 14:42:39 crc kubenswrapper[4973]: I0320 14:42:39.934387 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h5hz4" podUID="b450ebc6-3181-4e0d-b546-b10ac89e0481" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:39 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:42:39 crc kubenswrapper[4973]: > Mar 20 14:42:39 crc kubenswrapper[4973]: I0320 14:42:39.967109 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ecae9f9-4539-4009-93e2-af52b0210fa6" path="/var/lib/kubelet/pods/6ecae9f9-4539-4009-93e2-af52b0210fa6/volumes" Mar 20 14:42:40 crc kubenswrapper[4973]: I0320 14:42:40.135018 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fpwkr" Mar 20 14:42:40 crc 
kubenswrapper[4973]: I0320 14:42:40.135200 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fpwkr" Mar 20 14:42:40 crc kubenswrapper[4973]: I0320 14:42:40.986979 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 20 14:42:41 crc kubenswrapper[4973]: I0320 14:42:41.081760 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 20 14:42:41 crc kubenswrapper[4973]: I0320 14:42:41.206127 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-fpwkr" podUID="57731a76-c496-43ff-afea-a5685864a2f3" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:41 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:42:41 crc kubenswrapper[4973]: > Mar 20 14:42:43 crc kubenswrapper[4973]: I0320 14:42:43.320913 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:42:43 crc kubenswrapper[4973]: I0320 14:42:43.322683 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:42:44 crc kubenswrapper[4973]: I0320 14:42:44.128175 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 20 14:42:46 crc kubenswrapper[4973]: I0320 14:42:46.158928 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="metallb-system/metallb-operator-webhook-server-68c6dd9858-4mw5r" Mar 20 14:42:49 crc kubenswrapper[4973]: I0320 14:42:49.837284 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h5hz4" podUID="b450ebc6-3181-4e0d-b546-b10ac89e0481" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:49 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:42:49 crc kubenswrapper[4973]: > Mar 20 14:42:51 crc kubenswrapper[4973]: I0320 14:42:51.207721 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-fpwkr" podUID="57731a76-c496-43ff-afea-a5685864a2f3" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:51 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:42:51 crc kubenswrapper[4973]: > Mar 20 14:42:56 crc kubenswrapper[4973]: I0320 14:42:56.465044 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="1cde80e1-f72d-4080-a86b-5968a8904333" containerName="ceilometer-notification-agent" probeResult="failure" output=< Mar 20 14:42:56 crc kubenswrapper[4973]: Unkown error: Expecting value: line 1 column 1 (char 0) Mar 20 14:42:56 crc kubenswrapper[4973]: > Mar 20 14:42:59 crc kubenswrapper[4973]: I0320 14:42:59.808011 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h5hz4" podUID="b450ebc6-3181-4e0d-b546-b10ac89e0481" containerName="registry-server" probeResult="failure" output=< Mar 20 14:42:59 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:42:59 crc kubenswrapper[4973]: > Mar 20 14:43:00 crc kubenswrapper[4973]: I0320 14:43:00.214427 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fpwkr" Mar 20 14:43:00 crc kubenswrapper[4973]: I0320 14:43:00.268769 4973 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fpwkr" Mar 20 14:43:05 crc kubenswrapper[4973]: I0320 14:43:05.219086 4973 scope.go:117] "RemoveContainer" containerID="21c35e7d0f2129039671361e2346d42976ea15f246bc448db4c46517d48b040f" Mar 20 14:43:09 crc kubenswrapper[4973]: I0320 14:43:09.810966 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h5hz4" podUID="b450ebc6-3181-4e0d-b546-b10ac89e0481" containerName="registry-server" probeResult="failure" output=< Mar 20 14:43:09 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:43:09 crc kubenswrapper[4973]: > Mar 20 14:43:13 crc kubenswrapper[4973]: I0320 14:43:13.320846 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:43:13 crc kubenswrapper[4973]: I0320 14:43:13.321441 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:43:18 crc kubenswrapper[4973]: I0320 14:43:18.170441 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-px2c4"] Mar 20 14:43:18 crc kubenswrapper[4973]: E0320 14:43:18.174880 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46a36df3-9f0c-4e6a-b959-5542da9c7534" containerName="oc" Mar 20 14:43:18 crc kubenswrapper[4973]: I0320 14:43:18.174912 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="46a36df3-9f0c-4e6a-b959-5542da9c7534" 
containerName="oc"
Mar 20 14:43:18 crc kubenswrapper[4973]: I0320 14:43:18.175220 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="46a36df3-9f0c-4e6a-b959-5542da9c7534" containerName="oc"
Mar 20 14:43:18 crc kubenswrapper[4973]: I0320 14:43:18.180382 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-px2c4"
Mar 20 14:43:18 crc kubenswrapper[4973]: I0320 14:43:18.245194 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-px2c4"]
Mar 20 14:43:18 crc kubenswrapper[4973]: I0320 14:43:18.353211 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36698437-ce7d-4a1b-98bc-8ca8e743ace5-utilities\") pod \"community-operators-px2c4\" (UID: \"36698437-ce7d-4a1b-98bc-8ca8e743ace5\") " pod="openshift-marketplace/community-operators-px2c4"
Mar 20 14:43:18 crc kubenswrapper[4973]: I0320 14:43:18.353303 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srhqh\" (UniqueName: \"kubernetes.io/projected/36698437-ce7d-4a1b-98bc-8ca8e743ace5-kube-api-access-srhqh\") pod \"community-operators-px2c4\" (UID: \"36698437-ce7d-4a1b-98bc-8ca8e743ace5\") " pod="openshift-marketplace/community-operators-px2c4"
Mar 20 14:43:18 crc kubenswrapper[4973]: I0320 14:43:18.353544 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36698437-ce7d-4a1b-98bc-8ca8e743ace5-catalog-content\") pod \"community-operators-px2c4\" (UID: \"36698437-ce7d-4a1b-98bc-8ca8e743ace5\") " pod="openshift-marketplace/community-operators-px2c4"
Mar 20 14:43:18 crc kubenswrapper[4973]: I0320 14:43:18.456655 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36698437-ce7d-4a1b-98bc-8ca8e743ace5-utilities\") pod \"community-operators-px2c4\" (UID: \"36698437-ce7d-4a1b-98bc-8ca8e743ace5\") " pod="openshift-marketplace/community-operators-px2c4"
Mar 20 14:43:18 crc kubenswrapper[4973]: I0320 14:43:18.457037 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-srhqh\" (UniqueName: \"kubernetes.io/projected/36698437-ce7d-4a1b-98bc-8ca8e743ace5-kube-api-access-srhqh\") pod \"community-operators-px2c4\" (UID: \"36698437-ce7d-4a1b-98bc-8ca8e743ace5\") " pod="openshift-marketplace/community-operators-px2c4"
Mar 20 14:43:18 crc kubenswrapper[4973]: I0320 14:43:18.457169 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36698437-ce7d-4a1b-98bc-8ca8e743ace5-catalog-content\") pod \"community-operators-px2c4\" (UID: \"36698437-ce7d-4a1b-98bc-8ca8e743ace5\") " pod="openshift-marketplace/community-operators-px2c4"
Mar 20 14:43:18 crc kubenswrapper[4973]: I0320 14:43:18.458744 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36698437-ce7d-4a1b-98bc-8ca8e743ace5-utilities\") pod \"community-operators-px2c4\" (UID: \"36698437-ce7d-4a1b-98bc-8ca8e743ace5\") " pod="openshift-marketplace/community-operators-px2c4"
Mar 20 14:43:18 crc kubenswrapper[4973]: I0320 14:43:18.459198 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36698437-ce7d-4a1b-98bc-8ca8e743ace5-catalog-content\") pod \"community-operators-px2c4\" (UID: \"36698437-ce7d-4a1b-98bc-8ca8e743ace5\") " pod="openshift-marketplace/community-operators-px2c4"
Mar 20 14:43:18 crc kubenswrapper[4973]: I0320 14:43:18.484465 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-srhqh\" (UniqueName: \"kubernetes.io/projected/36698437-ce7d-4a1b-98bc-8ca8e743ace5-kube-api-access-srhqh\") pod \"community-operators-px2c4\" (UID: \"36698437-ce7d-4a1b-98bc-8ca8e743ace5\") " pod="openshift-marketplace/community-operators-px2c4"
Mar 20 14:43:18 crc kubenswrapper[4973]: I0320 14:43:18.505050 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-px2c4"
Mar 20 14:43:19 crc kubenswrapper[4973]: I0320 14:43:19.474680 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-px2c4"]
Mar 20 14:43:19 crc kubenswrapper[4973]: I0320 14:43:19.627160 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-px2c4" event={"ID":"36698437-ce7d-4a1b-98bc-8ca8e743ace5","Type":"ContainerStarted","Data":"db9deee09f8f329d1d98df94699324284329baa3a3683f6901d7358cd611fd5a"}
Mar 20 14:43:19 crc kubenswrapper[4973]: I0320 14:43:19.823617 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-h5hz4" podUID="b450ebc6-3181-4e0d-b546-b10ac89e0481" containerName="registry-server" probeResult="failure" output=<
Mar 20 14:43:19 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s
Mar 20 14:43:19 crc kubenswrapper[4973]: >
Mar 20 14:43:20 crc kubenswrapper[4973]: I0320 14:43:20.643314 4973 generic.go:334] "Generic (PLEG): container finished" podID="36698437-ce7d-4a1b-98bc-8ca8e743ace5" containerID="8f0a04e34e2a7462bdfbd68736880d98bdc15f84f16840fc5d5b8062ff32b6b6" exitCode=0
Mar 20 14:43:20 crc kubenswrapper[4973]: I0320 14:43:20.643398 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-px2c4" event={"ID":"36698437-ce7d-4a1b-98bc-8ca8e743ace5","Type":"ContainerDied","Data":"8f0a04e34e2a7462bdfbd68736880d98bdc15f84f16840fc5d5b8062ff32b6b6"}
Mar 20 14:43:21 crc kubenswrapper[4973]: I0320 14:43:21.202799 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 14:43:21 crc kubenswrapper[4973]: I0320 14:43:21.661003 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-px2c4" event={"ID":"36698437-ce7d-4a1b-98bc-8ca8e743ace5","Type":"ContainerStarted","Data":"2388795ff38e1351088bfb0310af1e52dbf4a082efcc39ca6ca1eed400d949e4"}
Mar 20 14:43:23 crc kubenswrapper[4973]: I0320 14:43:23.701233 4973 generic.go:334] "Generic (PLEG): container finished" podID="36698437-ce7d-4a1b-98bc-8ca8e743ace5" containerID="2388795ff38e1351088bfb0310af1e52dbf4a082efcc39ca6ca1eed400d949e4" exitCode=0
Mar 20 14:43:23 crc kubenswrapper[4973]: I0320 14:43:23.701286 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-px2c4" event={"ID":"36698437-ce7d-4a1b-98bc-8ca8e743ace5","Type":"ContainerDied","Data":"2388795ff38e1351088bfb0310af1e52dbf4a082efcc39ca6ca1eed400d949e4"}
Mar 20 14:43:24 crc kubenswrapper[4973]: I0320 14:43:24.713558 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-px2c4" event={"ID":"36698437-ce7d-4a1b-98bc-8ca8e743ace5","Type":"ContainerStarted","Data":"a60209a53bb76b1a892cb87b9c8660d7d91f588c03b5f0f822bea9e575dd97a4"}
Mar 20 14:43:24 crc kubenswrapper[4973]: I0320 14:43:24.743710 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-px2c4" podStartSLOduration=3.211193175 podStartE2EDuration="6.742159288s" podCreationTimestamp="2026-03-20 14:43:18 +0000 UTC" firstStartedPulling="2026-03-20 14:43:20.647618733 +0000 UTC m=+4921.391288477" lastFinishedPulling="2026-03-20 14:43:24.178584846 +0000 UTC m=+4924.922254590" observedRunningTime="2026-03-20 14:43:24.729693438 +0000 UTC m=+4925.473363182" watchObservedRunningTime="2026-03-20 14:43:24.742159288 +0000 UTC m=+4925.485829032"
Mar 20 14:43:26 crc kubenswrapper[4973]: I0320 14:43:26.473752 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/ceilometer-0" podUID="1cde80e1-f72d-4080-a86b-5968a8904333" containerName="ceilometer-notification-agent" probeResult="failure" output=<
Mar 20 14:43:26 crc kubenswrapper[4973]: Unkown error: Expecting value: line 1 column 1 (char 0)
Mar 20 14:43:26 crc kubenswrapper[4973]: >
Mar 20 14:43:26 crc kubenswrapper[4973]: I0320 14:43:26.475322 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ceilometer-0"
Mar 20 14:43:26 crc kubenswrapper[4973]: I0320 14:43:26.477844 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="ceilometer-notification-agent" containerStatusID={"Type":"cri-o","ID":"800654125f78fdf5a9132ee5dda7e8cd40f7a094f028f4f95050718d0a1f9a46"} pod="openstack/ceilometer-0" containerMessage="Container ceilometer-notification-agent failed liveness probe, will be restarted"
Mar 20 14:43:26 crc kubenswrapper[4973]: I0320 14:43:26.478322 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1cde80e1-f72d-4080-a86b-5968a8904333" containerName="ceilometer-notification-agent" containerID="cri-o://800654125f78fdf5a9132ee5dda7e8cd40f7a094f028f4f95050718d0a1f9a46" gracePeriod=30
Mar 20 14:43:28 crc kubenswrapper[4973]: I0320 14:43:28.506096 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-px2c4"
Mar 20 14:43:28 crc kubenswrapper[4973]: I0320 14:43:28.506454 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-px2c4"
Mar 20 14:43:28 crc kubenswrapper[4973]: I0320 14:43:28.812212 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-h5hz4"
Mar 20 14:43:28 crc kubenswrapper[4973]: I0320 14:43:28.871366 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-h5hz4"
Mar 20 14:43:29 crc kubenswrapper[4973]: I0320 14:43:29.565732 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-px2c4" podUID="36698437-ce7d-4a1b-98bc-8ca8e743ace5" containerName="registry-server" probeResult="failure" output=<
Mar 20 14:43:29 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s
Mar 20 14:43:29 crc kubenswrapper[4973]: >
Mar 20 14:43:30 crc kubenswrapper[4973]: I0320 14:43:30.782090 4973 generic.go:334] "Generic (PLEG): container finished" podID="1cde80e1-f72d-4080-a86b-5968a8904333" containerID="800654125f78fdf5a9132ee5dda7e8cd40f7a094f028f4f95050718d0a1f9a46" exitCode=0
Mar 20 14:43:30 crc kubenswrapper[4973]: I0320 14:43:30.782196 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cde80e1-f72d-4080-a86b-5968a8904333","Type":"ContainerDied","Data":"800654125f78fdf5a9132ee5dda7e8cd40f7a094f028f4f95050718d0a1f9a46"}
Mar 20 14:43:32 crc kubenswrapper[4973]: I0320 14:43:32.811976 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1cde80e1-f72d-4080-a86b-5968a8904333","Type":"ContainerStarted","Data":"adae0be4b4facb967fd46919898f3b5dc903053dbf89ddc46bcabf2b27ad8d38"}
Mar 20 14:43:38 crc kubenswrapper[4973]: I0320 14:43:38.560700 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-px2c4"
Mar 20 14:43:38 crc kubenswrapper[4973]: I0320 14:43:38.613899 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-px2c4"
Mar 20 14:43:38 crc kubenswrapper[4973]: I0320 14:43:38.816049 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-px2c4"]
Mar 20 14:43:39 crc kubenswrapper[4973]: I0320 14:43:39.909033 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-px2c4" podUID="36698437-ce7d-4a1b-98bc-8ca8e743ace5" containerName="registry-server" containerID="cri-o://a60209a53bb76b1a892cb87b9c8660d7d91f588c03b5f0f822bea9e575dd97a4" gracePeriod=2
Mar 20 14:43:40 crc kubenswrapper[4973]: I0320 14:43:40.812052 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-px2c4"
Mar 20 14:43:40 crc kubenswrapper[4973]: I0320 14:43:40.917561 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-srhqh\" (UniqueName: \"kubernetes.io/projected/36698437-ce7d-4a1b-98bc-8ca8e743ace5-kube-api-access-srhqh\") pod \"36698437-ce7d-4a1b-98bc-8ca8e743ace5\" (UID: \"36698437-ce7d-4a1b-98bc-8ca8e743ace5\") "
Mar 20 14:43:40 crc kubenswrapper[4973]: I0320 14:43:40.917876 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36698437-ce7d-4a1b-98bc-8ca8e743ace5-catalog-content\") pod \"36698437-ce7d-4a1b-98bc-8ca8e743ace5\" (UID: \"36698437-ce7d-4a1b-98bc-8ca8e743ace5\") "
Mar 20 14:43:40 crc kubenswrapper[4973]: I0320 14:43:40.918034 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36698437-ce7d-4a1b-98bc-8ca8e743ace5-utilities\") pod \"36698437-ce7d-4a1b-98bc-8ca8e743ace5\" (UID: \"36698437-ce7d-4a1b-98bc-8ca8e743ace5\") "
Mar 20 14:43:40 crc kubenswrapper[4973]: I0320 14:43:40.918726 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36698437-ce7d-4a1b-98bc-8ca8e743ace5-utilities" (OuterVolumeSpecName: "utilities") pod "36698437-ce7d-4a1b-98bc-8ca8e743ace5" (UID: "36698437-ce7d-4a1b-98bc-8ca8e743ace5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:43:40 crc kubenswrapper[4973]: I0320 14:43:40.924349 4973 generic.go:334] "Generic (PLEG): container finished" podID="36698437-ce7d-4a1b-98bc-8ca8e743ace5" containerID="a60209a53bb76b1a892cb87b9c8660d7d91f588c03b5f0f822bea9e575dd97a4" exitCode=0
Mar 20 14:43:40 crc kubenswrapper[4973]: I0320 14:43:40.924463 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-px2c4"
Mar 20 14:43:40 crc kubenswrapper[4973]: I0320 14:43:40.924486 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-px2c4" event={"ID":"36698437-ce7d-4a1b-98bc-8ca8e743ace5","Type":"ContainerDied","Data":"a60209a53bb76b1a892cb87b9c8660d7d91f588c03b5f0f822bea9e575dd97a4"}
Mar 20 14:43:40 crc kubenswrapper[4973]: I0320 14:43:40.924775 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-px2c4" event={"ID":"36698437-ce7d-4a1b-98bc-8ca8e743ace5","Type":"ContainerDied","Data":"db9deee09f8f329d1d98df94699324284329baa3a3683f6901d7358cd611fd5a"}
Mar 20 14:43:40 crc kubenswrapper[4973]: I0320 14:43:40.924802 4973 scope.go:117] "RemoveContainer" containerID="a60209a53bb76b1a892cb87b9c8660d7d91f588c03b5f0f822bea9e575dd97a4"
Mar 20 14:43:40 crc kubenswrapper[4973]: I0320 14:43:40.932301 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36698437-ce7d-4a1b-98bc-8ca8e743ace5-kube-api-access-srhqh" (OuterVolumeSpecName: "kube-api-access-srhqh") pod "36698437-ce7d-4a1b-98bc-8ca8e743ace5" (UID: "36698437-ce7d-4a1b-98bc-8ca8e743ace5"). InnerVolumeSpecName "kube-api-access-srhqh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:43:40 crc kubenswrapper[4973]: I0320 14:43:40.976751 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36698437-ce7d-4a1b-98bc-8ca8e743ace5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "36698437-ce7d-4a1b-98bc-8ca8e743ace5" (UID: "36698437-ce7d-4a1b-98bc-8ca8e743ace5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:43:41 crc kubenswrapper[4973]: I0320 14:43:41.023175 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/36698437-ce7d-4a1b-98bc-8ca8e743ace5-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 14:43:41 crc kubenswrapper[4973]: I0320 14:43:41.023207 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/36698437-ce7d-4a1b-98bc-8ca8e743ace5-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 14:43:41 crc kubenswrapper[4973]: I0320 14:43:41.023217 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-srhqh\" (UniqueName: \"kubernetes.io/projected/36698437-ce7d-4a1b-98bc-8ca8e743ace5-kube-api-access-srhqh\") on node \"crc\" DevicePath \"\""
Mar 20 14:43:41 crc kubenswrapper[4973]: I0320 14:43:41.030594 4973 scope.go:117] "RemoveContainer" containerID="2388795ff38e1351088bfb0310af1e52dbf4a082efcc39ca6ca1eed400d949e4"
Mar 20 14:43:41 crc kubenswrapper[4973]: I0320 14:43:41.054361 4973 scope.go:117] "RemoveContainer" containerID="8f0a04e34e2a7462bdfbd68736880d98bdc15f84f16840fc5d5b8062ff32b6b6"
Mar 20 14:43:41 crc kubenswrapper[4973]: I0320 14:43:41.115483 4973 scope.go:117] "RemoveContainer" containerID="a60209a53bb76b1a892cb87b9c8660d7d91f588c03b5f0f822bea9e575dd97a4"
Mar 20 14:43:41 crc kubenswrapper[4973]: E0320 14:43:41.116655 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a60209a53bb76b1a892cb87b9c8660d7d91f588c03b5f0f822bea9e575dd97a4\": container with ID starting with a60209a53bb76b1a892cb87b9c8660d7d91f588c03b5f0f822bea9e575dd97a4 not found: ID does not exist" containerID="a60209a53bb76b1a892cb87b9c8660d7d91f588c03b5f0f822bea9e575dd97a4"
Mar 20 14:43:41 crc kubenswrapper[4973]: I0320 14:43:41.116692 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a60209a53bb76b1a892cb87b9c8660d7d91f588c03b5f0f822bea9e575dd97a4"} err="failed to get container status \"a60209a53bb76b1a892cb87b9c8660d7d91f588c03b5f0f822bea9e575dd97a4\": rpc error: code = NotFound desc = could not find container \"a60209a53bb76b1a892cb87b9c8660d7d91f588c03b5f0f822bea9e575dd97a4\": container with ID starting with a60209a53bb76b1a892cb87b9c8660d7d91f588c03b5f0f822bea9e575dd97a4 not found: ID does not exist"
Mar 20 14:43:41 crc kubenswrapper[4973]: I0320 14:43:41.116721 4973 scope.go:117] "RemoveContainer" containerID="2388795ff38e1351088bfb0310af1e52dbf4a082efcc39ca6ca1eed400d949e4"
Mar 20 14:43:41 crc kubenswrapper[4973]: E0320 14:43:41.117706 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2388795ff38e1351088bfb0310af1e52dbf4a082efcc39ca6ca1eed400d949e4\": container with ID starting with 2388795ff38e1351088bfb0310af1e52dbf4a082efcc39ca6ca1eed400d949e4 not found: ID does not exist" containerID="2388795ff38e1351088bfb0310af1e52dbf4a082efcc39ca6ca1eed400d949e4"
Mar 20 14:43:41 crc kubenswrapper[4973]: I0320 14:43:41.117734 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2388795ff38e1351088bfb0310af1e52dbf4a082efcc39ca6ca1eed400d949e4"} err="failed to get container status \"2388795ff38e1351088bfb0310af1e52dbf4a082efcc39ca6ca1eed400d949e4\": rpc error: code = NotFound desc = could not find container \"2388795ff38e1351088bfb0310af1e52dbf4a082efcc39ca6ca1eed400d949e4\": container with ID starting with 2388795ff38e1351088bfb0310af1e52dbf4a082efcc39ca6ca1eed400d949e4 not found: ID does not exist"
Mar 20 14:43:41 crc kubenswrapper[4973]: I0320 14:43:41.117754 4973 scope.go:117] "RemoveContainer" containerID="8f0a04e34e2a7462bdfbd68736880d98bdc15f84f16840fc5d5b8062ff32b6b6"
Mar 20 14:43:41 crc kubenswrapper[4973]: E0320 14:43:41.118075 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f0a04e34e2a7462bdfbd68736880d98bdc15f84f16840fc5d5b8062ff32b6b6\": container with ID starting with 8f0a04e34e2a7462bdfbd68736880d98bdc15f84f16840fc5d5b8062ff32b6b6 not found: ID does not exist" containerID="8f0a04e34e2a7462bdfbd68736880d98bdc15f84f16840fc5d5b8062ff32b6b6"
Mar 20 14:43:41 crc kubenswrapper[4973]: I0320 14:43:41.118101 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f0a04e34e2a7462bdfbd68736880d98bdc15f84f16840fc5d5b8062ff32b6b6"} err="failed to get container status \"8f0a04e34e2a7462bdfbd68736880d98bdc15f84f16840fc5d5b8062ff32b6b6\": rpc error: code = NotFound desc = could not find container \"8f0a04e34e2a7462bdfbd68736880d98bdc15f84f16840fc5d5b8062ff32b6b6\": container with ID starting with 8f0a04e34e2a7462bdfbd68736880d98bdc15f84f16840fc5d5b8062ff32b6b6 not found: ID does not exist"
Mar 20 14:43:41 crc kubenswrapper[4973]: I0320 14:43:41.268390 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-px2c4"]
Mar 20 14:43:41 crc kubenswrapper[4973]: I0320 14:43:41.281675 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-px2c4"]
Mar 20 14:43:41 crc kubenswrapper[4973]: I0320 14:43:41.962971 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36698437-ce7d-4a1b-98bc-8ca8e743ace5" path="/var/lib/kubelet/pods/36698437-ce7d-4a1b-98bc-8ca8e743ace5/volumes"
Mar 20 14:43:43 crc kubenswrapper[4973]: I0320 14:43:43.320514 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 14:43:43 crc kubenswrapper[4973]: I0320 14:43:43.320572 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 14:43:43 crc kubenswrapper[4973]: I0320 14:43:43.320616 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx"
Mar 20 14:43:43 crc kubenswrapper[4973]: I0320 14:43:43.321646 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"59cb6d1eb0dc10deddad9ed131c9312329e2744cabbd12799cb25f9f0fdea90e"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 14:43:43 crc kubenswrapper[4973]: I0320 14:43:43.321707 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" containerID="cri-o://59cb6d1eb0dc10deddad9ed131c9312329e2744cabbd12799cb25f9f0fdea90e" gracePeriod=600
Mar 20 14:43:43 crc kubenswrapper[4973]: I0320 14:43:43.966122 4973 generic.go:334] "Generic (PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="59cb6d1eb0dc10deddad9ed131c9312329e2744cabbd12799cb25f9f0fdea90e" exitCode=0
Mar 20 14:43:43 crc kubenswrapper[4973]: I0320 14:43:43.966488 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"59cb6d1eb0dc10deddad9ed131c9312329e2744cabbd12799cb25f9f0fdea90e"}
Mar 20 14:43:43 crc kubenswrapper[4973]: I0320 14:43:43.966516 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790"}
Mar 20 14:43:43 crc kubenswrapper[4973]: I0320 14:43:43.966534 4973 scope.go:117] "RemoveContainer" containerID="dff428940e820f52f1312ddaa416c41891756bd3b999ccc7515237f831aba749"
Mar 20 14:44:00 crc kubenswrapper[4973]: I0320 14:44:00.171608 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566964-hpzvv"]
Mar 20 14:44:00 crc kubenswrapper[4973]: E0320 14:44:00.173012 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36698437-ce7d-4a1b-98bc-8ca8e743ace5" containerName="registry-server"
Mar 20 14:44:00 crc kubenswrapper[4973]: I0320 14:44:00.173029 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="36698437-ce7d-4a1b-98bc-8ca8e743ace5" containerName="registry-server"
Mar 20 14:44:00 crc kubenswrapper[4973]: E0320 14:44:00.173079 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36698437-ce7d-4a1b-98bc-8ca8e743ace5" containerName="extract-utilities"
Mar 20 14:44:00 crc kubenswrapper[4973]: I0320 14:44:00.173086 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="36698437-ce7d-4a1b-98bc-8ca8e743ace5" containerName="extract-utilities"
Mar 20 14:44:00 crc kubenswrapper[4973]: E0320 14:44:00.173117 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36698437-ce7d-4a1b-98bc-8ca8e743ace5" containerName="extract-content"
Mar 20 14:44:00 crc kubenswrapper[4973]: I0320 14:44:00.173122 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="36698437-ce7d-4a1b-98bc-8ca8e743ace5" containerName="extract-content"
Mar 20 14:44:00 crc kubenswrapper[4973]: I0320 14:44:00.173383 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="36698437-ce7d-4a1b-98bc-8ca8e743ace5" containerName="registry-server"
Mar 20 14:44:00 crc kubenswrapper[4973]: I0320 14:44:00.174277 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566964-hpzvv"
Mar 20 14:44:00 crc kubenswrapper[4973]: I0320 14:44:00.189920 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566964-hpzvv"]
Mar 20 14:44:00 crc kubenswrapper[4973]: I0320 14:44:00.205486 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:44:00 crc kubenswrapper[4973]: I0320 14:44:00.214717 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:44:00 crc kubenswrapper[4973]: I0320 14:44:00.241678 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc"
Mar 20 14:44:00 crc kubenswrapper[4973]: I0320 14:44:00.332559 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57mk5\" (UniqueName: \"kubernetes.io/projected/a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6-kube-api-access-57mk5\") pod \"auto-csr-approver-29566964-hpzvv\" (UID: \"a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6\") " pod="openshift-infra/auto-csr-approver-29566964-hpzvv"
Mar 20 14:44:00 crc kubenswrapper[4973]: I0320 14:44:00.434747 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57mk5\" (UniqueName: \"kubernetes.io/projected/a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6-kube-api-access-57mk5\") pod \"auto-csr-approver-29566964-hpzvv\" (UID: \"a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6\") " pod="openshift-infra/auto-csr-approver-29566964-hpzvv"
Mar 20 14:44:00 crc kubenswrapper[4973]: I0320 14:44:00.458981 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57mk5\" (UniqueName: \"kubernetes.io/projected/a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6-kube-api-access-57mk5\") pod \"auto-csr-approver-29566964-hpzvv\" (UID: \"a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6\") " pod="openshift-infra/auto-csr-approver-29566964-hpzvv"
Mar 20 14:44:00 crc kubenswrapper[4973]: I0320 14:44:00.507114 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566964-hpzvv"
Mar 20 14:44:00 crc kubenswrapper[4973]: I0320 14:44:00.965508 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566964-hpzvv"]
Mar 20 14:44:00 crc kubenswrapper[4973]: W0320 14:44:00.977705 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9dc265a_1fa0_4161_9a5c_0bdeea9ee7e6.slice/crio-ceb5b43a950cac3abe3abfbf7542b28927e2cab84b069a9c81799f3e017824c5 WatchSource:0}: Error finding container ceb5b43a950cac3abe3abfbf7542b28927e2cab84b069a9c81799f3e017824c5: Status 404 returned error can't find the container with id ceb5b43a950cac3abe3abfbf7542b28927e2cab84b069a9c81799f3e017824c5
Mar 20 14:44:01 crc kubenswrapper[4973]: I0320 14:44:01.159277 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566964-hpzvv" event={"ID":"a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6","Type":"ContainerStarted","Data":"ceb5b43a950cac3abe3abfbf7542b28927e2cab84b069a9c81799f3e017824c5"}
Mar 20 14:44:04 crc kubenswrapper[4973]: I0320 14:44:04.208057 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566964-hpzvv" event={"ID":"a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6","Type":"ContainerStarted","Data":"0b784b3d6db72c4fd3d08921f5c105ae53129255e422784907eaab1b9f5125a1"}
Mar 20 14:44:04 crc kubenswrapper[4973]: I0320 14:44:04.223941 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566964-hpzvv" podStartSLOduration=3.354436135 podStartE2EDuration="4.223923414s" podCreationTimestamp="2026-03-20 14:44:00 +0000 UTC" firstStartedPulling="2026-03-20 14:44:00.981094635 +0000 UTC m=+4961.724764379" lastFinishedPulling="2026-03-20 14:44:01.850581914 +0000 UTC m=+4962.594251658" observedRunningTime="2026-03-20 14:44:04.220305445 +0000 UTC m=+4964.963975199" watchObservedRunningTime="2026-03-20 14:44:04.223923414 +0000 UTC m=+4964.967593158"
Mar 20 14:44:06 crc kubenswrapper[4973]: I0320 14:44:06.231896 4973 generic.go:334] "Generic (PLEG): container finished" podID="a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6" containerID="0b784b3d6db72c4fd3d08921f5c105ae53129255e422784907eaab1b9f5125a1" exitCode=0
Mar 20 14:44:06 crc kubenswrapper[4973]: I0320 14:44:06.232157 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566964-hpzvv" event={"ID":"a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6","Type":"ContainerDied","Data":"0b784b3d6db72c4fd3d08921f5c105ae53129255e422784907eaab1b9f5125a1"}
Mar 20 14:44:07 crc kubenswrapper[4973]: I0320 14:44:07.716424 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566964-hpzvv"
Mar 20 14:44:07 crc kubenswrapper[4973]: I0320 14:44:07.844439 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-57mk5\" (UniqueName: \"kubernetes.io/projected/a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6-kube-api-access-57mk5\") pod \"a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6\" (UID: \"a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6\") "
Mar 20 14:44:07 crc kubenswrapper[4973]: I0320 14:44:07.866586 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6-kube-api-access-57mk5" (OuterVolumeSpecName: "kube-api-access-57mk5") pod "a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6" (UID: "a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6"). InnerVolumeSpecName "kube-api-access-57mk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:44:07 crc kubenswrapper[4973]: I0320 14:44:07.948332 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-57mk5\" (UniqueName: \"kubernetes.io/projected/a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6-kube-api-access-57mk5\") on node \"crc\" DevicePath \"\""
Mar 20 14:44:08 crc kubenswrapper[4973]: I0320 14:44:08.262282 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566964-hpzvv" event={"ID":"a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6","Type":"ContainerDied","Data":"ceb5b43a950cac3abe3abfbf7542b28927e2cab84b069a9c81799f3e017824c5"}
Mar 20 14:44:08 crc kubenswrapper[4973]: I0320 14:44:08.262369 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ceb5b43a950cac3abe3abfbf7542b28927e2cab84b069a9c81799f3e017824c5"
Mar 20 14:44:08 crc kubenswrapper[4973]: I0320 14:44:08.262380 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566964-hpzvv"
Mar 20 14:44:08 crc kubenswrapper[4973]: I0320 14:44:08.800560 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566958-pszgn"]
Mar 20 14:44:08 crc kubenswrapper[4973]: I0320 14:44:08.813334 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566958-pszgn"]
Mar 20 14:44:09 crc kubenswrapper[4973]: I0320 14:44:09.966584 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1591f03-5f44-4c7e-8e66-5c33919112f5" path="/var/lib/kubelet/pods/d1591f03-5f44-4c7e-8e66-5c33919112f5/volumes"
Mar 20 14:44:20 crc kubenswrapper[4973]: I0320 14:44:20.397202 4973 generic.go:334] "Generic (PLEG): container finished" podID="b5edf151-174a-4c18-b733-318653db1c6e" containerID="fe8b8eb04a04362d2f5f7d4c168a3ae188ac63e780f21c34b9cabddcd2073d2a" exitCode=1
Mar 20 14:44:20 crc kubenswrapper[4973]: I0320 14:44:20.397294 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b5edf151-174a-4c18-b733-318653db1c6e","Type":"ContainerDied","Data":"fe8b8eb04a04362d2f5f7d4c168a3ae188ac63e780f21c34b9cabddcd2073d2a"}
Mar 20 14:44:21 crc kubenswrapper[4973]: I0320 14:44:21.847593 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 20 14:44:21 crc kubenswrapper[4973]: I0320 14:44:21.921907 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"b5edf151-174a-4c18-b733-318653db1c6e\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") "
Mar 20 14:44:21 crc kubenswrapper[4973]: I0320 14:44:21.921962 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcqr9\" (UniqueName: \"kubernetes.io/projected/b5edf151-174a-4c18-b733-318653db1c6e-kube-api-access-bcqr9\") pod \"b5edf151-174a-4c18-b733-318653db1c6e\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") "
Mar 20 14:44:21 crc kubenswrapper[4973]: I0320 14:44:21.922033 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b5edf151-174a-4c18-b733-318653db1c6e-openstack-config-secret\") pod \"b5edf151-174a-4c18-b733-318653db1c6e\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") "
Mar 20 14:44:21 crc kubenswrapper[4973]: I0320 14:44:21.922100 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b5edf151-174a-4c18-b733-318653db1c6e-test-operator-ephemeral-temporary\") pod \"b5edf151-174a-4c18-b733-318653db1c6e\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") "
Mar 20 14:44:21 crc kubenswrapper[4973]: I0320 14:44:21.922129 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b5edf151-174a-4c18-b733-318653db1c6e-openstack-config\") pod \"b5edf151-174a-4c18-b733-318653db1c6e\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") "
Mar 20 14:44:21 crc kubenswrapper[4973]: I0320 14:44:21.922179 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b5edf151-174a-4c18-b733-318653db1c6e-test-operator-ephemeral-workdir\") pod \"b5edf151-174a-4c18-b733-318653db1c6e\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") "
Mar 20 14:44:21 crc kubenswrapper[4973]: I0320 14:44:21.922713 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5edf151-174a-4c18-b733-318653db1c6e-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "b5edf151-174a-4c18-b733-318653db1c6e" (UID: "b5edf151-174a-4c18-b733-318653db1c6e"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:44:21 crc kubenswrapper[4973]: I0320 14:44:21.928880 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5edf151-174a-4c18-b733-318653db1c6e-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "b5edf151-174a-4c18-b733-318653db1c6e" (UID: "b5edf151-174a-4c18-b733-318653db1c6e"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:44:21 crc kubenswrapper[4973]: I0320 14:44:21.932910 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5edf151-174a-4c18-b733-318653db1c6e-kube-api-access-bcqr9" (OuterVolumeSpecName: "kube-api-access-bcqr9") pod "b5edf151-174a-4c18-b733-318653db1c6e" (UID: "b5edf151-174a-4c18-b733-318653db1c6e"). InnerVolumeSpecName "kube-api-access-bcqr9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:44:21 crc kubenswrapper[4973]: I0320 14:44:21.932917 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "test-operator-logs") pod "b5edf151-174a-4c18-b733-318653db1c6e" (UID: "b5edf151-174a-4c18-b733-318653db1c6e"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Mar 20 14:44:21 crc kubenswrapper[4973]: I0320 14:44:21.965201 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5edf151-174a-4c18-b733-318653db1c6e-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "b5edf151-174a-4c18-b733-318653db1c6e" (UID: "b5edf151-174a-4c18-b733-318653db1c6e"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.002010 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5edf151-174a-4c18-b733-318653db1c6e-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "b5edf151-174a-4c18-b733-318653db1c6e" (UID: "b5edf151-174a-4c18-b733-318653db1c6e"). InnerVolumeSpecName "openstack-config".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.025232 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5edf151-174a-4c18-b733-318653db1c6e-config-data\") pod \"b5edf151-174a-4c18-b733-318653db1c6e\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") "
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.025439 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5edf151-174a-4c18-b733-318653db1c6e-ssh-key\") pod \"b5edf151-174a-4c18-b733-318653db1c6e\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") "
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.025482 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b5edf151-174a-4c18-b733-318653db1c6e-ca-certs\") pod \"b5edf151-174a-4c18-b733-318653db1c6e\" (UID: \"b5edf151-174a-4c18-b733-318653db1c6e\") "
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.026176 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcqr9\" (UniqueName: \"kubernetes.io/projected/b5edf151-174a-4c18-b733-318653db1c6e-kube-api-access-bcqr9\") on node \"crc\" DevicePath \"\""
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.026873 4973 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" "
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.026893 4973 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/b5edf151-174a-4c18-b733-318653db1c6e-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.026903 4973 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/b5edf151-174a-4c18-b733-318653db1c6e-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\""
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.026915 4973 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/b5edf151-174a-4c18-b733-318653db1c6e-openstack-config\") on node \"crc\" DevicePath \"\""
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.026927 4973 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/b5edf151-174a-4c18-b733-318653db1c6e-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\""
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.028476 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b5edf151-174a-4c18-b733-318653db1c6e-config-data" (OuterVolumeSpecName: "config-data") pod "b5edf151-174a-4c18-b733-318653db1c6e" (UID: "b5edf151-174a-4c18-b733-318653db1c6e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.055006 4973 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc"
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.058617 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5edf151-174a-4c18-b733-318653db1c6e-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "b5edf151-174a-4c18-b733-318653db1c6e" (UID: "b5edf151-174a-4c18-b733-318653db1c6e"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.059474 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5edf151-174a-4c18-b733-318653db1c6e-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b5edf151-174a-4c18-b733-318653db1c6e" (UID: "b5edf151-174a-4c18-b733-318653db1c6e"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.128459 4973 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b5edf151-174a-4c18-b733-318653db1c6e-ssh-key\") on node \"crc\" DevicePath \"\""
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.128528 4973 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/b5edf151-174a-4c18-b733-318653db1c6e-ca-certs\") on node \"crc\" DevicePath \"\""
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.128547 4973 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\""
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.128566 4973 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b5edf151-174a-4c18-b733-318653db1c6e-config-data\") on node \"crc\" DevicePath \"\""
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.431501 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"b5edf151-174a-4c18-b733-318653db1c6e","Type":"ContainerDied","Data":"776579fb9892ea1c70855e496750a6c3a5bbd050e2bb9a829bd74ed9dc0c74e8"}
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.432109 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="776579fb9892ea1c70855e496750a6c3a5bbd050e2bb9a829bd74ed9dc0c74e8"
Mar 20 14:44:22 crc kubenswrapper[4973]: I0320 14:44:22.431621 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest"
Mar 20 14:44:34 crc kubenswrapper[4973]: I0320 14:44:34.074749 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 20 14:44:34 crc kubenswrapper[4973]: E0320 14:44:34.075948 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5edf151-174a-4c18-b733-318653db1c6e" containerName="tempest-tests-tempest-tests-runner"
Mar 20 14:44:34 crc kubenswrapper[4973]: I0320 14:44:34.075967 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5edf151-174a-4c18-b733-318653db1c6e" containerName="tempest-tests-tempest-tests-runner"
Mar 20 14:44:34 crc kubenswrapper[4973]: E0320 14:44:34.076014 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6" containerName="oc"
Mar 20 14:44:34 crc kubenswrapper[4973]: I0320 14:44:34.076023 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6" containerName="oc"
Mar 20 14:44:34 crc kubenswrapper[4973]: I0320 14:44:34.076294 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6" containerName="oc"
Mar 20 14:44:34 crc kubenswrapper[4973]: I0320 14:44:34.076327 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5edf151-174a-4c18-b733-318653db1c6e" containerName="tempest-tests-tempest-tests-runner"
Mar 20 14:44:34 crc kubenswrapper[4973]: I0320 14:44:34.080509 4973 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 14:44:34 crc kubenswrapper[4973]: I0320 14:44:34.084142 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-dfv6w"
Mar 20 14:44:34 crc kubenswrapper[4973]: I0320 14:44:34.090743 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 20 14:44:34 crc kubenswrapper[4973]: I0320 14:44:34.163396 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"66048617-923b-4595-bc01-4fedc0092948\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 14:44:34 crc kubenswrapper[4973]: I0320 14:44:34.163495 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qtj5\" (UniqueName: \"kubernetes.io/projected/66048617-923b-4595-bc01-4fedc0092948-kube-api-access-9qtj5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"66048617-923b-4595-bc01-4fedc0092948\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 14:44:34 crc kubenswrapper[4973]: I0320 14:44:34.266189 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"66048617-923b-4595-bc01-4fedc0092948\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 14:44:34 crc kubenswrapper[4973]: I0320 14:44:34.266284 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qtj5\" (UniqueName: \"kubernetes.io/projected/66048617-923b-4595-bc01-4fedc0092948-kube-api-access-9qtj5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"66048617-923b-4595-bc01-4fedc0092948\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 14:44:34 crc kubenswrapper[4973]: I0320 14:44:34.267554 4973 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"66048617-923b-4595-bc01-4fedc0092948\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 14:44:34 crc kubenswrapper[4973]: I0320 14:44:34.286229 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qtj5\" (UniqueName: \"kubernetes.io/projected/66048617-923b-4595-bc01-4fedc0092948-kube-api-access-9qtj5\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"66048617-923b-4595-bc01-4fedc0092948\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 14:44:34 crc kubenswrapper[4973]: I0320 14:44:34.300520 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"66048617-923b-4595-bc01-4fedc0092948\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 14:44:34 crc kubenswrapper[4973]: I0320 14:44:34.405268 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"
Mar 20 14:44:34 crc kubenswrapper[4973]: I0320 14:44:34.862489 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"]
Mar 20 14:44:35 crc kubenswrapper[4973]: I0320 14:44:35.582909 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"66048617-923b-4595-bc01-4fedc0092948","Type":"ContainerStarted","Data":"835e76d1e1738cd486aa396badf8ae7dc30cfbb8fc70915d86dfc18d95e0c0b1"}
Mar 20 14:44:37 crc kubenswrapper[4973]: I0320 14:44:37.608911 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"66048617-923b-4595-bc01-4fedc0092948","Type":"ContainerStarted","Data":"89d4d0c1f606a540fdd9c743edc36926b5f0844f68054beec53ac653e0d2e421"}
Mar 20 14:44:37 crc kubenswrapper[4973]: I0320 14:44:37.632456 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.49148562 podStartE2EDuration="3.632426016s" podCreationTimestamp="2026-03-20 14:44:34 +0000 UTC" firstStartedPulling="2026-03-20 14:44:35.195912912 +0000 UTC m=+4995.939582656" lastFinishedPulling="2026-03-20 14:44:36.336853308 +0000 UTC m=+4997.080523052" observedRunningTime="2026-03-20 14:44:37.622024891 +0000 UTC m=+4998.365694625" watchObservedRunningTime="2026-03-20 14:44:37.632426016 +0000 UTC m=+4998.376095760"
Mar 20 14:44:57 crc kubenswrapper[4973]: I0320 14:44:57.852733 4973 generic.go:334] "Generic (PLEG): container finished" podID="57c75085-cf04-49d6-8b97-902e00c0efd5" containerID="da6003725dac38353c12d436c1cdfb7e598b608c8da0d7de51706cb392e36cf6" exitCode=0
Mar 20 14:44:57 crc kubenswrapper[4973]: I0320 14:44:57.852799 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" event={"ID":"57c75085-cf04-49d6-8b97-902e00c0efd5","Type":"ContainerDied","Data":"da6003725dac38353c12d436c1cdfb7e598b608c8da0d7de51706cb392e36cf6"}
Mar 20 14:44:57 crc kubenswrapper[4973]: I0320 14:44:57.853255 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" event={"ID":"57c75085-cf04-49d6-8b97-902e00c0efd5","Type":"ContainerStarted","Data":"f15417282a2f213a94b807738daca9b81c74b40351939862d61db01c6ffee62a"}
Mar 20 14:45:00 crc kubenswrapper[4973]: I0320 14:45:00.161303 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566965-5jgs6"]
Mar 20 14:45:00 crc kubenswrapper[4973]: I0320 14:45:00.163944 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5jgs6"
Mar 20 14:45:00 crc kubenswrapper[4973]: I0320 14:45:00.166914 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 20 14:45:00 crc kubenswrapper[4973]: I0320 14:45:00.167236 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 20 14:45:00 crc kubenswrapper[4973]: I0320 14:45:00.173797 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566965-5jgs6"]
Mar 20 14:45:00 crc kubenswrapper[4973]: I0320 14:45:00.256248 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcc22ace-18ff-4b6a-aa48-04ab3d8736c2-config-volume\") pod \"collect-profiles-29566965-5jgs6\" (UID: \"bcc22ace-18ff-4b6a-aa48-04ab3d8736c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5jgs6"
Mar 20 14:45:00 crc
kubenswrapper[4973]: I0320 14:45:00.256487 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcc22ace-18ff-4b6a-aa48-04ab3d8736c2-secret-volume\") pod \"collect-profiles-29566965-5jgs6\" (UID: \"bcc22ace-18ff-4b6a-aa48-04ab3d8736c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5jgs6"
Mar 20 14:45:00 crc kubenswrapper[4973]: I0320 14:45:00.257037 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfjbp\" (UniqueName: \"kubernetes.io/projected/bcc22ace-18ff-4b6a-aa48-04ab3d8736c2-kube-api-access-dfjbp\") pod \"collect-profiles-29566965-5jgs6\" (UID: \"bcc22ace-18ff-4b6a-aa48-04ab3d8736c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5jgs6"
Mar 20 14:45:00 crc kubenswrapper[4973]: I0320 14:45:00.359515 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfjbp\" (UniqueName: \"kubernetes.io/projected/bcc22ace-18ff-4b6a-aa48-04ab3d8736c2-kube-api-access-dfjbp\") pod \"collect-profiles-29566965-5jgs6\" (UID: \"bcc22ace-18ff-4b6a-aa48-04ab3d8736c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5jgs6"
Mar 20 14:45:00 crc kubenswrapper[4973]: I0320 14:45:00.359989 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcc22ace-18ff-4b6a-aa48-04ab3d8736c2-config-volume\") pod \"collect-profiles-29566965-5jgs6\" (UID: \"bcc22ace-18ff-4b6a-aa48-04ab3d8736c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5jgs6"
Mar 20 14:45:00 crc kubenswrapper[4973]: I0320 14:45:00.360908 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcc22ace-18ff-4b6a-aa48-04ab3d8736c2-config-volume\") pod \"collect-profiles-29566965-5jgs6\" (UID: \"bcc22ace-18ff-4b6a-aa48-04ab3d8736c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5jgs6"
Mar 20 14:45:00 crc kubenswrapper[4973]: I0320 14:45:00.360990 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcc22ace-18ff-4b6a-aa48-04ab3d8736c2-secret-volume\") pod \"collect-profiles-29566965-5jgs6\" (UID: \"bcc22ace-18ff-4b6a-aa48-04ab3d8736c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5jgs6"
Mar 20 14:45:00 crc kubenswrapper[4973]: I0320 14:45:00.368503 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcc22ace-18ff-4b6a-aa48-04ab3d8736c2-secret-volume\") pod \"collect-profiles-29566965-5jgs6\" (UID: \"bcc22ace-18ff-4b6a-aa48-04ab3d8736c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5jgs6"
Mar 20 14:45:00 crc kubenswrapper[4973]: I0320 14:45:00.376705 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfjbp\" (UniqueName: \"kubernetes.io/projected/bcc22ace-18ff-4b6a-aa48-04ab3d8736c2-kube-api-access-dfjbp\") pod \"collect-profiles-29566965-5jgs6\" (UID: \"bcc22ace-18ff-4b6a-aa48-04ab3d8736c2\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5jgs6"
Mar 20 14:45:00 crc kubenswrapper[4973]: I0320 14:45:00.490257 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5jgs6"
Mar 20 14:45:00 crc kubenswrapper[4973]: I0320 14:45:00.970588 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566965-5jgs6"]
Mar 20 14:45:00 crc kubenswrapper[4973]: W0320 14:45:00.977361 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcc22ace_18ff_4b6a_aa48_04ab3d8736c2.slice/crio-f3cdc058e3b25402d570a9281d0f8b1650084852598a1e563b896a06340eb9f4 WatchSource:0}: Error finding container f3cdc058e3b25402d570a9281d0f8b1650084852598a1e563b896a06340eb9f4: Status 404 returned error can't find the container with id f3cdc058e3b25402d570a9281d0f8b1650084852598a1e563b896a06340eb9f4
Mar 20 14:45:01 crc kubenswrapper[4973]: I0320 14:45:01.898980 4973 generic.go:334] "Generic (PLEG): container finished" podID="bcc22ace-18ff-4b6a-aa48-04ab3d8736c2" containerID="67a7c88f10d4c30a2425b5a38c823e9cfa5b5c7c3d1c28947b224e074468d72d" exitCode=0
Mar 20 14:45:01 crc kubenswrapper[4973]: I0320 14:45:01.899032 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5jgs6" event={"ID":"bcc22ace-18ff-4b6a-aa48-04ab3d8736c2","Type":"ContainerDied","Data":"67a7c88f10d4c30a2425b5a38c823e9cfa5b5c7c3d1c28947b224e074468d72d"}
Mar 20 14:45:01 crc kubenswrapper[4973]: I0320 14:45:01.899656 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5jgs6" event={"ID":"bcc22ace-18ff-4b6a-aa48-04ab3d8736c2","Type":"ContainerStarted","Data":"f3cdc058e3b25402d570a9281d0f8b1650084852598a1e563b896a06340eb9f4"}
Mar 20 14:45:03 crc kubenswrapper[4973]: I0320 14:45:03.322392 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5jgs6"
Mar 20 14:45:03 crc kubenswrapper[4973]: I0320 14:45:03.429972 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcc22ace-18ff-4b6a-aa48-04ab3d8736c2-config-volume\") pod \"bcc22ace-18ff-4b6a-aa48-04ab3d8736c2\" (UID: \"bcc22ace-18ff-4b6a-aa48-04ab3d8736c2\") "
Mar 20 14:45:03 crc kubenswrapper[4973]: I0320 14:45:03.430229 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcc22ace-18ff-4b6a-aa48-04ab3d8736c2-secret-volume\") pod \"bcc22ace-18ff-4b6a-aa48-04ab3d8736c2\" (UID: \"bcc22ace-18ff-4b6a-aa48-04ab3d8736c2\") "
Mar 20 14:45:03 crc kubenswrapper[4973]: I0320 14:45:03.430350 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfjbp\" (UniqueName: \"kubernetes.io/projected/bcc22ace-18ff-4b6a-aa48-04ab3d8736c2-kube-api-access-dfjbp\") pod \"bcc22ace-18ff-4b6a-aa48-04ab3d8736c2\" (UID: \"bcc22ace-18ff-4b6a-aa48-04ab3d8736c2\") "
Mar 20 14:45:03 crc kubenswrapper[4973]: I0320 14:45:03.432719 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bcc22ace-18ff-4b6a-aa48-04ab3d8736c2-config-volume" (OuterVolumeSpecName: "config-volume") pod "bcc22ace-18ff-4b6a-aa48-04ab3d8736c2" (UID: "bcc22ace-18ff-4b6a-aa48-04ab3d8736c2"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 14:45:03 crc kubenswrapper[4973]: I0320 14:45:03.438634 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcc22ace-18ff-4b6a-aa48-04ab3d8736c2-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "bcc22ace-18ff-4b6a-aa48-04ab3d8736c2" (UID: "bcc22ace-18ff-4b6a-aa48-04ab3d8736c2"). InnerVolumeSpecName "secret-volume".
PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 14:45:03 crc kubenswrapper[4973]: I0320 14:45:03.438689 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcc22ace-18ff-4b6a-aa48-04ab3d8736c2-kube-api-access-dfjbp" (OuterVolumeSpecName: "kube-api-access-dfjbp") pod "bcc22ace-18ff-4b6a-aa48-04ab3d8736c2" (UID: "bcc22ace-18ff-4b6a-aa48-04ab3d8736c2"). InnerVolumeSpecName "kube-api-access-dfjbp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:45:03 crc kubenswrapper[4973]: I0320 14:45:03.534224 4973 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/bcc22ace-18ff-4b6a-aa48-04ab3d8736c2-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 14:45:03 crc kubenswrapper[4973]: I0320 14:45:03.534269 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfjbp\" (UniqueName: \"kubernetes.io/projected/bcc22ace-18ff-4b6a-aa48-04ab3d8736c2-kube-api-access-dfjbp\") on node \"crc\" DevicePath \"\""
Mar 20 14:45:03 crc kubenswrapper[4973]: I0320 14:45:03.534283 4973 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/bcc22ace-18ff-4b6a-aa48-04ab3d8736c2-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 14:45:03 crc kubenswrapper[4973]: I0320 14:45:03.928759 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5jgs6" event={"ID":"bcc22ace-18ff-4b6a-aa48-04ab3d8736c2","Type":"ContainerDied","Data":"f3cdc058e3b25402d570a9281d0f8b1650084852598a1e563b896a06340eb9f4"}
Mar 20 14:45:03 crc kubenswrapper[4973]: I0320 14:45:03.929143 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3cdc058e3b25402d570a9281d0f8b1650084852598a1e563b896a06340eb9f4"
Mar 20 14:45:03 crc kubenswrapper[4973]: I0320 14:45:03.929237 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566965-5jgs6"
Mar 20 14:45:04 crc kubenswrapper[4973]: I0320 14:45:04.412212 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt"]
Mar 20 14:45:04 crc kubenswrapper[4973]: I0320 14:45:04.423475 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566920-hh7rt"]
Mar 20 14:45:05 crc kubenswrapper[4973]: I0320 14:45:05.544034 4973 scope.go:117] "RemoveContainer" containerID="117efc89b5f0f30b1ac189640ce7b44a1fe082c1847c491a3eafd4dae9876787"
Mar 20 14:45:05 crc kubenswrapper[4973]: I0320 14:45:05.845284 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj"
Mar 20 14:45:05 crc kubenswrapper[4973]: I0320 14:45:05.845671 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj"
Mar 20 14:45:05 crc kubenswrapper[4973]: I0320 14:45:05.964885 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb0d8010-577f-4749-a2b2-d0cf211ca0ec" path="/var/lib/kubelet/pods/eb0d8010-577f-4749-a2b2-d0cf211ca0ec/volumes"
Mar 20 14:45:20 crc kubenswrapper[4973]: I0320 14:45:20.541029 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qxmg4/must-gather-b9cmm"]
Mar 20 14:45:20 crc kubenswrapper[4973]: E0320 14:45:20.543089 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcc22ace-18ff-4b6a-aa48-04ab3d8736c2" containerName="collect-profiles"
Mar 20 14:45:20 crc kubenswrapper[4973]: I0320 14:45:20.543111 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcc22ace-18ff-4b6a-aa48-04ab3d8736c2" containerName="collect-profiles"
Mar 20 14:45:20 crc kubenswrapper[4973]: I0320 14:45:20.543546 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcc22ace-18ff-4b6a-aa48-04ab3d8736c2" containerName="collect-profiles"
Mar 20 14:45:20 crc kubenswrapper[4973]: I0320 14:45:20.546062 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qxmg4/must-gather-b9cmm"
Mar 20 14:45:20 crc kubenswrapper[4973]: I0320 14:45:20.548742 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qxmg4"/"openshift-service-ca.crt"
Mar 20 14:45:20 crc kubenswrapper[4973]: I0320 14:45:20.549089 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-qxmg4"/"kube-root-ca.crt"
Mar 20 14:45:20 crc kubenswrapper[4973]: I0320 14:45:20.549782 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-qxmg4"/"default-dockercfg-7lcvz"
Mar 20 14:45:20 crc kubenswrapper[4973]: I0320 14:45:20.565100 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qxmg4/must-gather-b9cmm"]
Mar 20 14:45:20 crc kubenswrapper[4973]: I0320 14:45:20.670547 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcxsl\" (UniqueName: \"kubernetes.io/projected/f5c0588a-6f96-4145-84f6-488007b3b05a-kube-api-access-dcxsl\") pod \"must-gather-b9cmm\" (UID: \"f5c0588a-6f96-4145-84f6-488007b3b05a\") " pod="openshift-must-gather-qxmg4/must-gather-b9cmm"
Mar 20 14:45:20 crc kubenswrapper[4973]: I0320 14:45:20.671216 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f5c0588a-6f96-4145-84f6-488007b3b05a-must-gather-output\") pod \"must-gather-b9cmm\" (UID: \"f5c0588a-6f96-4145-84f6-488007b3b05a\") " pod="openshift-must-gather-qxmg4/must-gather-b9cmm"
Mar 20 14:45:20 crc kubenswrapper[4973]: I0320 14:45:20.773558 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcxsl\" (UniqueName: \"kubernetes.io/projected/f5c0588a-6f96-4145-84f6-488007b3b05a-kube-api-access-dcxsl\") pod \"must-gather-b9cmm\" (UID: \"f5c0588a-6f96-4145-84f6-488007b3b05a\") " pod="openshift-must-gather-qxmg4/must-gather-b9cmm"
Mar 20 14:45:20 crc kubenswrapper[4973]: I0320 14:45:20.773694 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f5c0588a-6f96-4145-84f6-488007b3b05a-must-gather-output\") pod \"must-gather-b9cmm\" (UID: \"f5c0588a-6f96-4145-84f6-488007b3b05a\") " pod="openshift-must-gather-qxmg4/must-gather-b9cmm"
Mar 20 14:45:20 crc kubenswrapper[4973]: I0320 14:45:20.774111 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f5c0588a-6f96-4145-84f6-488007b3b05a-must-gather-output\") pod \"must-gather-b9cmm\" (UID: \"f5c0588a-6f96-4145-84f6-488007b3b05a\") " pod="openshift-must-gather-qxmg4/must-gather-b9cmm"
Mar 20 14:45:21 crc kubenswrapper[4973]: I0320 14:45:21.384086 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcxsl\" (UniqueName: \"kubernetes.io/projected/f5c0588a-6f96-4145-84f6-488007b3b05a-kube-api-access-dcxsl\") pod \"must-gather-b9cmm\" (UID: \"f5c0588a-6f96-4145-84f6-488007b3b05a\") " pod="openshift-must-gather-qxmg4/must-gather-b9cmm"
Mar 20 14:45:21 crc kubenswrapper[4973]: I0320 14:45:21.474162 4973 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-qxmg4/must-gather-b9cmm" Mar 20 14:45:22 crc kubenswrapper[4973]: I0320 14:45:22.013150 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qxmg4/must-gather-b9cmm"] Mar 20 14:45:22 crc kubenswrapper[4973]: W0320 14:45:22.014826 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5c0588a_6f96_4145_84f6_488007b3b05a.slice/crio-df9ca591d5988c0b8783523535dff341a38f2de3eedb029eea2f37868a95f209 WatchSource:0}: Error finding container df9ca591d5988c0b8783523535dff341a38f2de3eedb029eea2f37868a95f209: Status 404 returned error can't find the container with id df9ca591d5988c0b8783523535dff341a38f2de3eedb029eea2f37868a95f209 Mar 20 14:45:22 crc kubenswrapper[4973]: I0320 14:45:22.142503 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qxmg4/must-gather-b9cmm" event={"ID":"f5c0588a-6f96-4145-84f6-488007b3b05a","Type":"ContainerStarted","Data":"df9ca591d5988c0b8783523535dff341a38f2de3eedb029eea2f37868a95f209"} Mar 20 14:45:25 crc kubenswrapper[4973]: I0320 14:45:25.851623 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 14:45:25 crc kubenswrapper[4973]: I0320 14:45:25.855897 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-55f4d8dbbb-bmckj" Mar 20 14:45:28 crc kubenswrapper[4973]: I0320 14:45:28.246022 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qxmg4/must-gather-b9cmm" event={"ID":"f5c0588a-6f96-4145-84f6-488007b3b05a","Type":"ContainerStarted","Data":"096b5d7e17d7dd4db535d894e23efafb3f3e670c12ffeeec84b0df45185df7c3"} Mar 20 14:45:28 crc kubenswrapper[4973]: I0320 14:45:28.246687 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qxmg4/must-gather-b9cmm" 
event={"ID":"f5c0588a-6f96-4145-84f6-488007b3b05a","Type":"ContainerStarted","Data":"1142170c94b4fa736ea77dca698b5e88f957482bf3ea318150cad610c6fd18a5"} Mar 20 14:45:28 crc kubenswrapper[4973]: I0320 14:45:28.275628 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qxmg4/must-gather-b9cmm" podStartSLOduration=2.917434458 podStartE2EDuration="8.275603985s" podCreationTimestamp="2026-03-20 14:45:20 +0000 UTC" firstStartedPulling="2026-03-20 14:45:22.018042003 +0000 UTC m=+5042.761711747" lastFinishedPulling="2026-03-20 14:45:27.37621153 +0000 UTC m=+5048.119881274" observedRunningTime="2026-03-20 14:45:28.272074828 +0000 UTC m=+5049.015744582" watchObservedRunningTime="2026-03-20 14:45:28.275603985 +0000 UTC m=+5049.019273729" Mar 20 14:45:33 crc kubenswrapper[4973]: E0320 14:45:33.783769 4973 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.75:51086->38.102.83.75:38041: write tcp 38.102.83.75:51086->38.102.83.75:38041: write: connection reset by peer Mar 20 14:45:35 crc kubenswrapper[4973]: I0320 14:45:35.914425 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qxmg4/crc-debug-qp6fv"] Mar 20 14:45:35 crc kubenswrapper[4973]: I0320 14:45:35.916782 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qxmg4/crc-debug-qp6fv" Mar 20 14:45:35 crc kubenswrapper[4973]: I0320 14:45:35.988942 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2f3cd89-268c-4e05-9068-0e624a831c0a-host\") pod \"crc-debug-qp6fv\" (UID: \"e2f3cd89-268c-4e05-9068-0e624a831c0a\") " pod="openshift-must-gather-qxmg4/crc-debug-qp6fv" Mar 20 14:45:35 crc kubenswrapper[4973]: I0320 14:45:35.989910 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7lpr\" (UniqueName: \"kubernetes.io/projected/e2f3cd89-268c-4e05-9068-0e624a831c0a-kube-api-access-g7lpr\") pod \"crc-debug-qp6fv\" (UID: \"e2f3cd89-268c-4e05-9068-0e624a831c0a\") " pod="openshift-must-gather-qxmg4/crc-debug-qp6fv" Mar 20 14:45:36 crc kubenswrapper[4973]: I0320 14:45:36.092398 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7lpr\" (UniqueName: \"kubernetes.io/projected/e2f3cd89-268c-4e05-9068-0e624a831c0a-kube-api-access-g7lpr\") pod \"crc-debug-qp6fv\" (UID: \"e2f3cd89-268c-4e05-9068-0e624a831c0a\") " pod="openshift-must-gather-qxmg4/crc-debug-qp6fv" Mar 20 14:45:36 crc kubenswrapper[4973]: I0320 14:45:36.092556 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2f3cd89-268c-4e05-9068-0e624a831c0a-host\") pod \"crc-debug-qp6fv\" (UID: \"e2f3cd89-268c-4e05-9068-0e624a831c0a\") " pod="openshift-must-gather-qxmg4/crc-debug-qp6fv" Mar 20 14:45:36 crc kubenswrapper[4973]: I0320 14:45:36.092796 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2f3cd89-268c-4e05-9068-0e624a831c0a-host\") pod \"crc-debug-qp6fv\" (UID: \"e2f3cd89-268c-4e05-9068-0e624a831c0a\") " pod="openshift-must-gather-qxmg4/crc-debug-qp6fv" Mar 20 14:45:36 crc 
kubenswrapper[4973]: I0320 14:45:36.135182 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7lpr\" (UniqueName: \"kubernetes.io/projected/e2f3cd89-268c-4e05-9068-0e624a831c0a-kube-api-access-g7lpr\") pod \"crc-debug-qp6fv\" (UID: \"e2f3cd89-268c-4e05-9068-0e624a831c0a\") " pod="openshift-must-gather-qxmg4/crc-debug-qp6fv" Mar 20 14:45:36 crc kubenswrapper[4973]: I0320 14:45:36.239184 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qxmg4/crc-debug-qp6fv" Mar 20 14:45:36 crc kubenswrapper[4973]: I0320 14:45:36.354002 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qxmg4/crc-debug-qp6fv" event={"ID":"e2f3cd89-268c-4e05-9068-0e624a831c0a","Type":"ContainerStarted","Data":"ca4736bc7ab7e079df31477cc2350d175d88d198357c63f538b1dd0959bd7636"} Mar 20 14:45:43 crc kubenswrapper[4973]: I0320 14:45:43.320532 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:45:43 crc kubenswrapper[4973]: I0320 14:45:43.321168 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:45:53 crc kubenswrapper[4973]: E0320 14:45:53.011424 4973 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296" Mar 20 14:45:53 crc kubenswrapper[4973]: E0320 
14:45:53.015015 4973 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:container-00,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296,Command:[chroot /host bash -c echo 'TOOLBOX_NAME=toolbox-osp' > /root/.toolboxrc ; rm -rf \"/var/tmp/sos-osp\" && mkdir -p \"/var/tmp/sos-osp\" && sudo podman rm --force toolbox-osp; sudo --preserve-env podman pull --authfile /var/lib/kubelet/config.json registry.redhat.io/rhel9/support-tools && toolbox sos report --batch --all-logs --only-plugins block,cifs,crio,devicemapper,devices,firewall_tables,firewalld,iscsi,lvm2,memory,multipath,nfs,nis,nvme,podman,process,processor,selinux,scsi,udev,logs,crypto --tmp-dir=\"/var/tmp/sos-osp\" && if [[ \"$(ls /var/log/pods/*/{*.log.*,*/*.log.*} 2>/dev/null)\" != '' ]]; then tar --ignore-failed-read --warning=no-file-changed -cJf \"/var/tmp/sos-osp/podlogs.tar.xz\" --transform 's,^,podlogs/,' /var/log/pods/*/{*.log.*,*/*.log.*} || true; 
fi],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:TMOUT,Value:900,ValueFrom:nil,},EnvVar{Name:HOST,Value:/host,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host,ReadOnly:false,MountPath:/host,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g7lpr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod crc-debug-qp6fv_openshift-must-gather-qxmg4(e2f3cd89-268c-4e05-9068-0e624a831c0a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 14:45:53 crc kubenswrapper[4973]: E0320 14:45:53.017097 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-must-gather-qxmg4/crc-debug-qp6fv" podUID="e2f3cd89-268c-4e05-9068-0e624a831c0a" Mar 20 14:45:53 crc kubenswrapper[4973]: E0320 14:45:53.676938 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"container-00\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6ab858aed98e4fe57e6b144da8e90ad5d6698bb4cc5521206f5c05809f0f9296\\\"\"" pod="openshift-must-gather-qxmg4/crc-debug-qp6fv" podUID="e2f3cd89-268c-4e05-9068-0e624a831c0a" Mar 20 14:46:00 crc kubenswrapper[4973]: I0320 14:46:00.145092 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566966-s9z2v"] Mar 20 14:46:00 crc kubenswrapper[4973]: I0320 14:46:00.147579 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566966-s9z2v" Mar 20 14:46:00 crc kubenswrapper[4973]: I0320 14:46:00.150017 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:46:00 crc kubenswrapper[4973]: I0320 14:46:00.151233 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:46:00 crc kubenswrapper[4973]: I0320 14:46:00.152433 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:46:00 crc kubenswrapper[4973]: I0320 14:46:00.158864 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566966-s9z2v"] Mar 20 14:46:00 crc kubenswrapper[4973]: I0320 14:46:00.253818 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmk5r\" (UniqueName: \"kubernetes.io/projected/288970b0-4121-4ffc-8fdf-92911f5ab464-kube-api-access-tmk5r\") pod \"auto-csr-approver-29566966-s9z2v\" (UID: \"288970b0-4121-4ffc-8fdf-92911f5ab464\") " pod="openshift-infra/auto-csr-approver-29566966-s9z2v" Mar 20 14:46:00 crc kubenswrapper[4973]: I0320 14:46:00.356808 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmk5r\" (UniqueName: 
\"kubernetes.io/projected/288970b0-4121-4ffc-8fdf-92911f5ab464-kube-api-access-tmk5r\") pod \"auto-csr-approver-29566966-s9z2v\" (UID: \"288970b0-4121-4ffc-8fdf-92911f5ab464\") " pod="openshift-infra/auto-csr-approver-29566966-s9z2v" Mar 20 14:46:00 crc kubenswrapper[4973]: I0320 14:46:00.376190 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmk5r\" (UniqueName: \"kubernetes.io/projected/288970b0-4121-4ffc-8fdf-92911f5ab464-kube-api-access-tmk5r\") pod \"auto-csr-approver-29566966-s9z2v\" (UID: \"288970b0-4121-4ffc-8fdf-92911f5ab464\") " pod="openshift-infra/auto-csr-approver-29566966-s9z2v" Mar 20 14:46:00 crc kubenswrapper[4973]: I0320 14:46:00.501448 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566966-s9z2v" Mar 20 14:46:01 crc kubenswrapper[4973]: I0320 14:46:01.898599 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566966-s9z2v"] Mar 20 14:46:02 crc kubenswrapper[4973]: I0320 14:46:02.805207 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566966-s9z2v" event={"ID":"288970b0-4121-4ffc-8fdf-92911f5ab464","Type":"ContainerStarted","Data":"b3b0f98146221470dbb388b94d2226517b887c0b0eeb3c12af195c8811c8277a"} Mar 20 14:46:04 crc kubenswrapper[4973]: I0320 14:46:04.828588 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566966-s9z2v" event={"ID":"288970b0-4121-4ffc-8fdf-92911f5ab464","Type":"ContainerStarted","Data":"eb7ff137dfe90269893c295cde81c93cbb3355046a0d4bc25a9a8b804748b973"} Mar 20 14:46:04 crc kubenswrapper[4973]: I0320 14:46:04.866595 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566966-s9z2v" podStartSLOduration=3.526206598 podStartE2EDuration="4.866572522s" podCreationTimestamp="2026-03-20 14:46:00 +0000 UTC" 
firstStartedPulling="2026-03-20 14:46:01.911352112 +0000 UTC m=+5082.655021866" lastFinishedPulling="2026-03-20 14:46:03.251718046 +0000 UTC m=+5083.995387790" observedRunningTime="2026-03-20 14:46:04.852772444 +0000 UTC m=+5085.596442198" watchObservedRunningTime="2026-03-20 14:46:04.866572522 +0000 UTC m=+5085.610242266" Mar 20 14:46:05 crc kubenswrapper[4973]: I0320 14:46:05.678749 4973 scope.go:117] "RemoveContainer" containerID="9de663864ca341f6e12eacddc91f098517dc595a608e7452438ef1cb10ff127a" Mar 20 14:46:07 crc kubenswrapper[4973]: I0320 14:46:07.875061 4973 generic.go:334] "Generic (PLEG): container finished" podID="288970b0-4121-4ffc-8fdf-92911f5ab464" containerID="eb7ff137dfe90269893c295cde81c93cbb3355046a0d4bc25a9a8b804748b973" exitCode=0 Mar 20 14:46:07 crc kubenswrapper[4973]: I0320 14:46:07.875648 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566966-s9z2v" event={"ID":"288970b0-4121-4ffc-8fdf-92911f5ab464","Type":"ContainerDied","Data":"eb7ff137dfe90269893c295cde81c93cbb3355046a0d4bc25a9a8b804748b973"} Mar 20 14:46:07 crc kubenswrapper[4973]: I0320 14:46:07.881780 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qxmg4/crc-debug-qp6fv" event={"ID":"e2f3cd89-268c-4e05-9068-0e624a831c0a","Type":"ContainerStarted","Data":"b64916c7602d72df1b9ec2ba122ed3bfc3a8affe6cc0adeff8281ea96651aea4"} Mar 20 14:46:07 crc kubenswrapper[4973]: I0320 14:46:07.930314 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qxmg4/crc-debug-qp6fv" podStartSLOduration=2.336660824 podStartE2EDuration="32.930293082s" podCreationTimestamp="2026-03-20 14:45:35 +0000 UTC" firstStartedPulling="2026-03-20 14:45:36.28275821 +0000 UTC m=+5057.026427954" lastFinishedPulling="2026-03-20 14:46:06.876390468 +0000 UTC m=+5087.620060212" observedRunningTime="2026-03-20 14:46:07.909325448 +0000 UTC m=+5088.652995192" watchObservedRunningTime="2026-03-20 
14:46:07.930293082 +0000 UTC m=+5088.673962826" Mar 20 14:46:09 crc kubenswrapper[4973]: I0320 14:46:09.349402 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566966-s9z2v" Mar 20 14:46:09 crc kubenswrapper[4973]: I0320 14:46:09.427928 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmk5r\" (UniqueName: \"kubernetes.io/projected/288970b0-4121-4ffc-8fdf-92911f5ab464-kube-api-access-tmk5r\") pod \"288970b0-4121-4ffc-8fdf-92911f5ab464\" (UID: \"288970b0-4121-4ffc-8fdf-92911f5ab464\") " Mar 20 14:46:09 crc kubenswrapper[4973]: I0320 14:46:09.448238 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/288970b0-4121-4ffc-8fdf-92911f5ab464-kube-api-access-tmk5r" (OuterVolumeSpecName: "kube-api-access-tmk5r") pod "288970b0-4121-4ffc-8fdf-92911f5ab464" (UID: "288970b0-4121-4ffc-8fdf-92911f5ab464"). InnerVolumeSpecName "kube-api-access-tmk5r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:46:09 crc kubenswrapper[4973]: I0320 14:46:09.531834 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmk5r\" (UniqueName: \"kubernetes.io/projected/288970b0-4121-4ffc-8fdf-92911f5ab464-kube-api-access-tmk5r\") on node \"crc\" DevicePath \"\"" Mar 20 14:46:09 crc kubenswrapper[4973]: I0320 14:46:09.902380 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566966-s9z2v" event={"ID":"288970b0-4121-4ffc-8fdf-92911f5ab464","Type":"ContainerDied","Data":"b3b0f98146221470dbb388b94d2226517b887c0b0eeb3c12af195c8811c8277a"} Mar 20 14:46:09 crc kubenswrapper[4973]: I0320 14:46:09.902925 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3b0f98146221470dbb388b94d2226517b887c0b0eeb3c12af195c8811c8277a" Mar 20 14:46:09 crc kubenswrapper[4973]: I0320 14:46:09.902423 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566966-s9z2v" Mar 20 14:46:09 crc kubenswrapper[4973]: I0320 14:46:09.984223 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566960-qqpz4"] Mar 20 14:46:09 crc kubenswrapper[4973]: I0320 14:46:09.995147 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566960-qqpz4"] Mar 20 14:46:11 crc kubenswrapper[4973]: I0320 14:46:11.969291 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d315a1e7-d368-418e-b447-2d03147a9b7d" path="/var/lib/kubelet/pods/d315a1e7-d368-418e-b447-2d03147a9b7d/volumes" Mar 20 14:46:13 crc kubenswrapper[4973]: I0320 14:46:13.321600 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 14:46:13 crc kubenswrapper[4973]: I0320 14:46:13.322184 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:46:43 crc kubenswrapper[4973]: I0320 14:46:43.320648 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:46:43 crc kubenswrapper[4973]: I0320 14:46:43.321213 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:46:43 crc kubenswrapper[4973]: I0320 14:46:43.321260 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 14:46:43 crc kubenswrapper[4973]: I0320 14:46:43.322348 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:46:43 crc kubenswrapper[4973]: I0320 14:46:43.322435 4973 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" containerID="cri-o://3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" gracePeriod=600 Mar 20 14:46:43 crc kubenswrapper[4973]: E0320 14:46:43.496476 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:46:44 crc kubenswrapper[4973]: I0320 14:46:44.301322 4973 generic.go:334] "Generic (PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" exitCode=0 Mar 20 14:46:44 crc kubenswrapper[4973]: I0320 14:46:44.301383 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790"} Mar 20 14:46:44 crc kubenswrapper[4973]: I0320 14:46:44.301726 4973 scope.go:117] "RemoveContainer" containerID="59cb6d1eb0dc10deddad9ed131c9312329e2744cabbd12799cb25f9f0fdea90e" Mar 20 14:46:44 crc kubenswrapper[4973]: I0320 14:46:44.303033 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:46:44 crc kubenswrapper[4973]: E0320 14:46:44.305753 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:46:54 crc kubenswrapper[4973]: I0320 14:46:54.950758 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:46:54 crc kubenswrapper[4973]: E0320 14:46:54.952175 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:46:59 crc kubenswrapper[4973]: I0320 14:46:59.486713 4973 generic.go:334] "Generic (PLEG): container finished" podID="e2f3cd89-268c-4e05-9068-0e624a831c0a" containerID="b64916c7602d72df1b9ec2ba122ed3bfc3a8affe6cc0adeff8281ea96651aea4" exitCode=0 Mar 20 14:46:59 crc kubenswrapper[4973]: I0320 14:46:59.486797 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qxmg4/crc-debug-qp6fv" event={"ID":"e2f3cd89-268c-4e05-9068-0e624a831c0a","Type":"ContainerDied","Data":"b64916c7602d72df1b9ec2ba122ed3bfc3a8affe6cc0adeff8281ea96651aea4"} Mar 20 14:47:00 crc kubenswrapper[4973]: I0320 14:47:00.628299 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qxmg4/crc-debug-qp6fv" Mar 20 14:47:00 crc kubenswrapper[4973]: I0320 14:47:00.668125 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qxmg4/crc-debug-qp6fv"] Mar 20 14:47:00 crc kubenswrapper[4973]: I0320 14:47:00.680573 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qxmg4/crc-debug-qp6fv"] Mar 20 14:47:00 crc kubenswrapper[4973]: I0320 14:47:00.696199 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7lpr\" (UniqueName: \"kubernetes.io/projected/e2f3cd89-268c-4e05-9068-0e624a831c0a-kube-api-access-g7lpr\") pod \"e2f3cd89-268c-4e05-9068-0e624a831c0a\" (UID: \"e2f3cd89-268c-4e05-9068-0e624a831c0a\") " Mar 20 14:47:00 crc kubenswrapper[4973]: I0320 14:47:00.696375 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2f3cd89-268c-4e05-9068-0e624a831c0a-host\") pod \"e2f3cd89-268c-4e05-9068-0e624a831c0a\" (UID: \"e2f3cd89-268c-4e05-9068-0e624a831c0a\") " Mar 20 14:47:00 crc kubenswrapper[4973]: I0320 14:47:00.697126 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2f3cd89-268c-4e05-9068-0e624a831c0a-host" (OuterVolumeSpecName: "host") pod "e2f3cd89-268c-4e05-9068-0e624a831c0a" (UID: "e2f3cd89-268c-4e05-9068-0e624a831c0a"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:47:00 crc kubenswrapper[4973]: I0320 14:47:00.712776 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2f3cd89-268c-4e05-9068-0e624a831c0a-kube-api-access-g7lpr" (OuterVolumeSpecName: "kube-api-access-g7lpr") pod "e2f3cd89-268c-4e05-9068-0e624a831c0a" (UID: "e2f3cd89-268c-4e05-9068-0e624a831c0a"). InnerVolumeSpecName "kube-api-access-g7lpr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:47:00 crc kubenswrapper[4973]: I0320 14:47:00.799207 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7lpr\" (UniqueName: \"kubernetes.io/projected/e2f3cd89-268c-4e05-9068-0e624a831c0a-kube-api-access-g7lpr\") on node \"crc\" DevicePath \"\"" Mar 20 14:47:00 crc kubenswrapper[4973]: I0320 14:47:00.799546 4973 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2f3cd89-268c-4e05-9068-0e624a831c0a-host\") on node \"crc\" DevicePath \"\"" Mar 20 14:47:01 crc kubenswrapper[4973]: I0320 14:47:01.511556 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca4736bc7ab7e079df31477cc2350d175d88d198357c63f538b1dd0959bd7636" Mar 20 14:47:01 crc kubenswrapper[4973]: I0320 14:47:01.511697 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qxmg4/crc-debug-qp6fv" Mar 20 14:47:01 crc kubenswrapper[4973]: I0320 14:47:01.870810 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qxmg4/crc-debug-w4kz6"] Mar 20 14:47:01 crc kubenswrapper[4973]: E0320 14:47:01.871321 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="288970b0-4121-4ffc-8fdf-92911f5ab464" containerName="oc" Mar 20 14:47:01 crc kubenswrapper[4973]: I0320 14:47:01.871354 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="288970b0-4121-4ffc-8fdf-92911f5ab464" containerName="oc" Mar 20 14:47:01 crc kubenswrapper[4973]: E0320 14:47:01.871377 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2f3cd89-268c-4e05-9068-0e624a831c0a" containerName="container-00" Mar 20 14:47:01 crc kubenswrapper[4973]: I0320 14:47:01.871383 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2f3cd89-268c-4e05-9068-0e624a831c0a" containerName="container-00" Mar 20 14:47:01 crc kubenswrapper[4973]: I0320 14:47:01.871608 4973 
memory_manager.go:354] "RemoveStaleState removing state" podUID="e2f3cd89-268c-4e05-9068-0e624a831c0a" containerName="container-00" Mar 20 14:47:01 crc kubenswrapper[4973]: I0320 14:47:01.871642 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="288970b0-4121-4ffc-8fdf-92911f5ab464" containerName="oc" Mar 20 14:47:01 crc kubenswrapper[4973]: I0320 14:47:01.872558 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qxmg4/crc-debug-w4kz6" Mar 20 14:47:01 crc kubenswrapper[4973]: I0320 14:47:01.964580 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2f3cd89-268c-4e05-9068-0e624a831c0a" path="/var/lib/kubelet/pods/e2f3cd89-268c-4e05-9068-0e624a831c0a/volumes" Mar 20 14:47:02 crc kubenswrapper[4973]: I0320 14:47:02.037204 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11c13bc-1c5a-439f-9d00-84b55b22d5b0-host\") pod \"crc-debug-w4kz6\" (UID: \"d11c13bc-1c5a-439f-9d00-84b55b22d5b0\") " pod="openshift-must-gather-qxmg4/crc-debug-w4kz6" Mar 20 14:47:02 crc kubenswrapper[4973]: I0320 14:47:02.037414 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbrwx\" (UniqueName: \"kubernetes.io/projected/d11c13bc-1c5a-439f-9d00-84b55b22d5b0-kube-api-access-dbrwx\") pod \"crc-debug-w4kz6\" (UID: \"d11c13bc-1c5a-439f-9d00-84b55b22d5b0\") " pod="openshift-must-gather-qxmg4/crc-debug-w4kz6" Mar 20 14:47:02 crc kubenswrapper[4973]: I0320 14:47:02.139993 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbrwx\" (UniqueName: \"kubernetes.io/projected/d11c13bc-1c5a-439f-9d00-84b55b22d5b0-kube-api-access-dbrwx\") pod \"crc-debug-w4kz6\" (UID: \"d11c13bc-1c5a-439f-9d00-84b55b22d5b0\") " pod="openshift-must-gather-qxmg4/crc-debug-w4kz6" Mar 20 14:47:02 crc kubenswrapper[4973]: I0320 
14:47:02.140151 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11c13bc-1c5a-439f-9d00-84b55b22d5b0-host\") pod \"crc-debug-w4kz6\" (UID: \"d11c13bc-1c5a-439f-9d00-84b55b22d5b0\") " pod="openshift-must-gather-qxmg4/crc-debug-w4kz6" Mar 20 14:47:02 crc kubenswrapper[4973]: I0320 14:47:02.140232 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11c13bc-1c5a-439f-9d00-84b55b22d5b0-host\") pod \"crc-debug-w4kz6\" (UID: \"d11c13bc-1c5a-439f-9d00-84b55b22d5b0\") " pod="openshift-must-gather-qxmg4/crc-debug-w4kz6" Mar 20 14:47:02 crc kubenswrapper[4973]: I0320 14:47:02.157008 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbrwx\" (UniqueName: \"kubernetes.io/projected/d11c13bc-1c5a-439f-9d00-84b55b22d5b0-kube-api-access-dbrwx\") pod \"crc-debug-w4kz6\" (UID: \"d11c13bc-1c5a-439f-9d00-84b55b22d5b0\") " pod="openshift-must-gather-qxmg4/crc-debug-w4kz6" Mar 20 14:47:02 crc kubenswrapper[4973]: I0320 14:47:02.191954 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qxmg4/crc-debug-w4kz6" Mar 20 14:47:02 crc kubenswrapper[4973]: I0320 14:47:02.522171 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qxmg4/crc-debug-w4kz6" event={"ID":"d11c13bc-1c5a-439f-9d00-84b55b22d5b0","Type":"ContainerStarted","Data":"39b578a38fe5c9207db1fbc7328d64f8780fa14e870bf24cd24a014926ecf52e"} Mar 20 14:47:02 crc kubenswrapper[4973]: I0320 14:47:02.522546 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qxmg4/crc-debug-w4kz6" event={"ID":"d11c13bc-1c5a-439f-9d00-84b55b22d5b0","Type":"ContainerStarted","Data":"ed846c63ace821e8e124cb344e073f449937af005ac16a0735ed00b004642f0d"} Mar 20 14:47:02 crc kubenswrapper[4973]: I0320 14:47:02.537111 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qxmg4/crc-debug-w4kz6" podStartSLOduration=1.537091399 podStartE2EDuration="1.537091399s" podCreationTimestamp="2026-03-20 14:47:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:47:02.533789009 +0000 UTC m=+5143.277458753" watchObservedRunningTime="2026-03-20 14:47:02.537091399 +0000 UTC m=+5143.280761143" Mar 20 14:47:03 crc kubenswrapper[4973]: I0320 14:47:03.542666 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qxmg4/crc-debug-w4kz6" event={"ID":"d11c13bc-1c5a-439f-9d00-84b55b22d5b0","Type":"ContainerDied","Data":"39b578a38fe5c9207db1fbc7328d64f8780fa14e870bf24cd24a014926ecf52e"} Mar 20 14:47:03 crc kubenswrapper[4973]: I0320 14:47:03.542744 4973 generic.go:334] "Generic (PLEG): container finished" podID="d11c13bc-1c5a-439f-9d00-84b55b22d5b0" containerID="39b578a38fe5c9207db1fbc7328d64f8780fa14e870bf24cd24a014926ecf52e" exitCode=0 Mar 20 14:47:04 crc kubenswrapper[4973]: I0320 14:47:04.676209 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qxmg4/crc-debug-w4kz6" Mar 20 14:47:04 crc kubenswrapper[4973]: I0320 14:47:04.725054 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qxmg4/crc-debug-w4kz6"] Mar 20 14:47:04 crc kubenswrapper[4973]: I0320 14:47:04.735276 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qxmg4/crc-debug-w4kz6"] Mar 20 14:47:04 crc kubenswrapper[4973]: I0320 14:47:04.816373 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11c13bc-1c5a-439f-9d00-84b55b22d5b0-host\") pod \"d11c13bc-1c5a-439f-9d00-84b55b22d5b0\" (UID: \"d11c13bc-1c5a-439f-9d00-84b55b22d5b0\") " Mar 20 14:47:04 crc kubenswrapper[4973]: I0320 14:47:04.816490 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbrwx\" (UniqueName: \"kubernetes.io/projected/d11c13bc-1c5a-439f-9d00-84b55b22d5b0-kube-api-access-dbrwx\") pod \"d11c13bc-1c5a-439f-9d00-84b55b22d5b0\" (UID: \"d11c13bc-1c5a-439f-9d00-84b55b22d5b0\") " Mar 20 14:47:04 crc kubenswrapper[4973]: I0320 14:47:04.816780 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d11c13bc-1c5a-439f-9d00-84b55b22d5b0-host" (OuterVolumeSpecName: "host") pod "d11c13bc-1c5a-439f-9d00-84b55b22d5b0" (UID: "d11c13bc-1c5a-439f-9d00-84b55b22d5b0"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:47:04 crc kubenswrapper[4973]: I0320 14:47:04.817375 4973 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d11c13bc-1c5a-439f-9d00-84b55b22d5b0-host\") on node \"crc\" DevicePath \"\"" Mar 20 14:47:04 crc kubenswrapper[4973]: I0320 14:47:04.822187 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d11c13bc-1c5a-439f-9d00-84b55b22d5b0-kube-api-access-dbrwx" (OuterVolumeSpecName: "kube-api-access-dbrwx") pod "d11c13bc-1c5a-439f-9d00-84b55b22d5b0" (UID: "d11c13bc-1c5a-439f-9d00-84b55b22d5b0"). InnerVolumeSpecName "kube-api-access-dbrwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:47:04 crc kubenswrapper[4973]: I0320 14:47:04.919236 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbrwx\" (UniqueName: \"kubernetes.io/projected/d11c13bc-1c5a-439f-9d00-84b55b22d5b0-kube-api-access-dbrwx\") on node \"crc\" DevicePath \"\"" Mar 20 14:47:05 crc kubenswrapper[4973]: I0320 14:47:05.564452 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed846c63ace821e8e124cb344e073f449937af005ac16a0735ed00b004642f0d" Mar 20 14:47:05 crc kubenswrapper[4973]: I0320 14:47:05.564749 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qxmg4/crc-debug-w4kz6" Mar 20 14:47:05 crc kubenswrapper[4973]: I0320 14:47:05.836694 4973 scope.go:117] "RemoveContainer" containerID="f0970ba010c2c95e8c50c06dbb38f7c37e24bedea49d93413f1568c0e2a5fc7c" Mar 20 14:47:05 crc kubenswrapper[4973]: I0320 14:47:05.886008 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qxmg4/crc-debug-gh6s4"] Mar 20 14:47:05 crc kubenswrapper[4973]: E0320 14:47:05.886674 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d11c13bc-1c5a-439f-9d00-84b55b22d5b0" containerName="container-00" Mar 20 14:47:05 crc kubenswrapper[4973]: I0320 14:47:05.889428 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="d11c13bc-1c5a-439f-9d00-84b55b22d5b0" containerName="container-00" Mar 20 14:47:05 crc kubenswrapper[4973]: I0320 14:47:05.889790 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="d11c13bc-1c5a-439f-9d00-84b55b22d5b0" containerName="container-00" Mar 20 14:47:05 crc kubenswrapper[4973]: I0320 14:47:05.890792 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qxmg4/crc-debug-gh6s4" Mar 20 14:47:05 crc kubenswrapper[4973]: I0320 14:47:05.965268 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d11c13bc-1c5a-439f-9d00-84b55b22d5b0" path="/var/lib/kubelet/pods/d11c13bc-1c5a-439f-9d00-84b55b22d5b0/volumes" Mar 20 14:47:06 crc kubenswrapper[4973]: I0320 14:47:06.045704 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhsw4\" (UniqueName: \"kubernetes.io/projected/52fa9a9d-9f39-4775-bc42-b41e0f237e09-kube-api-access-qhsw4\") pod \"crc-debug-gh6s4\" (UID: \"52fa9a9d-9f39-4775-bc42-b41e0f237e09\") " pod="openshift-must-gather-qxmg4/crc-debug-gh6s4" Mar 20 14:47:06 crc kubenswrapper[4973]: I0320 14:47:06.045898 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52fa9a9d-9f39-4775-bc42-b41e0f237e09-host\") pod \"crc-debug-gh6s4\" (UID: \"52fa9a9d-9f39-4775-bc42-b41e0f237e09\") " pod="openshift-must-gather-qxmg4/crc-debug-gh6s4" Mar 20 14:47:06 crc kubenswrapper[4973]: I0320 14:47:06.149022 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52fa9a9d-9f39-4775-bc42-b41e0f237e09-host\") pod \"crc-debug-gh6s4\" (UID: \"52fa9a9d-9f39-4775-bc42-b41e0f237e09\") " pod="openshift-must-gather-qxmg4/crc-debug-gh6s4" Mar 20 14:47:06 crc kubenswrapper[4973]: I0320 14:47:06.149187 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52fa9a9d-9f39-4775-bc42-b41e0f237e09-host\") pod \"crc-debug-gh6s4\" (UID: \"52fa9a9d-9f39-4775-bc42-b41e0f237e09\") " pod="openshift-must-gather-qxmg4/crc-debug-gh6s4" Mar 20 14:47:06 crc kubenswrapper[4973]: I0320 14:47:06.149994 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-qhsw4\" (UniqueName: \"kubernetes.io/projected/52fa9a9d-9f39-4775-bc42-b41e0f237e09-kube-api-access-qhsw4\") pod \"crc-debug-gh6s4\" (UID: \"52fa9a9d-9f39-4775-bc42-b41e0f237e09\") " pod="openshift-must-gather-qxmg4/crc-debug-gh6s4" Mar 20 14:47:06 crc kubenswrapper[4973]: I0320 14:47:06.168600 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhsw4\" (UniqueName: \"kubernetes.io/projected/52fa9a9d-9f39-4775-bc42-b41e0f237e09-kube-api-access-qhsw4\") pod \"crc-debug-gh6s4\" (UID: \"52fa9a9d-9f39-4775-bc42-b41e0f237e09\") " pod="openshift-must-gather-qxmg4/crc-debug-gh6s4" Mar 20 14:47:06 crc kubenswrapper[4973]: I0320 14:47:06.236731 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qxmg4/crc-debug-gh6s4" Mar 20 14:47:06 crc kubenswrapper[4973]: W0320 14:47:06.268153 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52fa9a9d_9f39_4775_bc42_b41e0f237e09.slice/crio-b0e573d5bfc94ddab0cd8893967587900712a17b84f48ddf26f3b3473550adf8 WatchSource:0}: Error finding container b0e573d5bfc94ddab0cd8893967587900712a17b84f48ddf26f3b3473550adf8: Status 404 returned error can't find the container with id b0e573d5bfc94ddab0cd8893967587900712a17b84f48ddf26f3b3473550adf8 Mar 20 14:47:06 crc kubenswrapper[4973]: I0320 14:47:06.578612 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qxmg4/crc-debug-gh6s4" event={"ID":"52fa9a9d-9f39-4775-bc42-b41e0f237e09","Type":"ContainerStarted","Data":"6bc29c783723b650e1653b4808e9bfb56477fa2975915d1f5f42e3ebd5d05fc7"} Mar 20 14:47:06 crc kubenswrapper[4973]: I0320 14:47:06.578685 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qxmg4/crc-debug-gh6s4" 
event={"ID":"52fa9a9d-9f39-4775-bc42-b41e0f237e09","Type":"ContainerStarted","Data":"b0e573d5bfc94ddab0cd8893967587900712a17b84f48ddf26f3b3473550adf8"} Mar 20 14:47:06 crc kubenswrapper[4973]: I0320 14:47:06.604511 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qxmg4/crc-debug-gh6s4" podStartSLOduration=1.6044799570000001 podStartE2EDuration="1.604479957s" podCreationTimestamp="2026-03-20 14:47:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 14:47:06.60203021 +0000 UTC m=+5147.345699974" watchObservedRunningTime="2026-03-20 14:47:06.604479957 +0000 UTC m=+5147.348149701" Mar 20 14:47:07 crc kubenswrapper[4973]: I0320 14:47:07.591762 4973 generic.go:334] "Generic (PLEG): container finished" podID="52fa9a9d-9f39-4775-bc42-b41e0f237e09" containerID="6bc29c783723b650e1653b4808e9bfb56477fa2975915d1f5f42e3ebd5d05fc7" exitCode=0 Mar 20 14:47:07 crc kubenswrapper[4973]: I0320 14:47:07.591933 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qxmg4/crc-debug-gh6s4" event={"ID":"52fa9a9d-9f39-4775-bc42-b41e0f237e09","Type":"ContainerDied","Data":"6bc29c783723b650e1653b4808e9bfb56477fa2975915d1f5f42e3ebd5d05fc7"} Mar 20 14:47:08 crc kubenswrapper[4973]: I0320 14:47:08.745606 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qxmg4/crc-debug-gh6s4" Mar 20 14:47:08 crc kubenswrapper[4973]: I0320 14:47:08.791005 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qxmg4/crc-debug-gh6s4"] Mar 20 14:47:08 crc kubenswrapper[4973]: I0320 14:47:08.805277 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qxmg4/crc-debug-gh6s4"] Mar 20 14:47:08 crc kubenswrapper[4973]: I0320 14:47:08.822578 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52fa9a9d-9f39-4775-bc42-b41e0f237e09-host\") pod \"52fa9a9d-9f39-4775-bc42-b41e0f237e09\" (UID: \"52fa9a9d-9f39-4775-bc42-b41e0f237e09\") " Mar 20 14:47:08 crc kubenswrapper[4973]: I0320 14:47:08.822687 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/52fa9a9d-9f39-4775-bc42-b41e0f237e09-host" (OuterVolumeSpecName: "host") pod "52fa9a9d-9f39-4775-bc42-b41e0f237e09" (UID: "52fa9a9d-9f39-4775-bc42-b41e0f237e09"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 14:47:08 crc kubenswrapper[4973]: I0320 14:47:08.822949 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhsw4\" (UniqueName: \"kubernetes.io/projected/52fa9a9d-9f39-4775-bc42-b41e0f237e09-kube-api-access-qhsw4\") pod \"52fa9a9d-9f39-4775-bc42-b41e0f237e09\" (UID: \"52fa9a9d-9f39-4775-bc42-b41e0f237e09\") " Mar 20 14:47:08 crc kubenswrapper[4973]: I0320 14:47:08.823555 4973 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/52fa9a9d-9f39-4775-bc42-b41e0f237e09-host\") on node \"crc\" DevicePath \"\"" Mar 20 14:47:08 crc kubenswrapper[4973]: I0320 14:47:08.834669 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52fa9a9d-9f39-4775-bc42-b41e0f237e09-kube-api-access-qhsw4" (OuterVolumeSpecName: "kube-api-access-qhsw4") pod "52fa9a9d-9f39-4775-bc42-b41e0f237e09" (UID: "52fa9a9d-9f39-4775-bc42-b41e0f237e09"). InnerVolumeSpecName "kube-api-access-qhsw4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:47:08 crc kubenswrapper[4973]: I0320 14:47:08.925406 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhsw4\" (UniqueName: \"kubernetes.io/projected/52fa9a9d-9f39-4775-bc42-b41e0f237e09-kube-api-access-qhsw4\") on node \"crc\" DevicePath \"\"" Mar 20 14:47:08 crc kubenswrapper[4973]: I0320 14:47:08.951313 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:47:08 crc kubenswrapper[4973]: E0320 14:47:08.951676 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:47:09 crc kubenswrapper[4973]: I0320 14:47:09.617146 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0e573d5bfc94ddab0cd8893967587900712a17b84f48ddf26f3b3473550adf8" Mar 20 14:47:09 crc kubenswrapper[4973]: I0320 14:47:09.617230 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qxmg4/crc-debug-gh6s4" Mar 20 14:47:09 crc kubenswrapper[4973]: I0320 14:47:09.965939 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52fa9a9d-9f39-4775-bc42-b41e0f237e09" path="/var/lib/kubelet/pods/52fa9a9d-9f39-4775-bc42-b41e0f237e09/volumes" Mar 20 14:47:21 crc kubenswrapper[4973]: I0320 14:47:21.951404 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:47:21 crc kubenswrapper[4973]: E0320 14:47:21.952164 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:47:33 crc kubenswrapper[4973]: I0320 14:47:33.951979 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:47:33 crc kubenswrapper[4973]: E0320 14:47:33.953735 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:47:37 crc kubenswrapper[4973]: I0320 14:47:37.311685 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_12a105da-e1bf-4c7a-aabb-b81defe003af/aodh-api/0.log" Mar 20 14:47:37 crc kubenswrapper[4973]: I0320 14:47:37.531195 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_aodh-0_12a105da-e1bf-4c7a-aabb-b81defe003af/aodh-notifier/0.log" Mar 20 14:47:37 crc kubenswrapper[4973]: I0320 14:47:37.543860 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_12a105da-e1bf-4c7a-aabb-b81defe003af/aodh-listener/0.log" Mar 20 14:47:37 crc kubenswrapper[4973]: I0320 14:47:37.556442 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_12a105da-e1bf-4c7a-aabb-b81defe003af/aodh-evaluator/0.log" Mar 20 14:47:37 crc kubenswrapper[4973]: I0320 14:47:37.741883 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5589965cd6-qwjps_bd8bb058-5f72-497c-9bf2-7ac7b932cc5d/barbican-api/0.log" Mar 20 14:47:37 crc kubenswrapper[4973]: I0320 14:47:37.748957 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5589965cd6-qwjps_bd8bb058-5f72-497c-9bf2-7ac7b932cc5d/barbican-api-log/0.log" Mar 20 14:47:37 crc kubenswrapper[4973]: I0320 14:47:37.942122 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-68cf78bb54-pdcln_e9a97269-d458-4405-988c-b32339897e4f/barbican-keystone-listener/0.log" Mar 20 14:47:38 crc kubenswrapper[4973]: I0320 14:47:38.040934 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-68cf78bb54-pdcln_e9a97269-d458-4405-988c-b32339897e4f/barbican-keystone-listener-log/0.log" Mar 20 14:47:38 crc kubenswrapper[4973]: I0320 14:47:38.110016 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65b846dd97-gcnmc_8df11824-a676-4f6a-8f12-ccff6ce1bdc6/barbican-worker/0.log" Mar 20 14:47:38 crc kubenswrapper[4973]: I0320 14:47:38.184503 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-65b846dd97-gcnmc_8df11824-a676-4f6a-8f12-ccff6ce1bdc6/barbican-worker-log/0.log" Mar 20 14:47:38 crc kubenswrapper[4973]: I0320 14:47:38.484526 4973 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-nzxgl_8dfa03f4-98c0-4122-8fbb-abeba13439f0/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:47:38 crc kubenswrapper[4973]: I0320 14:47:38.530979 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1cde80e1-f72d-4080-a86b-5968a8904333/ceilometer-central-agent/0.log" Mar 20 14:47:38 crc kubenswrapper[4973]: I0320 14:47:38.607804 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1cde80e1-f72d-4080-a86b-5968a8904333/ceilometer-notification-agent/1.log" Mar 20 14:47:38 crc kubenswrapper[4973]: I0320 14:47:38.765726 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1cde80e1-f72d-4080-a86b-5968a8904333/ceilometer-notification-agent/0.log" Mar 20 14:47:38 crc kubenswrapper[4973]: I0320 14:47:38.785107 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1cde80e1-f72d-4080-a86b-5968a8904333/proxy-httpd/0.log" Mar 20 14:47:38 crc kubenswrapper[4973]: I0320 14:47:38.814448 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_1cde80e1-f72d-4080-a86b-5968a8904333/sg-core/0.log" Mar 20 14:47:39 crc kubenswrapper[4973]: I0320 14:47:39.071542 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_273c74ce-9e0d-437a-aaf8-b16451028b6e/cinder-api/0.log" Mar 20 14:47:39 crc kubenswrapper[4973]: I0320 14:47:39.097408 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_273c74ce-9e0d-437a-aaf8-b16451028b6e/cinder-api-log/0.log" Mar 20 14:47:39 crc kubenswrapper[4973]: I0320 14:47:39.223887 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_95ca3c33-8a98-4ce8-8cb7-06c855d090ac/cinder-scheduler/1.log" Mar 20 14:47:39 crc kubenswrapper[4973]: I0320 14:47:39.275865 4973 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_95ca3c33-8a98-4ce8-8cb7-06c855d090ac/cinder-scheduler/0.log" Mar 20 14:47:39 crc kubenswrapper[4973]: I0320 14:47:39.360201 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_95ca3c33-8a98-4ce8-8cb7-06c855d090ac/probe/0.log" Mar 20 14:47:39 crc kubenswrapper[4973]: I0320 14:47:39.528879 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-h9tj7_5d8302de-98a7-45e4-ab44-fddc83ce2f4b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:47:39 crc kubenswrapper[4973]: I0320 14:47:39.733642 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-trb69_42ef6035-8917-4665-aaab-67b6c8e74ca7/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:47:39 crc kubenswrapper[4973]: I0320 14:47:39.776748 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-bb85b8995-9fz8b_9253dc20-632e-44ce-8d38-e452186eddbd/init/0.log" Mar 20 14:47:39 crc kubenswrapper[4973]: I0320 14:47:39.936779 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-bb85b8995-9fz8b_9253dc20-632e-44ce-8d38-e452186eddbd/init/0.log" Mar 20 14:47:40 crc kubenswrapper[4973]: I0320 14:47:40.064306 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-bb85b8995-9fz8b_9253dc20-632e-44ce-8d38-e452186eddbd/dnsmasq-dns/0.log" Mar 20 14:47:40 crc kubenswrapper[4973]: I0320 14:47:40.093295 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-pqzh5_08df9ecf-310e-4fee-9ec6-e13e27f1537b/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:47:40 crc kubenswrapper[4973]: I0320 14:47:40.303429 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_154777eb-43ed-484d-9e6e-f2bd09fecf57/glance-httpd/0.log" Mar 20 14:47:40 crc kubenswrapper[4973]: I0320 14:47:40.341864 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_154777eb-43ed-484d-9e6e-f2bd09fecf57/glance-log/0.log" Mar 20 14:47:40 crc kubenswrapper[4973]: I0320 14:47:40.539175 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8c476652-ce6b-4652-8f8a-9415b0be7465/glance-log/0.log" Mar 20 14:47:40 crc kubenswrapper[4973]: I0320 14:47:40.624816 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_8c476652-ce6b-4652-8f8a-9415b0be7465/glance-httpd/0.log" Mar 20 14:47:41 crc kubenswrapper[4973]: I0320 14:47:41.307290 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-f55bdff67-69cjm_b35b5167-8059-4cb9-a93b-c8dc96bb23f7/heat-api/0.log" Mar 20 14:47:41 crc kubenswrapper[4973]: I0320 14:47:41.431109 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-6b4775c997-mc4n9_ea073f8d-1032-4d22-b195-61be13e3e832/heat-engine/0.log" Mar 20 14:47:41 crc kubenswrapper[4973]: I0320 14:47:41.519806 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-gjmrs_fbb12599-731a-426e-9032-80f723072a75/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:47:41 crc kubenswrapper[4973]: I0320 14:47:41.551859 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-7d8cc574d6-dm54c_adfece77-bf45-4bb7-8f7f-c57be5a4edfc/heat-cfnapi/0.log" Mar 20 14:47:42 crc kubenswrapper[4973]: I0320 14:47:42.244283 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29566921-tn9xk_3defeff9-b2c0-4236-a0f2-c91f57208005/keystone-cron/0.log" Mar 20 14:47:42 crc kubenswrapper[4973]: 
I0320 14:47:42.554470 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-76xjq_9456a1ee-7677-4176-9dd8-ec10621b434f/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:47:42 crc kubenswrapper[4973]: I0320 14:47:42.556043 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_6a06b002-fe34-40e9-ae6d-54b6a9e7751b/kube-state-metrics/0.log" Mar 20 14:47:42 crc kubenswrapper[4973]: I0320 14:47:42.954143 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-7b85d6bdf6-j78f6_47ee1828-62b2-46d7-9225-61a4725bd6a6/keystone-api/0.log" Mar 20 14:47:43 crc kubenswrapper[4973]: I0320 14:47:43.190676 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_logging-edpm-deployment-openstack-edpm-ipam-ln9l8_084e7647-9b86-48a2-a4f8-56cb7a8e3457/logging-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:47:43 crc kubenswrapper[4973]: I0320 14:47:43.555817 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mysqld-exporter-0_c8fe0389-306c-44f1-9a9b-9ae5907ec1ef/mysqld-exporter/0.log" Mar 20 14:47:43 crc kubenswrapper[4973]: I0320 14:47:43.957887 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5874c8d58f-l5f6s_983eb56d-9c01-48a2-bcd1-3ea59f11bc01/neutron-httpd/0.log" Mar 20 14:47:44 crc kubenswrapper[4973]: I0320 14:47:44.017321 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-vnjxc_3fa3f71f-ea87-4f03-9499-d0c4bea49c04/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:47:44 crc kubenswrapper[4973]: I0320 14:47:44.027420 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-5874c8d58f-l5f6s_983eb56d-9c01-48a2-bcd1-3ea59f11bc01/neutron-api/0.log" Mar 20 14:47:44 crc kubenswrapper[4973]: I0320 14:47:44.685244 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-m5qr8_ad27159b-a222-439e-985c-d0164bb4eb21/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:47:45 crc kubenswrapper[4973]: I0320 14:47:45.187362 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a938dfd5-e277-4303-adfc-1d4ad07f2240/nova-cell0-conductor-conductor/0.log" Mar 20 14:47:45 crc kubenswrapper[4973]: I0320 14:47:45.193632 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_83d90a8e-827e-4720-9a8f-307311e2a6d9/nova-api-log/0.log" Mar 20 14:47:45 crc kubenswrapper[4973]: I0320 14:47:45.532126 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_b0a61963-2c9b-403b-8fed-f4072b979eb8/nova-cell1-conductor-conductor/0.log" Mar 20 14:47:45 crc kubenswrapper[4973]: I0320 14:47:45.634767 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_48646158-e77c-4710-b05b-030e7ff1dfbe/nova-cell1-novncproxy-novncproxy/0.log" Mar 20 14:47:45 crc kubenswrapper[4973]: I0320 14:47:45.732368 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_83d90a8e-827e-4720-9a8f-307311e2a6d9/nova-api-api/0.log" Mar 20 14:47:46 crc kubenswrapper[4973]: I0320 14:47:46.083432 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_da8c0543-63bd-4817-90c7-8cc02e6ddd5d/nova-metadata-log/0.log" Mar 20 14:47:46 crc kubenswrapper[4973]: I0320 14:47:46.535927 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-zrpbv_f2cb943f-f811-4eea-b860-1c19a6137dbb/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:47:46 crc kubenswrapper[4973]: I0320 14:47:46.557203 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-scheduler-0_e31feddc-5c5c-4520-bd2e-52ab71a2e318/nova-scheduler-scheduler/0.log" Mar 20 14:47:46 crc kubenswrapper[4973]: I0320 14:47:46.712152 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4bdbb8eb-c36d-43f0-a705-3b3e59128b7f/mysql-bootstrap/0.log" Mar 20 14:47:46 crc kubenswrapper[4973]: I0320 14:47:46.746034 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_da8c0543-63bd-4817-90c7-8cc02e6ddd5d/nova-metadata-metadata/0.log" Mar 20 14:47:46 crc kubenswrapper[4973]: I0320 14:47:46.928136 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4bdbb8eb-c36d-43f0-a705-3b3e59128b7f/galera/0.log" Mar 20 14:47:47 crc kubenswrapper[4973]: I0320 14:47:47.025377 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_4bdbb8eb-c36d-43f0-a705-3b3e59128b7f/mysql-bootstrap/0.log" Mar 20 14:47:47 crc kubenswrapper[4973]: I0320 14:47:47.062219 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_27fed14c-9051-4d46-80d5-badf224805a9/mysql-bootstrap/0.log" Mar 20 14:47:47 crc kubenswrapper[4973]: I0320 14:47:47.288387 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_27fed14c-9051-4d46-80d5-badf224805a9/galera/1.log" Mar 20 14:47:47 crc kubenswrapper[4973]: I0320 14:47:47.292861 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_27fed14c-9051-4d46-80d5-badf224805a9/mysql-bootstrap/0.log" Mar 20 14:47:47 crc kubenswrapper[4973]: I0320 14:47:47.329219 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_27fed14c-9051-4d46-80d5-badf224805a9/galera/0.log" Mar 20 14:47:47 crc kubenswrapper[4973]: I0320 14:47:47.562230 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstackclient_5e16c419-72e5-4d2c-bda0-3a0f6ec97aac/openstackclient/0.log" Mar 20 14:47:47 crc kubenswrapper[4973]: I0320 14:47:47.587306 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-l28sq_2425ebf1-c2bc-4b94-b3aa-473fd4690168/openstack-network-exporter/0.log" Mar 20 14:47:47 crc kubenswrapper[4973]: I0320 14:47:47.766770 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gvrbc_3809b124-46bb-42ba-a467-279857c61ef6/ovsdb-server-init/0.log" Mar 20 14:47:48 crc kubenswrapper[4973]: I0320 14:47:48.009606 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gvrbc_3809b124-46bb-42ba-a467-279857c61ef6/ovsdb-server-init/0.log" Mar 20 14:47:48 crc kubenswrapper[4973]: I0320 14:47:48.037852 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gvrbc_3809b124-46bb-42ba-a467-279857c61ef6/ovs-vswitchd/0.log" Mar 20 14:47:48 crc kubenswrapper[4973]: I0320 14:47:48.090849 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-gvrbc_3809b124-46bb-42ba-a467-279857c61ef6/ovsdb-server/0.log" Mar 20 14:47:48 crc kubenswrapper[4973]: I0320 14:47:48.257276 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-x2nll_f2c7b535-ad26-4bf4-848b-26890c0eb580/ovn-controller/0.log" Mar 20 14:47:48 crc kubenswrapper[4973]: I0320 14:47:48.423702 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-w4k6b_eda7094e-499a-4027-9f8d-91360c6d9780/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:47:48 crc kubenswrapper[4973]: I0320 14:47:48.540328 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5936fcd4-c4a1-423b-b026-9a86d9964154/openstack-network-exporter/0.log" Mar 20 14:47:48 crc kubenswrapper[4973]: I0320 14:47:48.583881 
4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5936fcd4-c4a1-423b-b026-9a86d9964154/ovn-northd/0.log" Mar 20 14:47:48 crc kubenswrapper[4973]: I0320 14:47:48.720919 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_feb13208-e959-4cb3-8d6f-185bf075036c/openstack-network-exporter/0.log" Mar 20 14:47:48 crc kubenswrapper[4973]: I0320 14:47:48.758649 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-849bfc9c67-jsf5j" podUID="28f1bcfd-788c-47fa-a462-cd5068ec34d2" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 20 14:47:48 crc kubenswrapper[4973]: I0320 14:47:48.872617 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_feb13208-e959-4cb3-8d6f-185bf075036c/ovsdbserver-nb/0.log" Mar 20 14:47:48 crc kubenswrapper[4973]: I0320 14:47:48.955752 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:47:48 crc kubenswrapper[4973]: E0320 14:47:48.956080 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:47:48 crc kubenswrapper[4973]: I0320 14:47:48.983923 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a/openstack-network-exporter/0.log" Mar 20 14:47:49 crc kubenswrapper[4973]: I0320 14:47:49.210910 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_2aa4ddbf-ce83-41c8-b61e-80bd9bea2b6a/ovsdbserver-sb/0.log" 
Mar 20 14:47:49 crc kubenswrapper[4973]: I0320 14:47:49.452121 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-85f659f75b-sh7fp_fff088b0-c9e8-47de-a698-80bcebcfd1a4/placement-api/0.log" Mar 20 14:47:49 crc kubenswrapper[4973]: I0320 14:47:49.502474 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-85f659f75b-sh7fp_fff088b0-c9e8-47de-a698-80bcebcfd1a4/placement-log/0.log" Mar 20 14:47:49 crc kubenswrapper[4973]: I0320 14:47:49.608944 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b51c1ea9-b42f-47a5-8f74-164a29b2d036/init-config-reloader/0.log" Mar 20 14:47:49 crc kubenswrapper[4973]: I0320 14:47:49.920531 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b51c1ea9-b42f-47a5-8f74-164a29b2d036/config-reloader/0.log" Mar 20 14:47:49 crc kubenswrapper[4973]: I0320 14:47:49.922144 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b51c1ea9-b42f-47a5-8f74-164a29b2d036/init-config-reloader/0.log" Mar 20 14:47:49 crc kubenswrapper[4973]: I0320 14:47:49.964048 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b51c1ea9-b42f-47a5-8f74-164a29b2d036/thanos-sidecar/0.log" Mar 20 14:47:49 crc kubenswrapper[4973]: I0320 14:47:49.987600 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b51c1ea9-b42f-47a5-8f74-164a29b2d036/prometheus/0.log" Mar 20 14:47:50 crc kubenswrapper[4973]: I0320 14:47:50.222670 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_89a22e25-955b-4786-baf2-7138a668f512/setup-container/0.log" Mar 20 14:47:50 crc kubenswrapper[4973]: I0320 14:47:50.377270 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-cell1-server-0_89a22e25-955b-4786-baf2-7138a668f512/setup-container/0.log" Mar 20 14:47:50 crc kubenswrapper[4973]: I0320 14:47:50.483169 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_89a22e25-955b-4786-baf2-7138a668f512/rabbitmq/0.log" Mar 20 14:47:50 crc kubenswrapper[4973]: I0320 14:47:50.535893 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_10689a54-14e6-456c-b710-e7c24c71016d/setup-container/0.log" Mar 20 14:47:50 crc kubenswrapper[4973]: I0320 14:47:50.742717 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_10689a54-14e6-456c-b710-e7c24c71016d/setup-container/0.log" Mar 20 14:47:50 crc kubenswrapper[4973]: I0320 14:47:50.808902 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_22bad2d1-9031-42e8-882b-f3cebea8db32/setup-container/0.log" Mar 20 14:47:50 crc kubenswrapper[4973]: I0320 14:47:50.897584 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_10689a54-14e6-456c-b710-e7c24c71016d/rabbitmq/0.log" Mar 20 14:47:51 crc kubenswrapper[4973]: I0320 14:47:51.059926 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_22bad2d1-9031-42e8-882b-f3cebea8db32/setup-container/0.log" Mar 20 14:47:51 crc kubenswrapper[4973]: I0320 14:47:51.094104 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-1_22bad2d1-9031-42e8-882b-f3cebea8db32/rabbitmq/0.log" Mar 20 14:47:51 crc kubenswrapper[4973]: I0320 14:47:51.165509 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_f8b4580d-53af-4c18-9f2c-b883b8621113/setup-container/0.log" Mar 20 14:47:51 crc kubenswrapper[4973]: I0320 14:47:51.440570 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_rabbitmq-server-2_f8b4580d-53af-4c18-9f2c-b883b8621113/setup-container/0.log" Mar 20 14:47:51 crc kubenswrapper[4973]: I0320 14:47:51.568545 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-tgcd7_d26727be-6f6f-4044-b0ac-771e58cb8641/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:47:51 crc kubenswrapper[4973]: I0320 14:47:51.581627 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-2_f8b4580d-53af-4c18-9f2c-b883b8621113/rabbitmq/0.log" Mar 20 14:47:51 crc kubenswrapper[4973]: I0320 14:47:51.897080 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-7sp58_7673cb4f-1440-480a-8d50-2640987b8a0f/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:47:51 crc kubenswrapper[4973]: I0320 14:47:51.928925 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-82xbs_c1eea56a-055e-400a-8300-cde71ecad667/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:47:52 crc kubenswrapper[4973]: I0320 14:47:52.156983 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-gkhr6_da2fdea4-15c3-4312-9e6f-bbff469b8feb/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:47:52 crc kubenswrapper[4973]: I0320 14:47:52.194367 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-cvmvr_02613b7a-06ef-4a92-8d7e-55dd12481786/ssh-known-hosts-edpm-deployment/0.log" Mar 20 14:47:52 crc kubenswrapper[4973]: I0320 14:47:52.549282 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-849bfc9c67-jsf5j_28f1bcfd-788c-47fa-a462-cd5068ec34d2/proxy-server/0.log" Mar 20 14:47:52 crc kubenswrapper[4973]: I0320 14:47:52.636103 4973 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_swift-proxy-849bfc9c67-jsf5j_28f1bcfd-788c-47fa-a462-cd5068ec34d2/proxy-httpd/0.log" Mar 20 14:47:52 crc kubenswrapper[4973]: I0320 14:47:52.746860 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-qjvlf_dbc7e778-9029-42ce-9a8e-e76636aad6a5/swift-ring-rebalance/0.log" Mar 20 14:47:52 crc kubenswrapper[4973]: I0320 14:47:52.886356 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8184c0e7-f9ef-48a3-9461-5cc6c1188e6b/account-reaper/0.log" Mar 20 14:47:52 crc kubenswrapper[4973]: I0320 14:47:52.932975 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8184c0e7-f9ef-48a3-9461-5cc6c1188e6b/account-auditor/0.log" Mar 20 14:47:53 crc kubenswrapper[4973]: I0320 14:47:53.101910 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8184c0e7-f9ef-48a3-9461-5cc6c1188e6b/account-replicator/0.log" Mar 20 14:47:53 crc kubenswrapper[4973]: I0320 14:47:53.175305 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8184c0e7-f9ef-48a3-9461-5cc6c1188e6b/account-server/0.log" Mar 20 14:47:53 crc kubenswrapper[4973]: I0320 14:47:53.188915 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8184c0e7-f9ef-48a3-9461-5cc6c1188e6b/container-auditor/0.log" Mar 20 14:47:53 crc kubenswrapper[4973]: I0320 14:47:53.279798 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8184c0e7-f9ef-48a3-9461-5cc6c1188e6b/container-replicator/0.log" Mar 20 14:47:53 crc kubenswrapper[4973]: I0320 14:47:53.408068 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8184c0e7-f9ef-48a3-9461-5cc6c1188e6b/container-server/0.log" Mar 20 14:47:53 crc kubenswrapper[4973]: I0320 14:47:53.412768 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_8184c0e7-f9ef-48a3-9461-5cc6c1188e6b/container-updater/0.log" Mar 20 14:47:53 crc kubenswrapper[4973]: I0320 14:47:53.427049 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8184c0e7-f9ef-48a3-9461-5cc6c1188e6b/object-auditor/0.log" Mar 20 14:47:53 crc kubenswrapper[4973]: I0320 14:47:53.543725 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8184c0e7-f9ef-48a3-9461-5cc6c1188e6b/object-expirer/0.log" Mar 20 14:47:53 crc kubenswrapper[4973]: I0320 14:47:53.692601 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8184c0e7-f9ef-48a3-9461-5cc6c1188e6b/object-server/0.log" Mar 20 14:47:53 crc kubenswrapper[4973]: I0320 14:47:53.724225 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8184c0e7-f9ef-48a3-9461-5cc6c1188e6b/object-updater/0.log" Mar 20 14:47:53 crc kubenswrapper[4973]: I0320 14:47:53.767458 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8184c0e7-f9ef-48a3-9461-5cc6c1188e6b/object-replicator/0.log" Mar 20 14:47:53 crc kubenswrapper[4973]: I0320 14:47:53.846277 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8184c0e7-f9ef-48a3-9461-5cc6c1188e6b/rsync/0.log" Mar 20 14:47:54 crc kubenswrapper[4973]: I0320 14:47:54.036941 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_8184c0e7-f9ef-48a3-9461-5cc6c1188e6b/swift-recon-cron/0.log" Mar 20 14:47:54 crc kubenswrapper[4973]: I0320 14:47:54.673681 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-jqp9q_7038f01c-aeff-4322-bfd1-715445d5d1cb/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:47:54 crc kubenswrapper[4973]: I0320 14:47:54.853953 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_66048617-923b-4595-bc01-4fedc0092948/test-operator-logs-container/0.log" Mar 20 14:47:55 crc kubenswrapper[4973]: I0320 14:47:55.067554 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-power-monitoring-edpm-deployment-openstack-edpm-j887g_5972691e-cb05-4ded-b36a-b045d3b4726f/telemetry-power-monitoring-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:47:55 crc kubenswrapper[4973]: I0320 14:47:55.098136 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_b5edf151-174a-4c18-b733-318653db1c6e/tempest-tests-tempest-tests-runner/0.log" Mar 20 14:47:55 crc kubenswrapper[4973]: I0320 14:47:55.147155 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-jc7dj_72622eaf-f7fc-44a6-9700-4de40de09009/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 20 14:47:59 crc kubenswrapper[4973]: I0320 14:47:59.857913 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_144466b4-e5d6-4cca-b67b-b76dc24f8dea/memcached/0.log" Mar 20 14:47:59 crc kubenswrapper[4973]: I0320 14:47:59.958692 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:47:59 crc kubenswrapper[4973]: E0320 14:47:59.959189 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:48:00 crc kubenswrapper[4973]: I0320 14:48:00.154889 4973 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-infra/auto-csr-approver-29566968-v4vj6"] Mar 20 14:48:00 crc kubenswrapper[4973]: E0320 14:48:00.176795 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52fa9a9d-9f39-4775-bc42-b41e0f237e09" containerName="container-00" Mar 20 14:48:00 crc kubenswrapper[4973]: I0320 14:48:00.176830 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="52fa9a9d-9f39-4775-bc42-b41e0f237e09" containerName="container-00" Mar 20 14:48:00 crc kubenswrapper[4973]: I0320 14:48:00.177173 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="52fa9a9d-9f39-4775-bc42-b41e0f237e09" containerName="container-00" Mar 20 14:48:00 crc kubenswrapper[4973]: I0320 14:48:00.177990 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566968-v4vj6"] Mar 20 14:48:00 crc kubenswrapper[4973]: I0320 14:48:00.178069 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566968-v4vj6" Mar 20 14:48:00 crc kubenswrapper[4973]: I0320 14:48:00.190862 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:48:00 crc kubenswrapper[4973]: I0320 14:48:00.190931 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:48:00 crc kubenswrapper[4973]: I0320 14:48:00.191065 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:48:00 crc kubenswrapper[4973]: I0320 14:48:00.330621 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmvcw\" (UniqueName: \"kubernetes.io/projected/e3982fe5-3325-4510-be69-024ac254cc8e-kube-api-access-bmvcw\") pod \"auto-csr-approver-29566968-v4vj6\" (UID: \"e3982fe5-3325-4510-be69-024ac254cc8e\") " pod="openshift-infra/auto-csr-approver-29566968-v4vj6" Mar 20 14:48:00 
crc kubenswrapper[4973]: I0320 14:48:00.433749 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmvcw\" (UniqueName: \"kubernetes.io/projected/e3982fe5-3325-4510-be69-024ac254cc8e-kube-api-access-bmvcw\") pod \"auto-csr-approver-29566968-v4vj6\" (UID: \"e3982fe5-3325-4510-be69-024ac254cc8e\") " pod="openshift-infra/auto-csr-approver-29566968-v4vj6" Mar 20 14:48:00 crc kubenswrapper[4973]: I0320 14:48:00.452846 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmvcw\" (UniqueName: \"kubernetes.io/projected/e3982fe5-3325-4510-be69-024ac254cc8e-kube-api-access-bmvcw\") pod \"auto-csr-approver-29566968-v4vj6\" (UID: \"e3982fe5-3325-4510-be69-024ac254cc8e\") " pod="openshift-infra/auto-csr-approver-29566968-v4vj6" Mar 20 14:48:00 crc kubenswrapper[4973]: I0320 14:48:00.514538 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566968-v4vj6" Mar 20 14:48:01 crc kubenswrapper[4973]: I0320 14:48:01.334058 4973 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 14:48:01 crc kubenswrapper[4973]: I0320 14:48:01.336400 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566968-v4vj6"] Mar 20 14:48:01 crc kubenswrapper[4973]: I0320 14:48:01.356572 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566968-v4vj6" event={"ID":"e3982fe5-3325-4510-be69-024ac254cc8e","Type":"ContainerStarted","Data":"1033533f9517897356ade4f50d0858529d3fedf6f900f108c965670ebcc521db"} Mar 20 14:48:03 crc kubenswrapper[4973]: I0320 14:48:03.386241 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566968-v4vj6" event={"ID":"e3982fe5-3325-4510-be69-024ac254cc8e","Type":"ContainerStarted","Data":"24161947c1d5f03ecc35f03ac4bd3e7e9cef313f7a61ef878828c7c11eb089de"} Mar 20 
14:48:03 crc kubenswrapper[4973]: I0320 14:48:03.418225 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566968-v4vj6" podStartSLOduration=1.8512414000000001 podStartE2EDuration="3.418204045s" podCreationTimestamp="2026-03-20 14:48:00 +0000 UTC" firstStartedPulling="2026-03-20 14:48:01.331525976 +0000 UTC m=+5202.075195710" lastFinishedPulling="2026-03-20 14:48:02.898488611 +0000 UTC m=+5203.642158355" observedRunningTime="2026-03-20 14:48:03.414074552 +0000 UTC m=+5204.157744296" watchObservedRunningTime="2026-03-20 14:48:03.418204045 +0000 UTC m=+5204.161873789" Mar 20 14:48:05 crc kubenswrapper[4973]: I0320 14:48:05.408816 4973 generic.go:334] "Generic (PLEG): container finished" podID="e3982fe5-3325-4510-be69-024ac254cc8e" containerID="24161947c1d5f03ecc35f03ac4bd3e7e9cef313f7a61ef878828c7c11eb089de" exitCode=0 Mar 20 14:48:05 crc kubenswrapper[4973]: I0320 14:48:05.408915 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566968-v4vj6" event={"ID":"e3982fe5-3325-4510-be69-024ac254cc8e","Type":"ContainerDied","Data":"24161947c1d5f03ecc35f03ac4bd3e7e9cef313f7a61ef878828c7c11eb089de"} Mar 20 14:48:07 crc kubenswrapper[4973]: I0320 14:48:07.359627 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566968-v4vj6" Mar 20 14:48:07 crc kubenswrapper[4973]: I0320 14:48:07.431470 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566968-v4vj6" event={"ID":"e3982fe5-3325-4510-be69-024ac254cc8e","Type":"ContainerDied","Data":"1033533f9517897356ade4f50d0858529d3fedf6f900f108c965670ebcc521db"} Mar 20 14:48:07 crc kubenswrapper[4973]: I0320 14:48:07.431515 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566968-v4vj6" Mar 20 14:48:07 crc kubenswrapper[4973]: I0320 14:48:07.431536 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1033533f9517897356ade4f50d0858529d3fedf6f900f108c965670ebcc521db" Mar 20 14:48:07 crc kubenswrapper[4973]: I0320 14:48:07.493536 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566962-2sqjf"] Mar 20 14:48:07 crc kubenswrapper[4973]: I0320 14:48:07.505536 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566962-2sqjf"] Mar 20 14:48:07 crc kubenswrapper[4973]: I0320 14:48:07.526894 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmvcw\" (UniqueName: \"kubernetes.io/projected/e3982fe5-3325-4510-be69-024ac254cc8e-kube-api-access-bmvcw\") pod \"e3982fe5-3325-4510-be69-024ac254cc8e\" (UID: \"e3982fe5-3325-4510-be69-024ac254cc8e\") " Mar 20 14:48:07 crc kubenswrapper[4973]: I0320 14:48:07.532598 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3982fe5-3325-4510-be69-024ac254cc8e-kube-api-access-bmvcw" (OuterVolumeSpecName: "kube-api-access-bmvcw") pod "e3982fe5-3325-4510-be69-024ac254cc8e" (UID: "e3982fe5-3325-4510-be69-024ac254cc8e"). InnerVolumeSpecName "kube-api-access-bmvcw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:48:07 crc kubenswrapper[4973]: I0320 14:48:07.630624 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmvcw\" (UniqueName: \"kubernetes.io/projected/e3982fe5-3325-4510-be69-024ac254cc8e-kube-api-access-bmvcw\") on node \"crc\" DevicePath \"\"" Mar 20 14:48:07 crc kubenswrapper[4973]: I0320 14:48:07.965523 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46a36df3-9f0c-4e6a-b959-5542da9c7534" path="/var/lib/kubelet/pods/46a36df3-9f0c-4e6a-b959-5542da9c7534/volumes" Mar 20 14:48:10 crc kubenswrapper[4973]: I0320 14:48:10.951577 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:48:10 crc kubenswrapper[4973]: E0320 14:48:10.952794 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:48:25 crc kubenswrapper[4973]: I0320 14:48:25.953385 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:48:25 crc kubenswrapper[4973]: E0320 14:48:25.955487 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:48:31 crc kubenswrapper[4973]: I0320 14:48:31.281097 4973 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b_112512e7-8063-4695-9c7e-c4bdb94bd796/util/0.log" Mar 20 14:48:31 crc kubenswrapper[4973]: I0320 14:48:31.486543 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b_112512e7-8063-4695-9c7e-c4bdb94bd796/util/0.log" Mar 20 14:48:31 crc kubenswrapper[4973]: I0320 14:48:31.508430 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b_112512e7-8063-4695-9c7e-c4bdb94bd796/pull/0.log" Mar 20 14:48:31 crc kubenswrapper[4973]: I0320 14:48:31.519757 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b_112512e7-8063-4695-9c7e-c4bdb94bd796/pull/0.log" Mar 20 14:48:31 crc kubenswrapper[4973]: I0320 14:48:31.793604 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b_112512e7-8063-4695-9c7e-c4bdb94bd796/util/0.log" Mar 20 14:48:31 crc kubenswrapper[4973]: I0320 14:48:31.795757 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b_112512e7-8063-4695-9c7e-c4bdb94bd796/extract/0.log" Mar 20 14:48:31 crc kubenswrapper[4973]: I0320 14:48:31.933552 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3643d6a8a7a036080dc86533e7ba8bce9523a8fe2de6ac00fdf8920243sln6b_112512e7-8063-4695-9c7e-c4bdb94bd796/pull/0.log" Mar 20 14:48:32 crc kubenswrapper[4973]: I0320 14:48:32.257650 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-w7228_9d106cd3-cadb-4cc7-b237-f05294c67dcd/manager/0.log" Mar 20 14:48:32 crc kubenswrapper[4973]: I0320 14:48:32.438814 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-2sflb_96f5f9d5-bb8c-497c-bfbb-8fd46342ce69/manager/0.log" Mar 20 14:48:32 crc kubenswrapper[4973]: I0320 14:48:32.716932 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-rm7q7_7260cd47-ce83-44db-951d-757908bf5953/manager/0.log" Mar 20 14:48:33 crc kubenswrapper[4973]: I0320 14:48:33.015107 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-z4dzx_3233d229-1d2f-4c90-b76a-f27ca914f0ad/manager/0.log" Mar 20 14:48:33 crc kubenswrapper[4973]: I0320 14:48:33.048985 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-zmqsw_84e8fd4a-d562-4e92-adfc-479867cf9d3a/manager/0.log" Mar 20 14:48:33 crc kubenswrapper[4973]: I0320 14:48:33.614820 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-qxlrn_13c558f8-2e66-49c3-b184-7fdbbf4ff6b1/manager/0.log" Mar 20 14:48:33 crc kubenswrapper[4973]: I0320 14:48:33.905595 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-27jg7_c5689d29-0b82-4482-9444-5da20e2da57a/manager/0.log" Mar 20 14:48:33 crc kubenswrapper[4973]: I0320 14:48:33.979968 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-jn8rq_c0abcba1-e57c-4d90-a8cb-61989da15e87/manager/0.log" Mar 20 14:48:34 crc kubenswrapper[4973]: I0320 14:48:34.209695 4973 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-gs46s_530d31a0-48a0-4d06-9b03-c9c205312bdc/manager/0.log" Mar 20 14:48:34 crc kubenswrapper[4973]: I0320 14:48:34.371170 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-crjg2_97aab498-21c1-476f-a64b-a526745fc64a/manager/0.log" Mar 20 14:48:34 crc kubenswrapper[4973]: I0320 14:48:34.482381 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-pzl6t_dbb02721-66ce-44b6-bffe-59851197efa8/manager/0.log" Mar 20 14:48:34 crc kubenswrapper[4973]: I0320 14:48:34.669193 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-kzx4f_45fa2923-6b6f-44da-8693-6ee06c476a8f/manager/0.log" Mar 20 14:48:34 crc kubenswrapper[4973]: I0320 14:48:34.803662 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-p6vx2_d5a271f2-b17d-487d-a61b-00bd17841392/manager/0.log" Mar 20 14:48:34 crc kubenswrapper[4973]: I0320 14:48:34.945985 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-x4r2t_39227253-9885-4ba2-a216-c04066dc7c84/manager/0.log" Mar 20 14:48:35 crc kubenswrapper[4973]: I0320 14:48:35.052814 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-fjztv_3c89c7dd-500b-4bd5-a30e-273c2a485728/manager/0.log" Mar 20 14:48:35 crc kubenswrapper[4973]: I0320 14:48:35.236966 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-6f7459b8bf-nf8rm_bad13d41-c3be-4f23-b40f-f621e669ef5b/operator/0.log" Mar 20 14:48:35 crc kubenswrapper[4973]: I0320 14:48:35.415793 4973 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-kwvd2_f580709c-eab2-41f5-96b4-2e32cf02cdcb/registry-server/0.log" Mar 20 14:48:35 crc kubenswrapper[4973]: I0320 14:48:35.693278 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-9dv29_106eb66b-ca71-49b1-a80e-699f34ac9df9/manager/0.log" Mar 20 14:48:35 crc kubenswrapper[4973]: I0320 14:48:35.709443 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qwwgt"] Mar 20 14:48:35 crc kubenswrapper[4973]: E0320 14:48:35.710123 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3982fe5-3325-4510-be69-024ac254cc8e" containerName="oc" Mar 20 14:48:35 crc kubenswrapper[4973]: I0320 14:48:35.710141 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3982fe5-3325-4510-be69-024ac254cc8e" containerName="oc" Mar 20 14:48:35 crc kubenswrapper[4973]: I0320 14:48:35.710496 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3982fe5-3325-4510-be69-024ac254cc8e" containerName="oc" Mar 20 14:48:35 crc kubenswrapper[4973]: I0320 14:48:35.719093 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qwwgt" Mar 20 14:48:35 crc kubenswrapper[4973]: I0320 14:48:35.767975 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qwwgt"] Mar 20 14:48:35 crc kubenswrapper[4973]: I0320 14:48:35.770071 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-2zn7v_991cca2c-022f-4c90-a1ba-287191fc2d49/manager/0.log" Mar 20 14:48:35 crc kubenswrapper[4973]: I0320 14:48:35.858100 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20a25598-3d74-4fa2-8f68-0240df2b108c-utilities\") pod \"certified-operators-qwwgt\" (UID: \"20a25598-3d74-4fa2-8f68-0240df2b108c\") " pod="openshift-marketplace/certified-operators-qwwgt" Mar 20 14:48:35 crc kubenswrapper[4973]: I0320 14:48:35.858247 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20a25598-3d74-4fa2-8f68-0240df2b108c-catalog-content\") pod \"certified-operators-qwwgt\" (UID: \"20a25598-3d74-4fa2-8f68-0240df2b108c\") " pod="openshift-marketplace/certified-operators-qwwgt" Mar 20 14:48:35 crc kubenswrapper[4973]: I0320 14:48:35.858454 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4scn2\" (UniqueName: \"kubernetes.io/projected/20a25598-3d74-4fa2-8f68-0240df2b108c-kube-api-access-4scn2\") pod \"certified-operators-qwwgt\" (UID: \"20a25598-3d74-4fa2-8f68-0240df2b108c\") " pod="openshift-marketplace/certified-operators-qwwgt" Mar 20 14:48:35 crc kubenswrapper[4973]: I0320 14:48:35.962659 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4scn2\" (UniqueName: 
\"kubernetes.io/projected/20a25598-3d74-4fa2-8f68-0240df2b108c-kube-api-access-4scn2\") pod \"certified-operators-qwwgt\" (UID: \"20a25598-3d74-4fa2-8f68-0240df2b108c\") " pod="openshift-marketplace/certified-operators-qwwgt" Mar 20 14:48:35 crc kubenswrapper[4973]: I0320 14:48:35.962850 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20a25598-3d74-4fa2-8f68-0240df2b108c-utilities\") pod \"certified-operators-qwwgt\" (UID: \"20a25598-3d74-4fa2-8f68-0240df2b108c\") " pod="openshift-marketplace/certified-operators-qwwgt" Mar 20 14:48:35 crc kubenswrapper[4973]: I0320 14:48:35.963018 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20a25598-3d74-4fa2-8f68-0240df2b108c-catalog-content\") pod \"certified-operators-qwwgt\" (UID: \"20a25598-3d74-4fa2-8f68-0240df2b108c\") " pod="openshift-marketplace/certified-operators-qwwgt" Mar 20 14:48:35 crc kubenswrapper[4973]: I0320 14:48:35.966324 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20a25598-3d74-4fa2-8f68-0240df2b108c-catalog-content\") pod \"certified-operators-qwwgt\" (UID: \"20a25598-3d74-4fa2-8f68-0240df2b108c\") " pod="openshift-marketplace/certified-operators-qwwgt" Mar 20 14:48:35 crc kubenswrapper[4973]: I0320 14:48:35.966413 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20a25598-3d74-4fa2-8f68-0240df2b108c-utilities\") pod \"certified-operators-qwwgt\" (UID: \"20a25598-3d74-4fa2-8f68-0240df2b108c\") " pod="openshift-marketplace/certified-operators-qwwgt" Mar 20 14:48:35 crc kubenswrapper[4973]: I0320 14:48:35.987124 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4scn2\" (UniqueName: 
\"kubernetes.io/projected/20a25598-3d74-4fa2-8f68-0240df2b108c-kube-api-access-4scn2\") pod \"certified-operators-qwwgt\" (UID: \"20a25598-3d74-4fa2-8f68-0240df2b108c\") " pod="openshift-marketplace/certified-operators-qwwgt" Mar 20 14:48:36 crc kubenswrapper[4973]: I0320 14:48:36.014599 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-6bzww_8b21705a-8662-4147-9a10-9a95982d961c/manager/0.log" Mar 20 14:48:36 crc kubenswrapper[4973]: I0320 14:48:36.054095 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qwwgt" Mar 20 14:48:36 crc kubenswrapper[4973]: I0320 14:48:36.156811 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-tgmwc_88d5fd4b-c230-4b94-b988-0b79ec98d991/operator/0.log" Mar 20 14:48:36 crc kubenswrapper[4973]: I0320 14:48:36.545170 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-xlqnk_aa34abe0-30d3-4d49-9f20-c15990a91a36/manager/0.log" Mar 20 14:48:36 crc kubenswrapper[4973]: I0320 14:48:36.656252 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qwwgt"] Mar 20 14:48:36 crc kubenswrapper[4973]: I0320 14:48:36.794530 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qwwgt" event={"ID":"20a25598-3d74-4fa2-8f68-0240df2b108c","Type":"ContainerStarted","Data":"f7e4f7f85923d45bd05d0011b7498eac42f114f9b91f5856df98ce16dc4b11c0"} Mar 20 14:48:36 crc kubenswrapper[4973]: I0320 14:48:36.898779 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-kzrpr_5a41f6b9-9f79-454a-af8a-c0ad746f1d42/manager/0.log" Mar 20 14:48:36 crc kubenswrapper[4973]: I0320 14:48:36.953811 4973 scope.go:117] 
"RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:48:36 crc kubenswrapper[4973]: E0320 14:48:36.959591 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:48:36 crc kubenswrapper[4973]: I0320 14:48:36.966198 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-fbb6f4f4f-nh7dd_ecf17bc8-3c8a-4791-a205-2bdc718ec15f/manager/0.log" Mar 20 14:48:37 crc kubenswrapper[4973]: I0320 14:48:37.268790 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-78865ff6b4-wlv2f_835537e8-dced-4516-a7b9-168d9bb6b687/manager/0.log" Mar 20 14:48:37 crc kubenswrapper[4973]: I0320 14:48:37.809329 4973 generic.go:334] "Generic (PLEG): container finished" podID="20a25598-3d74-4fa2-8f68-0240df2b108c" containerID="058f7400ce2dd4a1da52f99548c8d65b4b51d90b146cb9c9686a8059a8d98cd8" exitCode=0 Mar 20 14:48:37 crc kubenswrapper[4973]: I0320 14:48:37.809399 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qwwgt" event={"ID":"20a25598-3d74-4fa2-8f68-0240df2b108c","Type":"ContainerDied","Data":"058f7400ce2dd4a1da52f99548c8d65b4b51d90b146cb9c9686a8059a8d98cd8"} Mar 20 14:48:38 crc kubenswrapper[4973]: I0320 14:48:38.823324 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qwwgt" event={"ID":"20a25598-3d74-4fa2-8f68-0240df2b108c","Type":"ContainerStarted","Data":"610cbeecf1368769337c4b72230ae5243de034a2b5fff117cde58058349998df"} 
Mar 20 14:48:41 crc kubenswrapper[4973]: I0320 14:48:41.865381 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qwwgt" event={"ID":"20a25598-3d74-4fa2-8f68-0240df2b108c","Type":"ContainerDied","Data":"610cbeecf1368769337c4b72230ae5243de034a2b5fff117cde58058349998df"} Mar 20 14:48:41 crc kubenswrapper[4973]: I0320 14:48:41.865283 4973 generic.go:334] "Generic (PLEG): container finished" podID="20a25598-3d74-4fa2-8f68-0240df2b108c" containerID="610cbeecf1368769337c4b72230ae5243de034a2b5fff117cde58058349998df" exitCode=0 Mar 20 14:48:43 crc kubenswrapper[4973]: I0320 14:48:43.892684 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qwwgt" event={"ID":"20a25598-3d74-4fa2-8f68-0240df2b108c","Type":"ContainerStarted","Data":"f2801d0c0dcbbcee1e6c1d1b11358afedd95ff5932cdc525247889cb0ecae319"} Mar 20 14:48:43 crc kubenswrapper[4973]: I0320 14:48:43.922816 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qwwgt" podStartSLOduration=4.296611831 podStartE2EDuration="8.922792583s" podCreationTimestamp="2026-03-20 14:48:35 +0000 UTC" firstStartedPulling="2026-03-20 14:48:37.811779922 +0000 UTC m=+5238.555449666" lastFinishedPulling="2026-03-20 14:48:42.437960674 +0000 UTC m=+5243.181630418" observedRunningTime="2026-03-20 14:48:43.914917227 +0000 UTC m=+5244.658586991" watchObservedRunningTime="2026-03-20 14:48:43.922792583 +0000 UTC m=+5244.666462347" Mar 20 14:48:46 crc kubenswrapper[4973]: I0320 14:48:46.055219 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qwwgt" Mar 20 14:48:46 crc kubenswrapper[4973]: I0320 14:48:46.055570 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qwwgt" Mar 20 14:48:47 crc kubenswrapper[4973]: I0320 14:48:47.125779 4973 prober.go:107] 
"Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-qwwgt" podUID="20a25598-3d74-4fa2-8f68-0240df2b108c" containerName="registry-server" probeResult="failure" output=< Mar 20 14:48:47 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:48:47 crc kubenswrapper[4973]: > Mar 20 14:48:48 crc kubenswrapper[4973]: I0320 14:48:48.951195 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:48:48 crc kubenswrapper[4973]: E0320 14:48:48.952498 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:48:56 crc kubenswrapper[4973]: I0320 14:48:56.129480 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qwwgt" Mar 20 14:48:56 crc kubenswrapper[4973]: I0320 14:48:56.200621 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qwwgt" Mar 20 14:48:56 crc kubenswrapper[4973]: I0320 14:48:56.375899 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qwwgt"] Mar 20 14:48:58 crc kubenswrapper[4973]: I0320 14:48:58.050257 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qwwgt" podUID="20a25598-3d74-4fa2-8f68-0240df2b108c" containerName="registry-server" containerID="cri-o://f2801d0c0dcbbcee1e6c1d1b11358afedd95ff5932cdc525247889cb0ecae319" gracePeriod=2 Mar 20 14:48:58 crc kubenswrapper[4973]: I0320 14:48:58.653321 
4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qwwgt" Mar 20 14:48:58 crc kubenswrapper[4973]: I0320 14:48:58.688736 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4scn2\" (UniqueName: \"kubernetes.io/projected/20a25598-3d74-4fa2-8f68-0240df2b108c-kube-api-access-4scn2\") pod \"20a25598-3d74-4fa2-8f68-0240df2b108c\" (UID: \"20a25598-3d74-4fa2-8f68-0240df2b108c\") " Mar 20 14:48:58 crc kubenswrapper[4973]: I0320 14:48:58.688841 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20a25598-3d74-4fa2-8f68-0240df2b108c-utilities\") pod \"20a25598-3d74-4fa2-8f68-0240df2b108c\" (UID: \"20a25598-3d74-4fa2-8f68-0240df2b108c\") " Mar 20 14:48:58 crc kubenswrapper[4973]: I0320 14:48:58.689157 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20a25598-3d74-4fa2-8f68-0240df2b108c-catalog-content\") pod \"20a25598-3d74-4fa2-8f68-0240df2b108c\" (UID: \"20a25598-3d74-4fa2-8f68-0240df2b108c\") " Mar 20 14:48:58 crc kubenswrapper[4973]: I0320 14:48:58.697851 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20a25598-3d74-4fa2-8f68-0240df2b108c-kube-api-access-4scn2" (OuterVolumeSpecName: "kube-api-access-4scn2") pod "20a25598-3d74-4fa2-8f68-0240df2b108c" (UID: "20a25598-3d74-4fa2-8f68-0240df2b108c"). InnerVolumeSpecName "kube-api-access-4scn2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:48:58 crc kubenswrapper[4973]: I0320 14:48:58.718967 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20a25598-3d74-4fa2-8f68-0240df2b108c-utilities" (OuterVolumeSpecName: "utilities") pod "20a25598-3d74-4fa2-8f68-0240df2b108c" (UID: "20a25598-3d74-4fa2-8f68-0240df2b108c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:48:58 crc kubenswrapper[4973]: I0320 14:48:58.780416 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20a25598-3d74-4fa2-8f68-0240df2b108c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20a25598-3d74-4fa2-8f68-0240df2b108c" (UID: "20a25598-3d74-4fa2-8f68-0240df2b108c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:48:58 crc kubenswrapper[4973]: I0320 14:48:58.791989 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20a25598-3d74-4fa2-8f68-0240df2b108c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:48:58 crc kubenswrapper[4973]: I0320 14:48:58.792049 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4scn2\" (UniqueName: \"kubernetes.io/projected/20a25598-3d74-4fa2-8f68-0240df2b108c-kube-api-access-4scn2\") on node \"crc\" DevicePath \"\"" Mar 20 14:48:58 crc kubenswrapper[4973]: I0320 14:48:58.792063 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20a25598-3d74-4fa2-8f68-0240df2b108c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:48:59 crc kubenswrapper[4973]: I0320 14:48:59.065975 4973 generic.go:334] "Generic (PLEG): container finished" podID="20a25598-3d74-4fa2-8f68-0240df2b108c" containerID="f2801d0c0dcbbcee1e6c1d1b11358afedd95ff5932cdc525247889cb0ecae319" exitCode=0 Mar 
20 14:48:59 crc kubenswrapper[4973]: I0320 14:48:59.066025 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qwwgt" event={"ID":"20a25598-3d74-4fa2-8f68-0240df2b108c","Type":"ContainerDied","Data":"f2801d0c0dcbbcee1e6c1d1b11358afedd95ff5932cdc525247889cb0ecae319"} Mar 20 14:48:59 crc kubenswrapper[4973]: I0320 14:48:59.066052 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qwwgt" event={"ID":"20a25598-3d74-4fa2-8f68-0240df2b108c","Type":"ContainerDied","Data":"f7e4f7f85923d45bd05d0011b7498eac42f114f9b91f5856df98ce16dc4b11c0"} Mar 20 14:48:59 crc kubenswrapper[4973]: I0320 14:48:59.066055 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qwwgt" Mar 20 14:48:59 crc kubenswrapper[4973]: I0320 14:48:59.066070 4973 scope.go:117] "RemoveContainer" containerID="f2801d0c0dcbbcee1e6c1d1b11358afedd95ff5932cdc525247889cb0ecae319" Mar 20 14:48:59 crc kubenswrapper[4973]: I0320 14:48:59.093910 4973 scope.go:117] "RemoveContainer" containerID="610cbeecf1368769337c4b72230ae5243de034a2b5fff117cde58058349998df" Mar 20 14:48:59 crc kubenswrapper[4973]: I0320 14:48:59.127994 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qwwgt"] Mar 20 14:48:59 crc kubenswrapper[4973]: I0320 14:48:59.140734 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qwwgt"] Mar 20 14:48:59 crc kubenswrapper[4973]: I0320 14:48:59.141501 4973 scope.go:117] "RemoveContainer" containerID="058f7400ce2dd4a1da52f99548c8d65b4b51d90b146cb9c9686a8059a8d98cd8" Mar 20 14:48:59 crc kubenswrapper[4973]: I0320 14:48:59.228319 4973 scope.go:117] "RemoveContainer" containerID="f2801d0c0dcbbcee1e6c1d1b11358afedd95ff5932cdc525247889cb0ecae319" Mar 20 14:48:59 crc kubenswrapper[4973]: E0320 14:48:59.228994 4973 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2801d0c0dcbbcee1e6c1d1b11358afedd95ff5932cdc525247889cb0ecae319\": container with ID starting with f2801d0c0dcbbcee1e6c1d1b11358afedd95ff5932cdc525247889cb0ecae319 not found: ID does not exist" containerID="f2801d0c0dcbbcee1e6c1d1b11358afedd95ff5932cdc525247889cb0ecae319" Mar 20 14:48:59 crc kubenswrapper[4973]: I0320 14:48:59.229018 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2801d0c0dcbbcee1e6c1d1b11358afedd95ff5932cdc525247889cb0ecae319"} err="failed to get container status \"f2801d0c0dcbbcee1e6c1d1b11358afedd95ff5932cdc525247889cb0ecae319\": rpc error: code = NotFound desc = could not find container \"f2801d0c0dcbbcee1e6c1d1b11358afedd95ff5932cdc525247889cb0ecae319\": container with ID starting with f2801d0c0dcbbcee1e6c1d1b11358afedd95ff5932cdc525247889cb0ecae319 not found: ID does not exist" Mar 20 14:48:59 crc kubenswrapper[4973]: I0320 14:48:59.229042 4973 scope.go:117] "RemoveContainer" containerID="610cbeecf1368769337c4b72230ae5243de034a2b5fff117cde58058349998df" Mar 20 14:48:59 crc kubenswrapper[4973]: E0320 14:48:59.229370 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"610cbeecf1368769337c4b72230ae5243de034a2b5fff117cde58058349998df\": container with ID starting with 610cbeecf1368769337c4b72230ae5243de034a2b5fff117cde58058349998df not found: ID does not exist" containerID="610cbeecf1368769337c4b72230ae5243de034a2b5fff117cde58058349998df" Mar 20 14:48:59 crc kubenswrapper[4973]: I0320 14:48:59.229395 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"610cbeecf1368769337c4b72230ae5243de034a2b5fff117cde58058349998df"} err="failed to get container status \"610cbeecf1368769337c4b72230ae5243de034a2b5fff117cde58058349998df\": rpc error: code = NotFound desc = could not find container 
\"610cbeecf1368769337c4b72230ae5243de034a2b5fff117cde58058349998df\": container with ID starting with 610cbeecf1368769337c4b72230ae5243de034a2b5fff117cde58058349998df not found: ID does not exist" Mar 20 14:48:59 crc kubenswrapper[4973]: I0320 14:48:59.229410 4973 scope.go:117] "RemoveContainer" containerID="058f7400ce2dd4a1da52f99548c8d65b4b51d90b146cb9c9686a8059a8d98cd8" Mar 20 14:48:59 crc kubenswrapper[4973]: E0320 14:48:59.229697 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"058f7400ce2dd4a1da52f99548c8d65b4b51d90b146cb9c9686a8059a8d98cd8\": container with ID starting with 058f7400ce2dd4a1da52f99548c8d65b4b51d90b146cb9c9686a8059a8d98cd8 not found: ID does not exist" containerID="058f7400ce2dd4a1da52f99548c8d65b4b51d90b146cb9c9686a8059a8d98cd8" Mar 20 14:48:59 crc kubenswrapper[4973]: I0320 14:48:59.229720 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"058f7400ce2dd4a1da52f99548c8d65b4b51d90b146cb9c9686a8059a8d98cd8"} err="failed to get container status \"058f7400ce2dd4a1da52f99548c8d65b4b51d90b146cb9c9686a8059a8d98cd8\": rpc error: code = NotFound desc = could not find container \"058f7400ce2dd4a1da52f99548c8d65b4b51d90b146cb9c9686a8059a8d98cd8\": container with ID starting with 058f7400ce2dd4a1da52f99548c8d65b4b51d90b146cb9c9686a8059a8d98cd8 not found: ID does not exist" Mar 20 14:49:00 crc kubenswrapper[4973]: I0320 14:49:00.010500 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20a25598-3d74-4fa2-8f68-0240df2b108c" path="/var/lib/kubelet/pods/20a25598-3d74-4fa2-8f68-0240df2b108c/volumes" Mar 20 14:49:01 crc kubenswrapper[4973]: I0320 14:49:01.950594 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:49:01 crc kubenswrapper[4973]: E0320 14:49:01.951296 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:49:02 crc kubenswrapper[4973]: I0320 14:49:02.600456 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-wkc4c_b22a43e3-90de-4609-bf64-006de1716ae3/control-plane-machine-set-operator/0.log" Mar 20 14:49:02 crc kubenswrapper[4973]: I0320 14:49:02.916234 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jt5vz_0b6717fd-e636-4be6-8c04-c9b46924a3b2/kube-rbac-proxy/0.log" Mar 20 14:49:02 crc kubenswrapper[4973]: I0320 14:49:02.969224 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-jt5vz_0b6717fd-e636-4be6-8c04-c9b46924a3b2/machine-api-operator/0.log" Mar 20 14:49:05 crc kubenswrapper[4973]: I0320 14:49:05.967580 4973 scope.go:117] "RemoveContainer" containerID="cd991d7527d172dd7c86e149b200189738b1b0a10c0503f34f08c3e440cfb9ee" Mar 20 14:49:13 crc kubenswrapper[4973]: I0320 14:49:13.951032 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:49:13 crc kubenswrapper[4973]: E0320 14:49:13.952200 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 
14:49:19 crc kubenswrapper[4973]: I0320 14:49:19.009923 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-94mm7_bfdfa88e-4049-47d3-87f8-07c52f1f51df/cert-manager-controller/0.log" Mar 20 14:49:19 crc kubenswrapper[4973]: I0320 14:49:19.175401 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-5n7n5_606e1a56-8a9a-4c0f-a7e0-646755f64185/cert-manager-cainjector/0.log" Mar 20 14:49:19 crc kubenswrapper[4973]: I0320 14:49:19.323567 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-4qwzv_6f5a8b02-59f4-427d-b91d-e7cacaa1ba23/cert-manager-webhook/0.log" Mar 20 14:49:27 crc kubenswrapper[4973]: I0320 14:49:27.950572 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:49:27 crc kubenswrapper[4973]: E0320 14:49:27.951426 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:49:33 crc kubenswrapper[4973]: I0320 14:49:33.777515 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-bqcmm_720b7a25-1a14-473b-af5e-0d3e764074ee/nmstate-console-plugin/0.log" Mar 20 14:49:34 crc kubenswrapper[4973]: I0320 14:49:34.066457 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-4z4zn_5f74ece1-d945-42b2-a93f-622bb0c63aa7/nmstate-handler/0.log" Mar 20 14:49:34 crc kubenswrapper[4973]: I0320 14:49:34.094918 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-pmf2g_0f84a71d-ca4b-44e1-9008-4a5f34dbaf9d/kube-rbac-proxy/0.log" Mar 20 14:49:34 crc kubenswrapper[4973]: I0320 14:49:34.242541 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-pmf2g_0f84a71d-ca4b-44e1-9008-4a5f34dbaf9d/nmstate-metrics/0.log" Mar 20 14:49:34 crc kubenswrapper[4973]: I0320 14:49:34.311443 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-nvs2k_8b93e9c4-b7b6-407a-b5c7-9fec8ed5a64a/nmstate-operator/0.log" Mar 20 14:49:34 crc kubenswrapper[4973]: I0320 14:49:34.462715 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-9c8nv_2437569f-a833-4666-a051-db0d4818cc5f/nmstate-webhook/0.log" Mar 20 14:49:38 crc kubenswrapper[4973]: I0320 14:49:38.950516 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:49:38 crc kubenswrapper[4973]: E0320 14:49:38.951272 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:49:48 crc kubenswrapper[4973]: I0320 14:49:48.140641 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6996757d8d-46qmw_3bf2a551-4944-4096-99f4-03effa26dde8/kube-rbac-proxy/0.log" Mar 20 14:49:48 crc kubenswrapper[4973]: I0320 14:49:48.229053 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6996757d8d-46qmw_3bf2a551-4944-4096-99f4-03effa26dde8/manager/0.log" Mar 20 14:49:53 crc kubenswrapper[4973]: I0320 14:49:53.951244 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:49:53 crc kubenswrapper[4973]: E0320 14:49:53.952225 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:50:00 crc kubenswrapper[4973]: I0320 14:50:00.159389 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566970-6hdtv"] Mar 20 14:50:00 crc kubenswrapper[4973]: E0320 14:50:00.161177 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a25598-3d74-4fa2-8f68-0240df2b108c" containerName="extract-utilities" Mar 20 14:50:00 crc kubenswrapper[4973]: I0320 14:50:00.161194 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a25598-3d74-4fa2-8f68-0240df2b108c" containerName="extract-utilities" Mar 20 14:50:00 crc kubenswrapper[4973]: E0320 14:50:00.161212 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a25598-3d74-4fa2-8f68-0240df2b108c" containerName="registry-server" Mar 20 14:50:00 crc kubenswrapper[4973]: I0320 14:50:00.161218 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a25598-3d74-4fa2-8f68-0240df2b108c" containerName="registry-server" Mar 20 14:50:00 crc kubenswrapper[4973]: E0320 14:50:00.161237 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20a25598-3d74-4fa2-8f68-0240df2b108c" containerName="extract-content" Mar 
20 14:50:00 crc kubenswrapper[4973]: I0320 14:50:00.161243 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="20a25598-3d74-4fa2-8f68-0240df2b108c" containerName="extract-content" Mar 20 14:50:00 crc kubenswrapper[4973]: I0320 14:50:00.161539 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="20a25598-3d74-4fa2-8f68-0240df2b108c" containerName="registry-server" Mar 20 14:50:00 crc kubenswrapper[4973]: I0320 14:50:00.162619 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566970-6hdtv" Mar 20 14:50:00 crc kubenswrapper[4973]: I0320 14:50:00.167744 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:50:00 crc kubenswrapper[4973]: I0320 14:50:00.169169 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:50:00 crc kubenswrapper[4973]: I0320 14:50:00.169412 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:50:00 crc kubenswrapper[4973]: I0320 14:50:00.171727 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566970-6hdtv"] Mar 20 14:50:00 crc kubenswrapper[4973]: I0320 14:50:00.203927 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc5t8\" (UniqueName: \"kubernetes.io/projected/a43e8c15-2110-4352-b8b1-1919544f46d9-kube-api-access-zc5t8\") pod \"auto-csr-approver-29566970-6hdtv\" (UID: \"a43e8c15-2110-4352-b8b1-1919544f46d9\") " pod="openshift-infra/auto-csr-approver-29566970-6hdtv" Mar 20 14:50:00 crc kubenswrapper[4973]: I0320 14:50:00.306667 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc5t8\" (UniqueName: \"kubernetes.io/projected/a43e8c15-2110-4352-b8b1-1919544f46d9-kube-api-access-zc5t8\") pod 
\"auto-csr-approver-29566970-6hdtv\" (UID: \"a43e8c15-2110-4352-b8b1-1919544f46d9\") " pod="openshift-infra/auto-csr-approver-29566970-6hdtv" Mar 20 14:50:00 crc kubenswrapper[4973]: I0320 14:50:00.325392 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc5t8\" (UniqueName: \"kubernetes.io/projected/a43e8c15-2110-4352-b8b1-1919544f46d9-kube-api-access-zc5t8\") pod \"auto-csr-approver-29566970-6hdtv\" (UID: \"a43e8c15-2110-4352-b8b1-1919544f46d9\") " pod="openshift-infra/auto-csr-approver-29566970-6hdtv" Mar 20 14:50:00 crc kubenswrapper[4973]: I0320 14:50:00.487963 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566970-6hdtv" Mar 20 14:50:00 crc kubenswrapper[4973]: I0320 14:50:00.990712 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566970-6hdtv"] Mar 20 14:50:01 crc kubenswrapper[4973]: I0320 14:50:01.686998 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-5mfl6_12942d85-7c63-4b80-8df3-81e0941c91eb/prometheus-operator/0.log" Mar 20 14:50:01 crc kubenswrapper[4973]: I0320 14:50:01.827433 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566970-6hdtv" event={"ID":"a43e8c15-2110-4352-b8b1-1919544f46d9","Type":"ContainerStarted","Data":"214a85f63d8aea94bbe5ea0a87e26c89c53736133464fd108d83bdd804f83d97"} Mar 20 14:50:01 crc kubenswrapper[4973]: I0320 14:50:01.946950 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-655746896-lsnk6_8b74a291-42fd-4819-aab4-957acbce8ec7/prometheus-operator-admission-webhook/0.log" Mar 20 14:50:02 crc kubenswrapper[4973]: I0320 14:50:02.030742 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-655746896-qjslc_645dfcb4-2ecb-4f12-96a6-dc97944672fb/prometheus-operator-admission-webhook/0.log" Mar 20 14:50:02 crc kubenswrapper[4973]: I0320 14:50:02.155836 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-nhgtl_0e53f263-96c0-4390-b28e-ca37e867101b/operator/0.log" Mar 20 14:50:02 crc kubenswrapper[4973]: I0320 14:50:02.274704 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7f87b9b85b-l6hmv_863783c6-0106-43c7-b097-35c4f30db388/observability-ui-dashboards/0.log" Mar 20 14:50:02 crc kubenswrapper[4973]: I0320 14:50:02.381732 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-9b89954cc-wfdgp_a19fcda0-339c-4f0e-9f54-5a2f76c934c5/perses-operator/0.log" Mar 20 14:50:03 crc kubenswrapper[4973]: I0320 14:50:03.865924 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566970-6hdtv" event={"ID":"a43e8c15-2110-4352-b8b1-1919544f46d9","Type":"ContainerStarted","Data":"ec2c1078736b8bf082f871c8d7cc0d6c31ff68b09ce60c4cfd64a252b474ff62"} Mar 20 14:50:03 crc kubenswrapper[4973]: I0320 14:50:03.887711 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566970-6hdtv" podStartSLOduration=2.185967934 podStartE2EDuration="3.887690408s" podCreationTimestamp="2026-03-20 14:50:00 +0000 UTC" firstStartedPulling="2026-03-20 14:50:00.995556463 +0000 UTC m=+5321.739226207" lastFinishedPulling="2026-03-20 14:50:02.697278937 +0000 UTC m=+5323.440948681" observedRunningTime="2026-03-20 14:50:03.881909159 +0000 UTC m=+5324.625578903" watchObservedRunningTime="2026-03-20 14:50:03.887690408 +0000 UTC m=+5324.631360152" Mar 20 14:50:04 crc kubenswrapper[4973]: I0320 14:50:04.877491 4973 generic.go:334] "Generic (PLEG): container 
finished" podID="a43e8c15-2110-4352-b8b1-1919544f46d9" containerID="ec2c1078736b8bf082f871c8d7cc0d6c31ff68b09ce60c4cfd64a252b474ff62" exitCode=0 Mar 20 14:50:04 crc kubenswrapper[4973]: I0320 14:50:04.877557 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566970-6hdtv" event={"ID":"a43e8c15-2110-4352-b8b1-1919544f46d9","Type":"ContainerDied","Data":"ec2c1078736b8bf082f871c8d7cc0d6c31ff68b09ce60c4cfd64a252b474ff62"} Mar 20 14:50:06 crc kubenswrapper[4973]: I0320 14:50:06.909701 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566970-6hdtv" event={"ID":"a43e8c15-2110-4352-b8b1-1919544f46d9","Type":"ContainerDied","Data":"214a85f63d8aea94bbe5ea0a87e26c89c53736133464fd108d83bdd804f83d97"} Mar 20 14:50:06 crc kubenswrapper[4973]: I0320 14:50:06.910050 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="214a85f63d8aea94bbe5ea0a87e26c89c53736133464fd108d83bdd804f83d97" Mar 20 14:50:06 crc kubenswrapper[4973]: I0320 14:50:06.924528 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566970-6hdtv" Mar 20 14:50:07 crc kubenswrapper[4973]: I0320 14:50:07.001592 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc5t8\" (UniqueName: \"kubernetes.io/projected/a43e8c15-2110-4352-b8b1-1919544f46d9-kube-api-access-zc5t8\") pod \"a43e8c15-2110-4352-b8b1-1919544f46d9\" (UID: \"a43e8c15-2110-4352-b8b1-1919544f46d9\") " Mar 20 14:50:07 crc kubenswrapper[4973]: I0320 14:50:07.015609 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a43e8c15-2110-4352-b8b1-1919544f46d9-kube-api-access-zc5t8" (OuterVolumeSpecName: "kube-api-access-zc5t8") pod "a43e8c15-2110-4352-b8b1-1919544f46d9" (UID: "a43e8c15-2110-4352-b8b1-1919544f46d9"). InnerVolumeSpecName "kube-api-access-zc5t8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:50:07 crc kubenswrapper[4973]: I0320 14:50:07.105173 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc5t8\" (UniqueName: \"kubernetes.io/projected/a43e8c15-2110-4352-b8b1-1919544f46d9-kube-api-access-zc5t8\") on node \"crc\" DevicePath \"\"" Mar 20 14:50:07 crc kubenswrapper[4973]: I0320 14:50:07.919519 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566970-6hdtv" Mar 20 14:50:07 crc kubenswrapper[4973]: I0320 14:50:07.950916 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:50:07 crc kubenswrapper[4973]: E0320 14:50:07.951454 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:50:08 crc kubenswrapper[4973]: I0320 14:50:08.023034 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566964-hpzvv"] Mar 20 14:50:08 crc kubenswrapper[4973]: I0320 14:50:08.035310 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566964-hpzvv"] Mar 20 14:50:09 crc kubenswrapper[4973]: I0320 14:50:09.964085 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6" path="/var/lib/kubelet/pods/a9dc265a-1fa0-4161-9a5c-0bdeea9ee7e6/volumes" Mar 20 14:50:19 crc kubenswrapper[4973]: I0320 14:50:19.823106 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_cluster-logging-operator-66689c4bbf-9zjgt_f1930ce4-8a2b-4ef3-bf3b-eb748e20f255/cluster-logging-operator/0.log" Mar 20 14:50:20 crc kubenswrapper[4973]: I0320 14:50:20.082781 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_collector-45jbl_03b9e8f1-4452-4677-991f-64048197690b/collector/0.log" Mar 20 14:50:20 crc kubenswrapper[4973]: I0320 14:50:20.162823 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-compactor-0_fef9a921-6e35-4927-87b4-2741b40a3ab8/loki-compactor/0.log" Mar 20 14:50:20 crc kubenswrapper[4973]: I0320 14:50:20.313097 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-distributor-9c6b6d984-b5mdp_870a7fc7-0aac-45df-857e-dba72c60f80a/loki-distributor/0.log" Mar 20 14:50:20 crc kubenswrapper[4973]: I0320 14:50:20.382306 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-fbc7bc644-ltv8r_9ff158ae-7281-4d5c-95cf-ff14e136c414/gateway/0.log" Mar 20 14:50:20 crc kubenswrapper[4973]: I0320 14:50:20.418380 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-fbc7bc644-ltv8r_9ff158ae-7281-4d5c-95cf-ff14e136c414/opa/0.log" Mar 20 14:50:20 crc kubenswrapper[4973]: I0320 14:50:20.559108 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-fbc7bc644-sw7l9_d4e8002e-56ed-40a4-a768-9fd6a44d891c/gateway/0.log" Mar 20 14:50:20 crc kubenswrapper[4973]: I0320 14:50:20.575636 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-gateway-fbc7bc644-sw7l9_d4e8002e-56ed-40a4-a768-9fd6a44d891c/opa/0.log" Mar 20 14:50:20 crc kubenswrapper[4973]: I0320 14:50:20.756627 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-logging_logging-loki-index-gateway-0_28abf398-d023-4625-b8f5-42db7c452df8/loki-index-gateway/0.log" Mar 20 14:50:20 crc kubenswrapper[4973]: I0320 14:50:20.852289 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-ingester-0_96545de2-ed06-4e4e-9102-37e56cbd6cdb/loki-ingester/0.log" Mar 20 14:50:20 crc kubenswrapper[4973]: I0320 14:50:20.978550 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-querier-6dcbdf8bb8-zjf9g_727aa0fc-f2ea-4183-a168-24918669937b/loki-querier/0.log" Mar 20 14:50:21 crc kubenswrapper[4973]: I0320 14:50:21.070241 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-logging_logging-loki-query-frontend-ff66c4dc9-ssp8t_d08579c3-9bb3-4b06-a613-0b81a2d7fb44/loki-query-frontend/0.log" Mar 20 14:50:21 crc kubenswrapper[4973]: I0320 14:50:21.951054 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:50:21 crc kubenswrapper[4973]: E0320 14:50:21.952154 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:50:27 crc kubenswrapper[4973]: I0320 14:50:27.165522 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5wddj"] Mar 20 14:50:27 crc kubenswrapper[4973]: E0320 14:50:27.166533 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a43e8c15-2110-4352-b8b1-1919544f46d9" containerName="oc" Mar 20 14:50:27 crc kubenswrapper[4973]: I0320 14:50:27.166546 4973 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a43e8c15-2110-4352-b8b1-1919544f46d9" containerName="oc" Mar 20 14:50:27 crc kubenswrapper[4973]: I0320 14:50:27.166753 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="a43e8c15-2110-4352-b8b1-1919544f46d9" containerName="oc" Mar 20 14:50:27 crc kubenswrapper[4973]: I0320 14:50:27.168986 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wddj" Mar 20 14:50:27 crc kubenswrapper[4973]: I0320 14:50:27.181000 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wddj"] Mar 20 14:50:27 crc kubenswrapper[4973]: I0320 14:50:27.303781 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3559b4-6191-4dd6-9e30-5af24d25770d-utilities\") pod \"redhat-marketplace-5wddj\" (UID: \"ac3559b4-6191-4dd6-9e30-5af24d25770d\") " pod="openshift-marketplace/redhat-marketplace-5wddj" Mar 20 14:50:27 crc kubenswrapper[4973]: I0320 14:50:27.304452 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xndvg\" (UniqueName: \"kubernetes.io/projected/ac3559b4-6191-4dd6-9e30-5af24d25770d-kube-api-access-xndvg\") pod \"redhat-marketplace-5wddj\" (UID: \"ac3559b4-6191-4dd6-9e30-5af24d25770d\") " pod="openshift-marketplace/redhat-marketplace-5wddj" Mar 20 14:50:27 crc kubenswrapper[4973]: I0320 14:50:27.304536 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3559b4-6191-4dd6-9e30-5af24d25770d-catalog-content\") pod \"redhat-marketplace-5wddj\" (UID: \"ac3559b4-6191-4dd6-9e30-5af24d25770d\") " pod="openshift-marketplace/redhat-marketplace-5wddj" Mar 20 14:50:27 crc kubenswrapper[4973]: I0320 14:50:27.406842 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-xndvg\" (UniqueName: \"kubernetes.io/projected/ac3559b4-6191-4dd6-9e30-5af24d25770d-kube-api-access-xndvg\") pod \"redhat-marketplace-5wddj\" (UID: \"ac3559b4-6191-4dd6-9e30-5af24d25770d\") " pod="openshift-marketplace/redhat-marketplace-5wddj" Mar 20 14:50:27 crc kubenswrapper[4973]: I0320 14:50:27.406925 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3559b4-6191-4dd6-9e30-5af24d25770d-catalog-content\") pod \"redhat-marketplace-5wddj\" (UID: \"ac3559b4-6191-4dd6-9e30-5af24d25770d\") " pod="openshift-marketplace/redhat-marketplace-5wddj" Mar 20 14:50:27 crc kubenswrapper[4973]: I0320 14:50:27.407006 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3559b4-6191-4dd6-9e30-5af24d25770d-utilities\") pod \"redhat-marketplace-5wddj\" (UID: \"ac3559b4-6191-4dd6-9e30-5af24d25770d\") " pod="openshift-marketplace/redhat-marketplace-5wddj" Mar 20 14:50:27 crc kubenswrapper[4973]: I0320 14:50:27.407645 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3559b4-6191-4dd6-9e30-5af24d25770d-catalog-content\") pod \"redhat-marketplace-5wddj\" (UID: \"ac3559b4-6191-4dd6-9e30-5af24d25770d\") " pod="openshift-marketplace/redhat-marketplace-5wddj" Mar 20 14:50:27 crc kubenswrapper[4973]: I0320 14:50:27.407670 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3559b4-6191-4dd6-9e30-5af24d25770d-utilities\") pod \"redhat-marketplace-5wddj\" (UID: \"ac3559b4-6191-4dd6-9e30-5af24d25770d\") " pod="openshift-marketplace/redhat-marketplace-5wddj" Mar 20 14:50:27 crc kubenswrapper[4973]: I0320 14:50:27.428650 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xndvg\" (UniqueName: 
\"kubernetes.io/projected/ac3559b4-6191-4dd6-9e30-5af24d25770d-kube-api-access-xndvg\") pod \"redhat-marketplace-5wddj\" (UID: \"ac3559b4-6191-4dd6-9e30-5af24d25770d\") " pod="openshift-marketplace/redhat-marketplace-5wddj" Mar 20 14:50:27 crc kubenswrapper[4973]: I0320 14:50:27.491182 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wddj" Mar 20 14:50:28 crc kubenswrapper[4973]: I0320 14:50:28.607499 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wddj"] Mar 20 14:50:29 crc kubenswrapper[4973]: I0320 14:50:29.297229 4973 generic.go:334] "Generic (PLEG): container finished" podID="ac3559b4-6191-4dd6-9e30-5af24d25770d" containerID="00dcaa8219c88dea1318590f8d5d207b2be590468504162932c5989fa99befd9" exitCode=0 Mar 20 14:50:29 crc kubenswrapper[4973]: I0320 14:50:29.297302 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wddj" event={"ID":"ac3559b4-6191-4dd6-9e30-5af24d25770d","Type":"ContainerDied","Data":"00dcaa8219c88dea1318590f8d5d207b2be590468504162932c5989fa99befd9"} Mar 20 14:50:29 crc kubenswrapper[4973]: I0320 14:50:29.297581 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wddj" event={"ID":"ac3559b4-6191-4dd6-9e30-5af24d25770d","Type":"ContainerStarted","Data":"fdc393402585fd628d489795633a5dd869fc3c16717b7ca7bcddb58a02000be8"} Mar 20 14:50:31 crc kubenswrapper[4973]: I0320 14:50:31.320191 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wddj" event={"ID":"ac3559b4-6191-4dd6-9e30-5af24d25770d","Type":"ContainerStarted","Data":"5d4126af38f9a1ec4b566bf4c53efbeea2830e67bad0ef312a87b2b5f5d30759"} Mar 20 14:50:32 crc kubenswrapper[4973]: I0320 14:50:32.333659 4973 generic.go:334] "Generic (PLEG): container finished" podID="ac3559b4-6191-4dd6-9e30-5af24d25770d" 
containerID="5d4126af38f9a1ec4b566bf4c53efbeea2830e67bad0ef312a87b2b5f5d30759" exitCode=0 Mar 20 14:50:32 crc kubenswrapper[4973]: I0320 14:50:32.333726 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wddj" event={"ID":"ac3559b4-6191-4dd6-9e30-5af24d25770d","Type":"ContainerDied","Data":"5d4126af38f9a1ec4b566bf4c53efbeea2830e67bad0ef312a87b2b5f5d30759"} Mar 20 14:50:36 crc kubenswrapper[4973]: I0320 14:50:36.390809 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wddj" event={"ID":"ac3559b4-6191-4dd6-9e30-5af24d25770d","Type":"ContainerStarted","Data":"3252ef8840f0a8cd21a1c5c8903aa68886371463af1459d5af5346076a6473e5"} Mar 20 14:50:36 crc kubenswrapper[4973]: I0320 14:50:36.951489 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:50:36 crc kubenswrapper[4973]: E0320 14:50:36.952021 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:50:37 crc kubenswrapper[4973]: I0320 14:50:37.418359 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5wddj" podStartSLOduration=3.610572578 podStartE2EDuration="10.418327457s" podCreationTimestamp="2026-03-20 14:50:27 +0000 UTC" firstStartedPulling="2026-03-20 14:50:29.299204757 +0000 UTC m=+5350.042874501" lastFinishedPulling="2026-03-20 14:50:36.106959636 +0000 UTC m=+5356.850629380" observedRunningTime="2026-03-20 14:50:37.416089606 +0000 UTC m=+5358.159759360" watchObservedRunningTime="2026-03-20 
14:50:37.418327457 +0000 UTC m=+5358.161997201" Mar 20 14:50:37 crc kubenswrapper[4973]: I0320 14:50:37.491311 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5wddj" Mar 20 14:50:37 crc kubenswrapper[4973]: I0320 14:50:37.491369 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5wddj" Mar 20 14:50:37 crc kubenswrapper[4973]: I0320 14:50:37.788916 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-k2z22_549c9d4d-d8f5-441e-9e2b-faa4bc5bd589/controller/0.log" Mar 20 14:50:37 crc kubenswrapper[4973]: I0320 14:50:37.795789 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-k2z22_549c9d4d-d8f5-441e-9e2b-faa4bc5bd589/kube-rbac-proxy/0.log" Mar 20 14:50:37 crc kubenswrapper[4973]: I0320 14:50:37.843132 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgmjd_75fc4720-ae9c-4ae5-8e4c-7c9a800f5478/cp-frr-files/0.log" Mar 20 14:50:38 crc kubenswrapper[4973]: I0320 14:50:38.097688 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgmjd_75fc4720-ae9c-4ae5-8e4c-7c9a800f5478/cp-frr-files/0.log" Mar 20 14:50:38 crc kubenswrapper[4973]: I0320 14:50:38.108174 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgmjd_75fc4720-ae9c-4ae5-8e4c-7c9a800f5478/cp-reloader/0.log" Mar 20 14:50:38 crc kubenswrapper[4973]: I0320 14:50:38.116811 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgmjd_75fc4720-ae9c-4ae5-8e4c-7c9a800f5478/cp-metrics/0.log" Mar 20 14:50:38 crc kubenswrapper[4973]: I0320 14:50:38.169566 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgmjd_75fc4720-ae9c-4ae5-8e4c-7c9a800f5478/cp-reloader/0.log" Mar 20 14:50:38 crc kubenswrapper[4973]: I0320 
14:50:38.384799 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgmjd_75fc4720-ae9c-4ae5-8e4c-7c9a800f5478/cp-frr-files/0.log" Mar 20 14:50:38 crc kubenswrapper[4973]: I0320 14:50:38.429324 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgmjd_75fc4720-ae9c-4ae5-8e4c-7c9a800f5478/cp-metrics/0.log" Mar 20 14:50:38 crc kubenswrapper[4973]: I0320 14:50:38.482911 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgmjd_75fc4720-ae9c-4ae5-8e4c-7c9a800f5478/cp-reloader/0.log" Mar 20 14:50:38 crc kubenswrapper[4973]: I0320 14:50:38.493928 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgmjd_75fc4720-ae9c-4ae5-8e4c-7c9a800f5478/cp-metrics/0.log" Mar 20 14:50:38 crc kubenswrapper[4973]: I0320 14:50:38.559525 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-5wddj" podUID="ac3559b4-6191-4dd6-9e30-5af24d25770d" containerName="registry-server" probeResult="failure" output=< Mar 20 14:50:38 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:50:38 crc kubenswrapper[4973]: > Mar 20 14:50:38 crc kubenswrapper[4973]: I0320 14:50:38.827467 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgmjd_75fc4720-ae9c-4ae5-8e4c-7c9a800f5478/cp-reloader/0.log" Mar 20 14:50:38 crc kubenswrapper[4973]: I0320 14:50:38.855036 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgmjd_75fc4720-ae9c-4ae5-8e4c-7c9a800f5478/cp-frr-files/0.log" Mar 20 14:50:38 crc kubenswrapper[4973]: I0320 14:50:38.911103 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgmjd_75fc4720-ae9c-4ae5-8e4c-7c9a800f5478/controller/0.log" Mar 20 14:50:38 crc kubenswrapper[4973]: I0320 14:50:38.914393 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-cgmjd_75fc4720-ae9c-4ae5-8e4c-7c9a800f5478/cp-metrics/0.log" Mar 20 14:50:39 crc kubenswrapper[4973]: I0320 14:50:39.762752 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgmjd_75fc4720-ae9c-4ae5-8e4c-7c9a800f5478/frr-metrics/0.log" Mar 20 14:50:39 crc kubenswrapper[4973]: I0320 14:50:39.769986 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgmjd_75fc4720-ae9c-4ae5-8e4c-7c9a800f5478/kube-rbac-proxy/0.log" Mar 20 14:50:39 crc kubenswrapper[4973]: I0320 14:50:39.922364 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgmjd_75fc4720-ae9c-4ae5-8e4c-7c9a800f5478/frr/1.log" Mar 20 14:50:40 crc kubenswrapper[4973]: I0320 14:50:40.088865 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgmjd_75fc4720-ae9c-4ae5-8e4c-7c9a800f5478/kube-rbac-proxy-frr/0.log" Mar 20 14:50:40 crc kubenswrapper[4973]: I0320 14:50:40.206710 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgmjd_75fc4720-ae9c-4ae5-8e4c-7c9a800f5478/reloader/0.log" Mar 20 14:50:40 crc kubenswrapper[4973]: I0320 14:50:40.390477 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-rjhq2_1221336e-652c-45b4-bd66-43e96cf2c643/frr-k8s-webhook-server/0.log" Mar 20 14:50:40 crc kubenswrapper[4973]: I0320 14:50:40.651965 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7f9db6bfb5-b8w47_487335bd-36f4-42e2-87e1-5acef7226919/manager/0.log" Mar 20 14:50:40 crc kubenswrapper[4973]: I0320 14:50:40.747475 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-68c6dd9858-4mw5r_d74cf88e-0824-45f2-92ff-3798ad77f943/webhook-server/1.log" Mar 20 14:50:40 crc kubenswrapper[4973]: I0320 14:50:40.910399 4973 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-68c6dd9858-4mw5r_d74cf88e-0824-45f2-92ff-3798ad77f943/webhook-server/0.log" Mar 20 14:50:41 crc kubenswrapper[4973]: I0320 14:50:41.231380 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5npj7_2d4aa91f-29e1-4129-b67e-493c83865a51/kube-rbac-proxy/0.log" Mar 20 14:50:42 crc kubenswrapper[4973]: I0320 14:50:42.571054 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-cgmjd_75fc4720-ae9c-4ae5-8e4c-7c9a800f5478/frr/0.log" Mar 20 14:50:43 crc kubenswrapper[4973]: I0320 14:50:43.220101 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-5npj7_2d4aa91f-29e1-4129-b67e-493c83865a51/speaker/0.log" Mar 20 14:50:48 crc kubenswrapper[4973]: I0320 14:50:48.547629 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-5wddj" podUID="ac3559b4-6191-4dd6-9e30-5af24d25770d" containerName="registry-server" probeResult="failure" output=< Mar 20 14:50:48 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s Mar 20 14:50:48 crc kubenswrapper[4973]: > Mar 20 14:50:48 crc kubenswrapper[4973]: I0320 14:50:48.951022 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:50:48 crc kubenswrapper[4973]: E0320 14:50:48.951660 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:50:57 crc kubenswrapper[4973]: I0320 14:50:57.405939 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q_dfd8af36-fd3a-4466-b375-585baab50b83/util/0.log" Mar 20 14:50:57 crc kubenswrapper[4973]: I0320 14:50:57.547967 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5wddj" Mar 20 14:50:57 crc kubenswrapper[4973]: I0320 14:50:57.624648 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5wddj" Mar 20 14:50:57 crc kubenswrapper[4973]: I0320 14:50:57.703974 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q_dfd8af36-fd3a-4466-b375-585baab50b83/pull/0.log" Mar 20 14:50:57 crc kubenswrapper[4973]: I0320 14:50:57.707916 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q_dfd8af36-fd3a-4466-b375-585baab50b83/pull/0.log" Mar 20 14:50:57 crc kubenswrapper[4973]: I0320 14:50:57.742415 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q_dfd8af36-fd3a-4466-b375-585baab50b83/util/0.log" Mar 20 14:50:58 crc kubenswrapper[4973]: I0320 14:50:58.015153 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q_dfd8af36-fd3a-4466-b375-585baab50b83/pull/0.log" Mar 20 14:50:58 crc kubenswrapper[4973]: I0320 14:50:58.067650 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q_dfd8af36-fd3a-4466-b375-585baab50b83/util/0.log" Mar 20 14:50:58 crc kubenswrapper[4973]: I0320 14:50:58.148786 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde8748wm7q_dfd8af36-fd3a-4466-b375-585baab50b83/extract/0.log" Mar 20 14:50:58 crc kubenswrapper[4973]: I0320 14:50:58.254094 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9_19b59a52-d780-491f-ab38-270ea519cddc/util/0.log" Mar 20 14:50:58 crc kubenswrapper[4973]: I0320 14:50:58.368258 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wddj"] Mar 20 14:50:58 crc kubenswrapper[4973]: I0320 14:50:58.448642 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9_19b59a52-d780-491f-ab38-270ea519cddc/pull/0.log" Mar 20 14:50:58 crc kubenswrapper[4973]: I0320 14:50:58.464154 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9_19b59a52-d780-491f-ab38-270ea519cddc/util/0.log" Mar 20 14:50:58 crc kubenswrapper[4973]: I0320 14:50:58.512645 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9_19b59a52-d780-491f-ab38-270ea519cddc/pull/0.log" Mar 20 14:50:58 crc kubenswrapper[4973]: I0320 14:50:58.628045 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5wddj" podUID="ac3559b4-6191-4dd6-9e30-5af24d25770d" containerName="registry-server" containerID="cri-o://3252ef8840f0a8cd21a1c5c8903aa68886371463af1459d5af5346076a6473e5" gracePeriod=2 Mar 20 14:50:58 crc kubenswrapper[4973]: I0320 14:50:58.784115 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9_19b59a52-d780-491f-ab38-270ea519cddc/util/0.log" Mar 20 14:50:58 crc kubenswrapper[4973]: I0320 14:50:58.785205 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9_19b59a52-d780-491f-ab38-270ea519cddc/extract/0.log" Mar 20 14:50:58 crc kubenswrapper[4973]: I0320 14:50:58.828651 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1xffl9_19b59a52-d780-491f-ab38-270ea519cddc/pull/0.log" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.065570 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc_4e0294f1-0f7c-47f2-b81d-9e71231f19aa/util/0.log" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.259871 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wddj" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.382040 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xndvg\" (UniqueName: \"kubernetes.io/projected/ac3559b4-6191-4dd6-9e30-5af24d25770d-kube-api-access-xndvg\") pod \"ac3559b4-6191-4dd6-9e30-5af24d25770d\" (UID: \"ac3559b4-6191-4dd6-9e30-5af24d25770d\") " Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.382586 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3559b4-6191-4dd6-9e30-5af24d25770d-utilities\") pod \"ac3559b4-6191-4dd6-9e30-5af24d25770d\" (UID: \"ac3559b4-6191-4dd6-9e30-5af24d25770d\") " Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.382620 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3559b4-6191-4dd6-9e30-5af24d25770d-catalog-content\") pod \"ac3559b4-6191-4dd6-9e30-5af24d25770d\" (UID: \"ac3559b4-6191-4dd6-9e30-5af24d25770d\") " Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.385378 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac3559b4-6191-4dd6-9e30-5af24d25770d-utilities" (OuterVolumeSpecName: "utilities") pod "ac3559b4-6191-4dd6-9e30-5af24d25770d" (UID: "ac3559b4-6191-4dd6-9e30-5af24d25770d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.389079 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac3559b4-6191-4dd6-9e30-5af24d25770d-kube-api-access-xndvg" (OuterVolumeSpecName: "kube-api-access-xndvg") pod "ac3559b4-6191-4dd6-9e30-5af24d25770d" (UID: "ac3559b4-6191-4dd6-9e30-5af24d25770d"). InnerVolumeSpecName "kube-api-access-xndvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.392209 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc_4e0294f1-0f7c-47f2-b81d-9e71231f19aa/util/0.log" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.412827 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc_4e0294f1-0f7c-47f2-b81d-9e71231f19aa/pull/0.log" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.417216 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac3559b4-6191-4dd6-9e30-5af24d25770d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac3559b4-6191-4dd6-9e30-5af24d25770d" (UID: "ac3559b4-6191-4dd6-9e30-5af24d25770d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.446884 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc_4e0294f1-0f7c-47f2-b81d-9e71231f19aa/pull/0.log" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.486280 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac3559b4-6191-4dd6-9e30-5af24d25770d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.486319 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac3559b4-6191-4dd6-9e30-5af24d25770d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.486331 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xndvg\" (UniqueName: 
\"kubernetes.io/projected/ac3559b4-6191-4dd6-9e30-5af24d25770d-kube-api-access-xndvg\") on node \"crc\" DevicePath \"\"" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.619441 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc_4e0294f1-0f7c-47f2-b81d-9e71231f19aa/util/0.log" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.642922 4973 generic.go:334] "Generic (PLEG): container finished" podID="ac3559b4-6191-4dd6-9e30-5af24d25770d" containerID="3252ef8840f0a8cd21a1c5c8903aa68886371463af1459d5af5346076a6473e5" exitCode=0 Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.642974 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wddj" event={"ID":"ac3559b4-6191-4dd6-9e30-5af24d25770d","Type":"ContainerDied","Data":"3252ef8840f0a8cd21a1c5c8903aa68886371463af1459d5af5346076a6473e5"} Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.643006 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5wddj" event={"ID":"ac3559b4-6191-4dd6-9e30-5af24d25770d","Type":"ContainerDied","Data":"fdc393402585fd628d489795633a5dd869fc3c16717b7ca7bcddb58a02000be8"} Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.643013 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5wddj" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.643029 4973 scope.go:117] "RemoveContainer" containerID="3252ef8840f0a8cd21a1c5c8903aa68886371463af1459d5af5346076a6473e5" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.647458 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc_4e0294f1-0f7c-47f2-b81d-9e71231f19aa/pull/0.log" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.666402 4973 scope.go:117] "RemoveContainer" containerID="5d4126af38f9a1ec4b566bf4c53efbeea2830e67bad0ef312a87b2b5f5d30759" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.687268 4973 scope.go:117] "RemoveContainer" containerID="00dcaa8219c88dea1318590f8d5d207b2be590468504162932c5989fa99befd9" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.690113 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wddj"] Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.708362 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5wddj"] Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.710227 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_3d9a37d2dd18988fcb5ca5f4f6b82950da05d40c4031e61bc3bfef57d5w9fkc_4e0294f1-0f7c-47f2-b81d-9e71231f19aa/extract/0.log" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.756400 4973 scope.go:117] "RemoveContainer" containerID="3252ef8840f0a8cd21a1c5c8903aa68886371463af1459d5af5346076a6473e5" Mar 20 14:50:59 crc kubenswrapper[4973]: E0320 14:50:59.756886 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3252ef8840f0a8cd21a1c5c8903aa68886371463af1459d5af5346076a6473e5\": container with ID starting with 
3252ef8840f0a8cd21a1c5c8903aa68886371463af1459d5af5346076a6473e5 not found: ID does not exist" containerID="3252ef8840f0a8cd21a1c5c8903aa68886371463af1459d5af5346076a6473e5" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.756934 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3252ef8840f0a8cd21a1c5c8903aa68886371463af1459d5af5346076a6473e5"} err="failed to get container status \"3252ef8840f0a8cd21a1c5c8903aa68886371463af1459d5af5346076a6473e5\": rpc error: code = NotFound desc = could not find container \"3252ef8840f0a8cd21a1c5c8903aa68886371463af1459d5af5346076a6473e5\": container with ID starting with 3252ef8840f0a8cd21a1c5c8903aa68886371463af1459d5af5346076a6473e5 not found: ID does not exist" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.756965 4973 scope.go:117] "RemoveContainer" containerID="5d4126af38f9a1ec4b566bf4c53efbeea2830e67bad0ef312a87b2b5f5d30759" Mar 20 14:50:59 crc kubenswrapper[4973]: E0320 14:50:59.757356 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d4126af38f9a1ec4b566bf4c53efbeea2830e67bad0ef312a87b2b5f5d30759\": container with ID starting with 5d4126af38f9a1ec4b566bf4c53efbeea2830e67bad0ef312a87b2b5f5d30759 not found: ID does not exist" containerID="5d4126af38f9a1ec4b566bf4c53efbeea2830e67bad0ef312a87b2b5f5d30759" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.757425 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d4126af38f9a1ec4b566bf4c53efbeea2830e67bad0ef312a87b2b5f5d30759"} err="failed to get container status \"5d4126af38f9a1ec4b566bf4c53efbeea2830e67bad0ef312a87b2b5f5d30759\": rpc error: code = NotFound desc = could not find container \"5d4126af38f9a1ec4b566bf4c53efbeea2830e67bad0ef312a87b2b5f5d30759\": container with ID starting with 5d4126af38f9a1ec4b566bf4c53efbeea2830e67bad0ef312a87b2b5f5d30759 not found: ID does not 
exist" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.757469 4973 scope.go:117] "RemoveContainer" containerID="00dcaa8219c88dea1318590f8d5d207b2be590468504162932c5989fa99befd9" Mar 20 14:50:59 crc kubenswrapper[4973]: E0320 14:50:59.757848 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00dcaa8219c88dea1318590f8d5d207b2be590468504162932c5989fa99befd9\": container with ID starting with 00dcaa8219c88dea1318590f8d5d207b2be590468504162932c5989fa99befd9 not found: ID does not exist" containerID="00dcaa8219c88dea1318590f8d5d207b2be590468504162932c5989fa99befd9" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.757882 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00dcaa8219c88dea1318590f8d5d207b2be590468504162932c5989fa99befd9"} err="failed to get container status \"00dcaa8219c88dea1318590f8d5d207b2be590468504162932c5989fa99befd9\": rpc error: code = NotFound desc = could not find container \"00dcaa8219c88dea1318590f8d5d207b2be590468504162932c5989fa99befd9\": container with ID starting with 00dcaa8219c88dea1318590f8d5d207b2be590468504162932c5989fa99befd9 not found: ID does not exist" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.862443 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f_08fa9eb6-01e9-4352-88bf-0302af5811b7/util/0.log" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.966437 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:50:59 crc kubenswrapper[4973]: E0320 14:50:59.966707 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:50:59 crc kubenswrapper[4973]: I0320 14:50:59.970307 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac3559b4-6191-4dd6-9e30-5af24d25770d" path="/var/lib/kubelet/pods/ac3559b4-6191-4dd6-9e30-5af24d25770d/volumes" Mar 20 14:51:00 crc kubenswrapper[4973]: I0320 14:51:00.495942 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f_08fa9eb6-01e9-4352-88bf-0302af5811b7/util/0.log" Mar 20 14:51:00 crc kubenswrapper[4973]: I0320 14:51:00.576523 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f_08fa9eb6-01e9-4352-88bf-0302af5811b7/pull/0.log" Mar 20 14:51:00 crc kubenswrapper[4973]: I0320 14:51:00.579852 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f_08fa9eb6-01e9-4352-88bf-0302af5811b7/pull/0.log" Mar 20 14:51:00 crc kubenswrapper[4973]: I0320 14:51:00.841640 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f_08fa9eb6-01e9-4352-88bf-0302af5811b7/util/0.log" Mar 20 14:51:00 crc kubenswrapper[4973]: I0320 14:51:00.908973 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f_08fa9eb6-01e9-4352-88bf-0302af5811b7/extract/0.log" Mar 20 14:51:00 crc kubenswrapper[4973]: I0320 14:51:00.922774 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_4be416c5f2f0b2736478b7cfc76f1b991abd25af724ba21bdbdad2dd6chl24f_08fa9eb6-01e9-4352-88bf-0302af5811b7/pull/0.log" Mar 20 14:51:01 crc kubenswrapper[4973]: I0320 14:51:01.062325 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7_0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7/util/0.log" Mar 20 14:51:01 crc kubenswrapper[4973]: I0320 14:51:01.383038 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7_0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7/pull/0.log" Mar 20 14:51:01 crc kubenswrapper[4973]: I0320 14:51:01.440571 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7_0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7/pull/0.log" Mar 20 14:51:01 crc kubenswrapper[4973]: I0320 14:51:01.483003 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7_0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7/util/0.log" Mar 20 14:51:01 crc kubenswrapper[4973]: I0320 14:51:01.865490 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7_0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7/util/0.log" Mar 20 14:51:01 crc kubenswrapper[4973]: I0320 14:51:01.891081 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7_0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7/pull/0.log" Mar 20 14:51:01 crc kubenswrapper[4973]: I0320 14:51:01.898969 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_93d662022be5376a0ed3676a120a68427f47e4653a19a985adf9239726v9qr7_0196446a-e2f3-4f0a-8fc7-c8fa2390a8e7/extract/0.log" Mar 20 
14:51:02 crc kubenswrapper[4973]: I0320 14:51:02.485256 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rmwkt_dbdea6e4-12dd-42d4-8ffe-37b38cbccadc/extract-utilities/0.log" Mar 20 14:51:02 crc kubenswrapper[4973]: I0320 14:51:02.817026 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rmwkt_dbdea6e4-12dd-42d4-8ffe-37b38cbccadc/extract-content/0.log" Mar 20 14:51:02 crc kubenswrapper[4973]: I0320 14:51:02.824236 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rmwkt_dbdea6e4-12dd-42d4-8ffe-37b38cbccadc/extract-content/0.log" Mar 20 14:51:02 crc kubenswrapper[4973]: I0320 14:51:02.851165 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rmwkt_dbdea6e4-12dd-42d4-8ffe-37b38cbccadc/extract-utilities/0.log" Mar 20 14:51:03 crc kubenswrapper[4973]: I0320 14:51:03.030674 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rmwkt_dbdea6e4-12dd-42d4-8ffe-37b38cbccadc/extract-content/0.log" Mar 20 14:51:03 crc kubenswrapper[4973]: I0320 14:51:03.057688 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rmwkt_dbdea6e4-12dd-42d4-8ffe-37b38cbccadc/extract-utilities/0.log" Mar 20 14:51:03 crc kubenswrapper[4973]: I0320 14:51:03.173244 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fpwkr_57731a76-c496-43ff-afea-a5685864a2f3/extract-utilities/0.log" Mar 20 14:51:03 crc kubenswrapper[4973]: I0320 14:51:03.443884 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fpwkr_57731a76-c496-43ff-afea-a5685864a2f3/extract-content/0.log" Mar 20 14:51:03 crc kubenswrapper[4973]: I0320 14:51:03.473464 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-fpwkr_57731a76-c496-43ff-afea-a5685864a2f3/extract-content/0.log" Mar 20 14:51:03 crc kubenswrapper[4973]: I0320 14:51:03.483210 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fpwkr_57731a76-c496-43ff-afea-a5685864a2f3/extract-utilities/0.log" Mar 20 14:51:03 crc kubenswrapper[4973]: I0320 14:51:03.761663 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fpwkr_57731a76-c496-43ff-afea-a5685864a2f3/extract-utilities/0.log" Mar 20 14:51:03 crc kubenswrapper[4973]: I0320 14:51:03.773637 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fpwkr_57731a76-c496-43ff-afea-a5685864a2f3/extract-content/0.log" Mar 20 14:51:03 crc kubenswrapper[4973]: I0320 14:51:03.898560 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-rmwkt_dbdea6e4-12dd-42d4-8ffe-37b38cbccadc/registry-server/0.log" Mar 20 14:51:04 crc kubenswrapper[4973]: I0320 14:51:04.202052 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fpwkr_57731a76-c496-43ff-afea-a5685864a2f3/registry-server/1.log" Mar 20 14:51:04 crc kubenswrapper[4973]: I0320 14:51:04.643027 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-kfft6_2843ad35-cfc0-4922-8b96-cebb15694c99/marketplace-operator/0.log" Mar 20 14:51:04 crc kubenswrapper[4973]: I0320 14:51:04.758362 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-fpwkr_57731a76-c496-43ff-afea-a5685864a2f3/registry-server/0.log" Mar 20 14:51:05 crc kubenswrapper[4973]: I0320 14:51:05.043982 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-6hwhw_31ec1f01-54bd-417e-867e-91484e966352/extract-utilities/0.log" Mar 20 14:51:05 crc kubenswrapper[4973]: I0320 14:51:05.227874 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6hwhw_31ec1f01-54bd-417e-867e-91484e966352/extract-utilities/0.log" Mar 20 14:51:05 crc kubenswrapper[4973]: I0320 14:51:05.244950 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6hwhw_31ec1f01-54bd-417e-867e-91484e966352/extract-content/0.log" Mar 20 14:51:05 crc kubenswrapper[4973]: I0320 14:51:05.258430 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6hwhw_31ec1f01-54bd-417e-867e-91484e966352/extract-content/0.log" Mar 20 14:51:05 crc kubenswrapper[4973]: I0320 14:51:05.448519 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6hwhw_31ec1f01-54bd-417e-867e-91484e966352/extract-content/0.log" Mar 20 14:51:05 crc kubenswrapper[4973]: I0320 14:51:05.448571 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6hwhw_31ec1f01-54bd-417e-867e-91484e966352/extract-utilities/0.log" Mar 20 14:51:05 crc kubenswrapper[4973]: I0320 14:51:05.660149 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-6hwhw_31ec1f01-54bd-417e-867e-91484e966352/registry-server/0.log" Mar 20 14:51:05 crc kubenswrapper[4973]: I0320 14:51:05.809113 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h5hz4_b450ebc6-3181-4e0d-b546-b10ac89e0481/extract-utilities/0.log" Mar 20 14:51:05 crc kubenswrapper[4973]: I0320 14:51:05.978479 4973 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-h5hz4_b450ebc6-3181-4e0d-b546-b10ac89e0481/extract-content/0.log" Mar 20 14:51:05 crc kubenswrapper[4973]: I0320 14:51:05.985722 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h5hz4_b450ebc6-3181-4e0d-b546-b10ac89e0481/extract-content/0.log" Mar 20 14:51:06 crc kubenswrapper[4973]: I0320 14:51:06.002959 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h5hz4_b450ebc6-3181-4e0d-b546-b10ac89e0481/extract-utilities/0.log" Mar 20 14:51:06 crc kubenswrapper[4973]: I0320 14:51:06.136945 4973 scope.go:117] "RemoveContainer" containerID="0b784b3d6db72c4fd3d08921f5c105ae53129255e422784907eaab1b9f5125a1" Mar 20 14:51:06 crc kubenswrapper[4973]: I0320 14:51:06.189873 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h5hz4_b450ebc6-3181-4e0d-b546-b10ac89e0481/extract-utilities/0.log" Mar 20 14:51:06 crc kubenswrapper[4973]: I0320 14:51:06.316853 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h5hz4_b450ebc6-3181-4e0d-b546-b10ac89e0481/extract-content/0.log" Mar 20 14:51:06 crc kubenswrapper[4973]: I0320 14:51:06.321232 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h5hz4_b450ebc6-3181-4e0d-b546-b10ac89e0481/registry-server/1.log" Mar 20 14:51:07 crc kubenswrapper[4973]: I0320 14:51:07.096371 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-h5hz4_b450ebc6-3181-4e0d-b546-b10ac89e0481/registry-server/0.log" Mar 20 14:51:12 crc kubenswrapper[4973]: I0320 14:51:12.951832 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:51:12 crc kubenswrapper[4973]: E0320 14:51:12.953134 4973 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:51:22 crc kubenswrapper[4973]: I0320 14:51:22.924882 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-655746896-lsnk6_8b74a291-42fd-4819-aab4-957acbce8ec7/prometheus-operator-admission-webhook/0.log" Mar 20 14:51:22 crc kubenswrapper[4973]: I0320 14:51:22.944017 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-8ff7d675-5mfl6_12942d85-7c63-4b80-8df3-81e0941c91eb/prometheus-operator/0.log" Mar 20 14:51:22 crc kubenswrapper[4973]: I0320 14:51:22.997414 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-655746896-qjslc_645dfcb4-2ecb-4f12-96a6-dc97944672fb/prometheus-operator-admission-webhook/0.log" Mar 20 14:51:23 crc kubenswrapper[4973]: I0320 14:51:23.237092 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-6dd7dd855f-nhgtl_0e53f263-96c0-4390-b28e-ca37e867101b/operator/0.log" Mar 20 14:51:23 crc kubenswrapper[4973]: I0320 14:51:23.271065 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-ui-dashboards-7f87b9b85b-l6hmv_863783c6-0106-43c7-b097-35c4f30db388/observability-ui-dashboards/0.log" Mar 20 14:51:23 crc kubenswrapper[4973]: I0320 14:51:23.285575 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-9b89954cc-wfdgp_a19fcda0-339c-4f0e-9f54-5a2f76c934c5/perses-operator/0.log" Mar 20 14:51:27 crc kubenswrapper[4973]: I0320 
14:51:27.950617 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:51:27 crc kubenswrapper[4973]: E0320 14:51:27.951686 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:51:39 crc kubenswrapper[4973]: I0320 14:51:39.108911 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6996757d8d-46qmw_3bf2a551-4944-4096-99f4-03effa26dde8/kube-rbac-proxy/0.log" Mar 20 14:51:39 crc kubenswrapper[4973]: I0320 14:51:39.230998 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators-redhat_loki-operator-controller-manager-6996757d8d-46qmw_3bf2a551-4944-4096-99f4-03effa26dde8/manager/0.log" Mar 20 14:51:40 crc kubenswrapper[4973]: I0320 14:51:40.952379 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:51:40 crc kubenswrapper[4973]: E0320 14:51:40.953476 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:51:51 crc kubenswrapper[4973]: I0320 14:51:51.951689 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 
20 14:51:52 crc kubenswrapper[4973]: I0320 14:51:52.284312 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"b37569a1f4c446b4098a5730225f25c6ac53e44f17daca0b84cf263f39f0c7c7"} Mar 20 14:51:54 crc kubenswrapper[4973]: I0320 14:51:54.738484 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-44n2b"] Mar 20 14:51:54 crc kubenswrapper[4973]: E0320 14:51:54.739608 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3559b4-6191-4dd6-9e30-5af24d25770d" containerName="extract-content" Mar 20 14:51:54 crc kubenswrapper[4973]: I0320 14:51:54.739627 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3559b4-6191-4dd6-9e30-5af24d25770d" containerName="extract-content" Mar 20 14:51:54 crc kubenswrapper[4973]: E0320 14:51:54.739670 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3559b4-6191-4dd6-9e30-5af24d25770d" containerName="extract-utilities" Mar 20 14:51:54 crc kubenswrapper[4973]: I0320 14:51:54.739679 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3559b4-6191-4dd6-9e30-5af24d25770d" containerName="extract-utilities" Mar 20 14:51:54 crc kubenswrapper[4973]: E0320 14:51:54.739692 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3559b4-6191-4dd6-9e30-5af24d25770d" containerName="registry-server" Mar 20 14:51:54 crc kubenswrapper[4973]: I0320 14:51:54.739699 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3559b4-6191-4dd6-9e30-5af24d25770d" containerName="registry-server" Mar 20 14:51:54 crc kubenswrapper[4973]: I0320 14:51:54.740000 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3559b4-6191-4dd6-9e30-5af24d25770d" containerName="registry-server" Mar 20 14:51:54 crc kubenswrapper[4973]: I0320 14:51:54.742125 4973 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-44n2b" Mar 20 14:51:54 crc kubenswrapper[4973]: I0320 14:51:54.750430 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-44n2b"] Mar 20 14:51:54 crc kubenswrapper[4973]: I0320 14:51:54.891191 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6851b36-e93d-4c63-8160-974bc5f10324-utilities\") pod \"redhat-operators-44n2b\" (UID: \"e6851b36-e93d-4c63-8160-974bc5f10324\") " pod="openshift-marketplace/redhat-operators-44n2b" Mar 20 14:51:54 crc kubenswrapper[4973]: I0320 14:51:54.891285 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6851b36-e93d-4c63-8160-974bc5f10324-catalog-content\") pod \"redhat-operators-44n2b\" (UID: \"e6851b36-e93d-4c63-8160-974bc5f10324\") " pod="openshift-marketplace/redhat-operators-44n2b" Mar 20 14:51:54 crc kubenswrapper[4973]: I0320 14:51:54.891322 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps7ll\" (UniqueName: \"kubernetes.io/projected/e6851b36-e93d-4c63-8160-974bc5f10324-kube-api-access-ps7ll\") pod \"redhat-operators-44n2b\" (UID: \"e6851b36-e93d-4c63-8160-974bc5f10324\") " pod="openshift-marketplace/redhat-operators-44n2b" Mar 20 14:51:54 crc kubenswrapper[4973]: I0320 14:51:54.994673 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6851b36-e93d-4c63-8160-974bc5f10324-utilities\") pod \"redhat-operators-44n2b\" (UID: \"e6851b36-e93d-4c63-8160-974bc5f10324\") " pod="openshift-marketplace/redhat-operators-44n2b" Mar 20 14:51:54 crc kubenswrapper[4973]: I0320 14:51:54.994771 4973 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6851b36-e93d-4c63-8160-974bc5f10324-catalog-content\") pod \"redhat-operators-44n2b\" (UID: \"e6851b36-e93d-4c63-8160-974bc5f10324\") " pod="openshift-marketplace/redhat-operators-44n2b" Mar 20 14:51:54 crc kubenswrapper[4973]: I0320 14:51:54.994804 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps7ll\" (UniqueName: \"kubernetes.io/projected/e6851b36-e93d-4c63-8160-974bc5f10324-kube-api-access-ps7ll\") pod \"redhat-operators-44n2b\" (UID: \"e6851b36-e93d-4c63-8160-974bc5f10324\") " pod="openshift-marketplace/redhat-operators-44n2b" Mar 20 14:51:54 crc kubenswrapper[4973]: I0320 14:51:54.995603 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6851b36-e93d-4c63-8160-974bc5f10324-utilities\") pod \"redhat-operators-44n2b\" (UID: \"e6851b36-e93d-4c63-8160-974bc5f10324\") " pod="openshift-marketplace/redhat-operators-44n2b" Mar 20 14:51:54 crc kubenswrapper[4973]: I0320 14:51:54.995650 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6851b36-e93d-4c63-8160-974bc5f10324-catalog-content\") pod \"redhat-operators-44n2b\" (UID: \"e6851b36-e93d-4c63-8160-974bc5f10324\") " pod="openshift-marketplace/redhat-operators-44n2b" Mar 20 14:51:55 crc kubenswrapper[4973]: I0320 14:51:55.038010 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps7ll\" (UniqueName: \"kubernetes.io/projected/e6851b36-e93d-4c63-8160-974bc5f10324-kube-api-access-ps7ll\") pod \"redhat-operators-44n2b\" (UID: \"e6851b36-e93d-4c63-8160-974bc5f10324\") " pod="openshift-marketplace/redhat-operators-44n2b" Mar 20 14:51:55 crc kubenswrapper[4973]: I0320 14:51:55.077782 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-44n2b" Mar 20 14:51:56 crc kubenswrapper[4973]: I0320 14:51:56.074381 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-44n2b"] Mar 20 14:51:56 crc kubenswrapper[4973]: I0320 14:51:56.333903 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44n2b" event={"ID":"e6851b36-e93d-4c63-8160-974bc5f10324","Type":"ContainerStarted","Data":"ad92ba5a2211df21d11f21e11a48b49090bfacd4d6dcd43bce3a63a1a438dfd0"} Mar 20 14:51:57 crc kubenswrapper[4973]: I0320 14:51:57.346219 4973 generic.go:334] "Generic (PLEG): container finished" podID="e6851b36-e93d-4c63-8160-974bc5f10324" containerID="85ba3fe20e1638936785287f1d4f5a5eace9be4101740cc2955d161f090c9266" exitCode=0 Mar 20 14:51:57 crc kubenswrapper[4973]: I0320 14:51:57.346416 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44n2b" event={"ID":"e6851b36-e93d-4c63-8160-974bc5f10324","Type":"ContainerDied","Data":"85ba3fe20e1638936785287f1d4f5a5eace9be4101740cc2955d161f090c9266"} Mar 20 14:51:59 crc kubenswrapper[4973]: I0320 14:51:59.404540 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44n2b" event={"ID":"e6851b36-e93d-4c63-8160-974bc5f10324","Type":"ContainerStarted","Data":"f0745af6347a8909146032ba3012e4df4ef95b637bf4d1d80f4168aa566c4b76"} Mar 20 14:52:00 crc kubenswrapper[4973]: I0320 14:52:00.196900 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566972-rv4dc"] Mar 20 14:52:00 crc kubenswrapper[4973]: I0320 14:52:00.199583 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566972-rv4dc" Mar 20 14:52:00 crc kubenswrapper[4973]: I0320 14:52:00.208620 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:52:00 crc kubenswrapper[4973]: I0320 14:52:00.208826 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:52:00 crc kubenswrapper[4973]: I0320 14:52:00.217097 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566972-rv4dc"] Mar 20 14:52:00 crc kubenswrapper[4973]: I0320 14:52:00.222656 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:52:00 crc kubenswrapper[4973]: I0320 14:52:00.279722 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnlpq\" (UniqueName: \"kubernetes.io/projected/2a64a74b-6c75-4ce5-909f-938ab3a9737a-kube-api-access-mnlpq\") pod \"auto-csr-approver-29566972-rv4dc\" (UID: \"2a64a74b-6c75-4ce5-909f-938ab3a9737a\") " pod="openshift-infra/auto-csr-approver-29566972-rv4dc" Mar 20 14:52:00 crc kubenswrapper[4973]: I0320 14:52:00.383271 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnlpq\" (UniqueName: \"kubernetes.io/projected/2a64a74b-6c75-4ce5-909f-938ab3a9737a-kube-api-access-mnlpq\") pod \"auto-csr-approver-29566972-rv4dc\" (UID: \"2a64a74b-6c75-4ce5-909f-938ab3a9737a\") " pod="openshift-infra/auto-csr-approver-29566972-rv4dc" Mar 20 14:52:00 crc kubenswrapper[4973]: I0320 14:52:00.432492 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnlpq\" (UniqueName: \"kubernetes.io/projected/2a64a74b-6c75-4ce5-909f-938ab3a9737a-kube-api-access-mnlpq\") pod \"auto-csr-approver-29566972-rv4dc\" (UID: \"2a64a74b-6c75-4ce5-909f-938ab3a9737a\") " 
pod="openshift-infra/auto-csr-approver-29566972-rv4dc"
Mar 20 14:52:00 crc kubenswrapper[4973]: I0320 14:52:00.539082 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566972-rv4dc"
Mar 20 14:52:01 crc kubenswrapper[4973]: I0320 14:52:01.565812 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566972-rv4dc"]
Mar 20 14:52:01 crc kubenswrapper[4973]: W0320 14:52:01.584918 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2a64a74b_6c75_4ce5_909f_938ab3a9737a.slice/crio-ea6b63b21588ac56a633f7ee1f5ae97fbc486f0de8e26f80fe0d284428aa3805 WatchSource:0}: Error finding container ea6b63b21588ac56a633f7ee1f5ae97fbc486f0de8e26f80fe0d284428aa3805: Status 404 returned error can't find the container with id ea6b63b21588ac56a633f7ee1f5ae97fbc486f0de8e26f80fe0d284428aa3805
Mar 20 14:52:02 crc kubenswrapper[4973]: I0320 14:52:02.447744 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566972-rv4dc" event={"ID":"2a64a74b-6c75-4ce5-909f-938ab3a9737a","Type":"ContainerStarted","Data":"ea6b63b21588ac56a633f7ee1f5ae97fbc486f0de8e26f80fe0d284428aa3805"}
Mar 20 14:52:05 crc kubenswrapper[4973]: I0320 14:52:05.518943 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566972-rv4dc" event={"ID":"2a64a74b-6c75-4ce5-909f-938ab3a9737a","Type":"ContainerStarted","Data":"b40936497e71824e7559c6c913763aeea917309a4f83e8418b3ee128ec803c47"}
Mar 20 14:52:05 crc kubenswrapper[4973]: I0320 14:52:05.541948 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566972-rv4dc" podStartSLOduration=3.743871564 podStartE2EDuration="5.539769026s" podCreationTimestamp="2026-03-20 14:52:00 +0000 UTC" firstStartedPulling="2026-03-20 14:52:01.587077246 +0000 UTC m=+5442.330746990" lastFinishedPulling="2026-03-20 14:52:03.382974708 +0000 UTC m=+5444.126644452" observedRunningTime="2026-03-20 14:52:05.534473041 +0000 UTC m=+5446.278142815" watchObservedRunningTime="2026-03-20 14:52:05.539769026 +0000 UTC m=+5446.283438770"
Mar 20 14:52:06 crc kubenswrapper[4973]: I0320 14:52:06.537987 4973 generic.go:334] "Generic (PLEG): container finished" podID="e6851b36-e93d-4c63-8160-974bc5f10324" containerID="f0745af6347a8909146032ba3012e4df4ef95b637bf4d1d80f4168aa566c4b76" exitCode=0
Mar 20 14:52:06 crc kubenswrapper[4973]: I0320 14:52:06.538079 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44n2b" event={"ID":"e6851b36-e93d-4c63-8160-974bc5f10324","Type":"ContainerDied","Data":"f0745af6347a8909146032ba3012e4df4ef95b637bf4d1d80f4168aa566c4b76"}
Mar 20 14:52:07 crc kubenswrapper[4973]: I0320 14:52:07.552076 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44n2b" event={"ID":"e6851b36-e93d-4c63-8160-974bc5f10324","Type":"ContainerStarted","Data":"8721d0286e173ed5d7770b4fcacfcd00dae786c1272bb7afbd49b2de41cce06e"}
Mar 20 14:52:07 crc kubenswrapper[4973]: I0320 14:52:07.578787 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-44n2b" podStartSLOduration=3.855696682 podStartE2EDuration="13.57876699s" podCreationTimestamp="2026-03-20 14:51:54 +0000 UTC" firstStartedPulling="2026-03-20 14:51:57.349650663 +0000 UTC m=+5438.093320407" lastFinishedPulling="2026-03-20 14:52:07.072720981 +0000 UTC m=+5447.816390715" observedRunningTime="2026-03-20 14:52:07.571316867 +0000 UTC m=+5448.314986611" watchObservedRunningTime="2026-03-20 14:52:07.57876699 +0000 UTC m=+5448.322436734"
Mar 20 14:52:08 crc kubenswrapper[4973]: I0320 14:52:08.588745 4973 generic.go:334] "Generic (PLEG): container finished" podID="2a64a74b-6c75-4ce5-909f-938ab3a9737a" containerID="b40936497e71824e7559c6c913763aeea917309a4f83e8418b3ee128ec803c47" exitCode=0
Mar 20 14:52:08 crc kubenswrapper[4973]: I0320 14:52:08.589060 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566972-rv4dc" event={"ID":"2a64a74b-6c75-4ce5-909f-938ab3a9737a","Type":"ContainerDied","Data":"b40936497e71824e7559c6c913763aeea917309a4f83e8418b3ee128ec803c47"}
Mar 20 14:52:10 crc kubenswrapper[4973]: I0320 14:52:10.285004 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566972-rv4dc"
Mar 20 14:52:10 crc kubenswrapper[4973]: I0320 14:52:10.412668 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnlpq\" (UniqueName: \"kubernetes.io/projected/2a64a74b-6c75-4ce5-909f-938ab3a9737a-kube-api-access-mnlpq\") pod \"2a64a74b-6c75-4ce5-909f-938ab3a9737a\" (UID: \"2a64a74b-6c75-4ce5-909f-938ab3a9737a\") "
Mar 20 14:52:10 crc kubenswrapper[4973]: I0320 14:52:10.432643 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a64a74b-6c75-4ce5-909f-938ab3a9737a-kube-api-access-mnlpq" (OuterVolumeSpecName: "kube-api-access-mnlpq") pod "2a64a74b-6c75-4ce5-909f-938ab3a9737a" (UID: "2a64a74b-6c75-4ce5-909f-938ab3a9737a"). InnerVolumeSpecName "kube-api-access-mnlpq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:52:10 crc kubenswrapper[4973]: I0320 14:52:10.516099 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnlpq\" (UniqueName: \"kubernetes.io/projected/2a64a74b-6c75-4ce5-909f-938ab3a9737a-kube-api-access-mnlpq\") on node \"crc\" DevicePath \"\""
Mar 20 14:52:10 crc kubenswrapper[4973]: I0320 14:52:10.621608 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566972-rv4dc" event={"ID":"2a64a74b-6c75-4ce5-909f-938ab3a9737a","Type":"ContainerDied","Data":"ea6b63b21588ac56a633f7ee1f5ae97fbc486f0de8e26f80fe0d284428aa3805"}
Mar 20 14:52:10 crc kubenswrapper[4973]: I0320 14:52:10.622692 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea6b63b21588ac56a633f7ee1f5ae97fbc486f0de8e26f80fe0d284428aa3805"
Mar 20 14:52:10 crc kubenswrapper[4973]: I0320 14:52:10.622770 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566972-rv4dc"
Mar 20 14:52:10 crc kubenswrapper[4973]: I0320 14:52:10.832424 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566966-s9z2v"]
Mar 20 14:52:10 crc kubenswrapper[4973]: I0320 14:52:10.859958 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566966-s9z2v"]
Mar 20 14:52:11 crc kubenswrapper[4973]: I0320 14:52:11.965029 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="288970b0-4121-4ffc-8fdf-92911f5ab464" path="/var/lib/kubelet/pods/288970b0-4121-4ffc-8fdf-92911f5ab464/volumes"
Mar 20 14:52:15 crc kubenswrapper[4973]: I0320 14:52:15.078797 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-44n2b"
Mar 20 14:52:15 crc kubenswrapper[4973]: I0320 14:52:15.080438 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-44n2b"
Mar 20 14:52:16 crc kubenswrapper[4973]: I0320 14:52:16.677794 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-44n2b" podUID="e6851b36-e93d-4c63-8160-974bc5f10324" containerName="registry-server" probeResult="failure" output=<
Mar 20 14:52:16 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s
Mar 20 14:52:16 crc kubenswrapper[4973]: >
Mar 20 14:52:26 crc kubenswrapper[4973]: I0320 14:52:26.131494 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-44n2b" podUID="e6851b36-e93d-4c63-8160-974bc5f10324" containerName="registry-server" probeResult="failure" output=<
Mar 20 14:52:26 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s
Mar 20 14:52:26 crc kubenswrapper[4973]: >
Mar 20 14:52:36 crc kubenswrapper[4973]: I0320 14:52:36.129215 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-44n2b" podUID="e6851b36-e93d-4c63-8160-974bc5f10324" containerName="registry-server" probeResult="failure" output=<
Mar 20 14:52:36 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s
Mar 20 14:52:36 crc kubenswrapper[4973]: >
Mar 20 14:52:46 crc kubenswrapper[4973]: I0320 14:52:46.440570 4973 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-44n2b" podUID="e6851b36-e93d-4c63-8160-974bc5f10324" containerName="registry-server" probeResult="failure" output=<
Mar 20 14:52:46 crc kubenswrapper[4973]: timeout: failed to connect service ":50051" within 1s
Mar 20 14:52:46 crc kubenswrapper[4973]: >
Mar 20 14:52:55 crc kubenswrapper[4973]: I0320 14:52:55.143696 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-44n2b"
Mar 20 14:52:55 crc kubenswrapper[4973]: I0320 14:52:55.205911 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-44n2b"
Mar 20 14:52:55 crc kubenswrapper[4973]: I0320 14:52:55.944252 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-44n2b"]
Mar 20 14:52:57 crc kubenswrapper[4973]: I0320 14:52:57.148897 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-44n2b" podUID="e6851b36-e93d-4c63-8160-974bc5f10324" containerName="registry-server" containerID="cri-o://8721d0286e173ed5d7770b4fcacfcd00dae786c1272bb7afbd49b2de41cce06e" gracePeriod=2
Mar 20 14:52:57 crc kubenswrapper[4973]: I0320 14:52:57.711660 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-44n2b"
Mar 20 14:52:57 crc kubenswrapper[4973]: I0320 14:52:57.782959 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps7ll\" (UniqueName: \"kubernetes.io/projected/e6851b36-e93d-4c63-8160-974bc5f10324-kube-api-access-ps7ll\") pod \"e6851b36-e93d-4c63-8160-974bc5f10324\" (UID: \"e6851b36-e93d-4c63-8160-974bc5f10324\") "
Mar 20 14:52:57 crc kubenswrapper[4973]: I0320 14:52:57.783113 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6851b36-e93d-4c63-8160-974bc5f10324-catalog-content\") pod \"e6851b36-e93d-4c63-8160-974bc5f10324\" (UID: \"e6851b36-e93d-4c63-8160-974bc5f10324\") "
Mar 20 14:52:57 crc kubenswrapper[4973]: I0320 14:52:57.783435 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6851b36-e93d-4c63-8160-974bc5f10324-utilities\") pod \"e6851b36-e93d-4c63-8160-974bc5f10324\" (UID: \"e6851b36-e93d-4c63-8160-974bc5f10324\") "
Mar 20 14:52:57 crc kubenswrapper[4973]: I0320 14:52:57.784501 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6851b36-e93d-4c63-8160-974bc5f10324-utilities" (OuterVolumeSpecName: "utilities") pod "e6851b36-e93d-4c63-8160-974bc5f10324" (UID: "e6851b36-e93d-4c63-8160-974bc5f10324"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:52:57 crc kubenswrapper[4973]: I0320 14:52:57.792567 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6851b36-e93d-4c63-8160-974bc5f10324-kube-api-access-ps7ll" (OuterVolumeSpecName: "kube-api-access-ps7ll") pod "e6851b36-e93d-4c63-8160-974bc5f10324" (UID: "e6851b36-e93d-4c63-8160-974bc5f10324"). InnerVolumeSpecName "kube-api-access-ps7ll". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:52:57 crc kubenswrapper[4973]: I0320 14:52:57.903685 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6851b36-e93d-4c63-8160-974bc5f10324-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 14:52:57 crc kubenswrapper[4973]: I0320 14:52:57.903730 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps7ll\" (UniqueName: \"kubernetes.io/projected/e6851b36-e93d-4c63-8160-974bc5f10324-kube-api-access-ps7ll\") on node \"crc\" DevicePath \"\""
Mar 20 14:52:58 crc kubenswrapper[4973]: I0320 14:52:58.003145 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6851b36-e93d-4c63-8160-974bc5f10324-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6851b36-e93d-4c63-8160-974bc5f10324" (UID: "e6851b36-e93d-4c63-8160-974bc5f10324"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:52:58 crc kubenswrapper[4973]: I0320 14:52:58.005916 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6851b36-e93d-4c63-8160-974bc5f10324-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 14:52:58 crc kubenswrapper[4973]: I0320 14:52:58.157456 4973 generic.go:334] "Generic (PLEG): container finished" podID="e6851b36-e93d-4c63-8160-974bc5f10324" containerID="8721d0286e173ed5d7770b4fcacfcd00dae786c1272bb7afbd49b2de41cce06e" exitCode=0
Mar 20 14:52:58 crc kubenswrapper[4973]: I0320 14:52:58.157501 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44n2b" event={"ID":"e6851b36-e93d-4c63-8160-974bc5f10324","Type":"ContainerDied","Data":"8721d0286e173ed5d7770b4fcacfcd00dae786c1272bb7afbd49b2de41cce06e"}
Mar 20 14:52:58 crc kubenswrapper[4973]: I0320 14:52:58.157527 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-44n2b" event={"ID":"e6851b36-e93d-4c63-8160-974bc5f10324","Type":"ContainerDied","Data":"ad92ba5a2211df21d11f21e11a48b49090bfacd4d6dcd43bce3a63a1a438dfd0"}
Mar 20 14:52:58 crc kubenswrapper[4973]: I0320 14:52:58.157540 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-44n2b"
Mar 20 14:52:58 crc kubenswrapper[4973]: I0320 14:52:58.157546 4973 scope.go:117] "RemoveContainer" containerID="8721d0286e173ed5d7770b4fcacfcd00dae786c1272bb7afbd49b2de41cce06e"
Mar 20 14:52:58 crc kubenswrapper[4973]: I0320 14:52:58.194696 4973 scope.go:117] "RemoveContainer" containerID="f0745af6347a8909146032ba3012e4df4ef95b637bf4d1d80f4168aa566c4b76"
Mar 20 14:52:58 crc kubenswrapper[4973]: I0320 14:52:58.199886 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-44n2b"]
Mar 20 14:52:58 crc kubenswrapper[4973]: I0320 14:52:58.213981 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-44n2b"]
Mar 20 14:52:58 crc kubenswrapper[4973]: I0320 14:52:58.223206 4973 scope.go:117] "RemoveContainer" containerID="85ba3fe20e1638936785287f1d4f5a5eace9be4101740cc2955d161f090c9266"
Mar 20 14:52:58 crc kubenswrapper[4973]: I0320 14:52:58.277097 4973 scope.go:117] "RemoveContainer" containerID="8721d0286e173ed5d7770b4fcacfcd00dae786c1272bb7afbd49b2de41cce06e"
Mar 20 14:52:58 crc kubenswrapper[4973]: E0320 14:52:58.281052 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8721d0286e173ed5d7770b4fcacfcd00dae786c1272bb7afbd49b2de41cce06e\": container with ID starting with 8721d0286e173ed5d7770b4fcacfcd00dae786c1272bb7afbd49b2de41cce06e not found: ID does not exist" containerID="8721d0286e173ed5d7770b4fcacfcd00dae786c1272bb7afbd49b2de41cce06e"
Mar 20 14:52:58 crc kubenswrapper[4973]: I0320 14:52:58.281113 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8721d0286e173ed5d7770b4fcacfcd00dae786c1272bb7afbd49b2de41cce06e"} err="failed to get container status \"8721d0286e173ed5d7770b4fcacfcd00dae786c1272bb7afbd49b2de41cce06e\": rpc error: code = NotFound desc = could not find container \"8721d0286e173ed5d7770b4fcacfcd00dae786c1272bb7afbd49b2de41cce06e\": container with ID starting with 8721d0286e173ed5d7770b4fcacfcd00dae786c1272bb7afbd49b2de41cce06e not found: ID does not exist"
Mar 20 14:52:58 crc kubenswrapper[4973]: I0320 14:52:58.281143 4973 scope.go:117] "RemoveContainer" containerID="f0745af6347a8909146032ba3012e4df4ef95b637bf4d1d80f4168aa566c4b76"
Mar 20 14:52:58 crc kubenswrapper[4973]: E0320 14:52:58.281546 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0745af6347a8909146032ba3012e4df4ef95b637bf4d1d80f4168aa566c4b76\": container with ID starting with f0745af6347a8909146032ba3012e4df4ef95b637bf4d1d80f4168aa566c4b76 not found: ID does not exist" containerID="f0745af6347a8909146032ba3012e4df4ef95b637bf4d1d80f4168aa566c4b76"
Mar 20 14:52:58 crc kubenswrapper[4973]: I0320 14:52:58.281590 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0745af6347a8909146032ba3012e4df4ef95b637bf4d1d80f4168aa566c4b76"} err="failed to get container status \"f0745af6347a8909146032ba3012e4df4ef95b637bf4d1d80f4168aa566c4b76\": rpc error: code = NotFound desc = could not find container \"f0745af6347a8909146032ba3012e4df4ef95b637bf4d1d80f4168aa566c4b76\": container with ID starting with f0745af6347a8909146032ba3012e4df4ef95b637bf4d1d80f4168aa566c4b76 not found: ID does not exist"
Mar 20 14:52:58 crc kubenswrapper[4973]: I0320 14:52:58.281618 4973 scope.go:117] "RemoveContainer" containerID="85ba3fe20e1638936785287f1d4f5a5eace9be4101740cc2955d161f090c9266"
Mar 20 14:52:58 crc kubenswrapper[4973]: E0320 14:52:58.281908 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85ba3fe20e1638936785287f1d4f5a5eace9be4101740cc2955d161f090c9266\": container with ID starting with 85ba3fe20e1638936785287f1d4f5a5eace9be4101740cc2955d161f090c9266 not found: ID does not exist" containerID="85ba3fe20e1638936785287f1d4f5a5eace9be4101740cc2955d161f090c9266"
Mar 20 14:52:58 crc kubenswrapper[4973]: I0320 14:52:58.281934 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85ba3fe20e1638936785287f1d4f5a5eace9be4101740cc2955d161f090c9266"} err="failed to get container status \"85ba3fe20e1638936785287f1d4f5a5eace9be4101740cc2955d161f090c9266\": rpc error: code = NotFound desc = could not find container \"85ba3fe20e1638936785287f1d4f5a5eace9be4101740cc2955d161f090c9266\": container with ID starting with 85ba3fe20e1638936785287f1d4f5a5eace9be4101740cc2955d161f090c9266 not found: ID does not exist"
Mar 20 14:52:59 crc kubenswrapper[4973]: I0320 14:52:59.969182 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6851b36-e93d-4c63-8160-974bc5f10324" path="/var/lib/kubelet/pods/e6851b36-e93d-4c63-8160-974bc5f10324/volumes"
Mar 20 14:53:06 crc kubenswrapper[4973]: I0320 14:53:06.398690 4973 scope.go:117] "RemoveContainer" containerID="b64916c7602d72df1b9ec2ba122ed3bfc3a8affe6cc0adeff8281ea96651aea4"
Mar 20 14:53:06 crc kubenswrapper[4973]: I0320 14:53:06.434909 4973 scope.go:117] "RemoveContainer" containerID="eb7ff137dfe90269893c295cde81c93cbb3355046a0d4bc25a9a8b804748b973"
Mar 20 14:53:06 crc kubenswrapper[4973]: I0320 14:53:06.519919 4973 scope.go:117] "RemoveContainer" containerID="6bc29c783723b650e1653b4808e9bfb56477fa2975915d1f5f42e3ebd5d05fc7"
Mar 20 14:53:06 crc kubenswrapper[4973]: I0320 14:53:06.570504 4973 scope.go:117] "RemoveContainer" containerID="39b578a38fe5c9207db1fbc7328d64f8780fa14e870bf24cd24a014926ecf52e"
Mar 20 14:53:41 crc kubenswrapper[4973]: I0320 14:53:41.864551 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sxwq4"]
Mar 20 14:53:41 crc kubenswrapper[4973]: E0320 14:53:41.865787 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6851b36-e93d-4c63-8160-974bc5f10324" containerName="registry-server"
Mar 20 14:53:41 crc kubenswrapper[4973]: I0320 14:53:41.865808 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6851b36-e93d-4c63-8160-974bc5f10324" containerName="registry-server"
Mar 20 14:53:41 crc kubenswrapper[4973]: E0320 14:53:41.865840 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a64a74b-6c75-4ce5-909f-938ab3a9737a" containerName="oc"
Mar 20 14:53:41 crc kubenswrapper[4973]: I0320 14:53:41.865848 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a64a74b-6c75-4ce5-909f-938ab3a9737a" containerName="oc"
Mar 20 14:53:41 crc kubenswrapper[4973]: E0320 14:53:41.865868 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6851b36-e93d-4c63-8160-974bc5f10324" containerName="extract-utilities"
Mar 20 14:53:41 crc kubenswrapper[4973]: I0320 14:53:41.865876 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6851b36-e93d-4c63-8160-974bc5f10324" containerName="extract-utilities"
Mar 20 14:53:41 crc kubenswrapper[4973]: E0320 14:53:41.865912 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6851b36-e93d-4c63-8160-974bc5f10324" containerName="extract-content"
Mar 20 14:53:41 crc kubenswrapper[4973]: I0320 14:53:41.865922 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6851b36-e93d-4c63-8160-974bc5f10324" containerName="extract-content"
Mar 20 14:53:41 crc kubenswrapper[4973]: I0320 14:53:41.866214 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6851b36-e93d-4c63-8160-974bc5f10324" containerName="registry-server"
Mar 20 14:53:41 crc kubenswrapper[4973]: I0320 14:53:41.866261 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a64a74b-6c75-4ce5-909f-938ab3a9737a" containerName="oc"
Mar 20 14:53:41 crc kubenswrapper[4973]: I0320 14:53:41.868440 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxwq4"
Mar 20 14:53:41 crc kubenswrapper[4973]: I0320 14:53:41.876595 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sxwq4"]
Mar 20 14:53:41 crc kubenswrapper[4973]: I0320 14:53:41.987379 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2428e895-d676-4ae1-850f-224b6fc5632b-utilities\") pod \"community-operators-sxwq4\" (UID: \"2428e895-d676-4ae1-850f-224b6fc5632b\") " pod="openshift-marketplace/community-operators-sxwq4"
Mar 20 14:53:41 crc kubenswrapper[4973]: I0320 14:53:41.988311 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2428e895-d676-4ae1-850f-224b6fc5632b-catalog-content\") pod \"community-operators-sxwq4\" (UID: \"2428e895-d676-4ae1-850f-224b6fc5632b\") " pod="openshift-marketplace/community-operators-sxwq4"
Mar 20 14:53:41 crc kubenswrapper[4973]: I0320 14:53:41.989855 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh856\" (UniqueName: \"kubernetes.io/projected/2428e895-d676-4ae1-850f-224b6fc5632b-kube-api-access-fh856\") pod \"community-operators-sxwq4\" (UID: \"2428e895-d676-4ae1-850f-224b6fc5632b\") " pod="openshift-marketplace/community-operators-sxwq4"
Mar 20 14:53:42 crc kubenswrapper[4973]: I0320 14:53:42.092221 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2428e895-d676-4ae1-850f-224b6fc5632b-utilities\") pod \"community-operators-sxwq4\" (UID: \"2428e895-d676-4ae1-850f-224b6fc5632b\") " pod="openshift-marketplace/community-operators-sxwq4"
Mar 20 14:53:42 crc kubenswrapper[4973]: I0320 14:53:42.092680 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2428e895-d676-4ae1-850f-224b6fc5632b-utilities\") pod \"community-operators-sxwq4\" (UID: \"2428e895-d676-4ae1-850f-224b6fc5632b\") " pod="openshift-marketplace/community-operators-sxwq4"
Mar 20 14:53:42 crc kubenswrapper[4973]: I0320 14:53:42.092789 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2428e895-d676-4ae1-850f-224b6fc5632b-catalog-content\") pod \"community-operators-sxwq4\" (UID: \"2428e895-d676-4ae1-850f-224b6fc5632b\") " pod="openshift-marketplace/community-operators-sxwq4"
Mar 20 14:53:42 crc kubenswrapper[4973]: I0320 14:53:42.093039 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2428e895-d676-4ae1-850f-224b6fc5632b-catalog-content\") pod \"community-operators-sxwq4\" (UID: \"2428e895-d676-4ae1-850f-224b6fc5632b\") " pod="openshift-marketplace/community-operators-sxwq4"
Mar 20 14:53:42 crc kubenswrapper[4973]: I0320 14:53:42.093224 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh856\" (UniqueName: \"kubernetes.io/projected/2428e895-d676-4ae1-850f-224b6fc5632b-kube-api-access-fh856\") pod \"community-operators-sxwq4\" (UID: \"2428e895-d676-4ae1-850f-224b6fc5632b\") " pod="openshift-marketplace/community-operators-sxwq4"
Mar 20 14:53:42 crc kubenswrapper[4973]: I0320 14:53:42.113420 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh856\" (UniqueName: \"kubernetes.io/projected/2428e895-d676-4ae1-850f-224b6fc5632b-kube-api-access-fh856\") pod \"community-operators-sxwq4\" (UID: \"2428e895-d676-4ae1-850f-224b6fc5632b\") " pod="openshift-marketplace/community-operators-sxwq4"
Mar 20 14:53:42 crc kubenswrapper[4973]: I0320 14:53:42.202956 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxwq4"
Mar 20 14:53:42 crc kubenswrapper[4973]: I0320 14:53:42.716655 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sxwq4"]
Mar 20 14:53:43 crc kubenswrapper[4973]: I0320 14:53:43.676003 4973 generic.go:334] "Generic (PLEG): container finished" podID="2428e895-d676-4ae1-850f-224b6fc5632b" containerID="ab7d4237b313f91c1869c9503d0474a6b5d0ad99900d67ebce9b4c71d0065e85" exitCode=0
Mar 20 14:53:43 crc kubenswrapper[4973]: I0320 14:53:43.676147 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxwq4" event={"ID":"2428e895-d676-4ae1-850f-224b6fc5632b","Type":"ContainerDied","Data":"ab7d4237b313f91c1869c9503d0474a6b5d0ad99900d67ebce9b4c71d0065e85"}
Mar 20 14:53:43 crc kubenswrapper[4973]: I0320 14:53:43.677803 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxwq4" event={"ID":"2428e895-d676-4ae1-850f-224b6fc5632b","Type":"ContainerStarted","Data":"a9724e17e93dbf6e5a0d28bc44f60eb22c9f575b1c2cd97a51608eab3074f94a"}
Mar 20 14:53:43 crc kubenswrapper[4973]: I0320 14:53:43.682178 4973 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 14:53:44 crc kubenswrapper[4973]: I0320 14:53:44.691237 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxwq4" event={"ID":"2428e895-d676-4ae1-850f-224b6fc5632b","Type":"ContainerStarted","Data":"3642b11840939d4270f26681d53b9c33612806493724759b8fc3a7023cda4b3e"}
Mar 20 14:53:47 crc kubenswrapper[4973]: I0320 14:53:47.724299 4973 generic.go:334] "Generic (PLEG): container finished" podID="2428e895-d676-4ae1-850f-224b6fc5632b" containerID="3642b11840939d4270f26681d53b9c33612806493724759b8fc3a7023cda4b3e" exitCode=0
Mar 20 14:53:47 crc kubenswrapper[4973]: I0320 14:53:47.724368 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxwq4" event={"ID":"2428e895-d676-4ae1-850f-224b6fc5632b","Type":"ContainerDied","Data":"3642b11840939d4270f26681d53b9c33612806493724759b8fc3a7023cda4b3e"}
Mar 20 14:53:48 crc kubenswrapper[4973]: I0320 14:53:48.737448 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxwq4" event={"ID":"2428e895-d676-4ae1-850f-224b6fc5632b","Type":"ContainerStarted","Data":"5423ac8164e35d5ca6680cb382bc2e5a78ef47dc396d622bc8f6ca39308c1b38"}
Mar 20 14:53:48 crc kubenswrapper[4973]: I0320 14:53:48.768330 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-sxwq4" podStartSLOduration=3.297409497 podStartE2EDuration="7.768292249s" podCreationTimestamp="2026-03-20 14:53:41 +0000 UTC" firstStartedPulling="2026-03-20 14:53:43.679599939 +0000 UTC m=+5544.423269683" lastFinishedPulling="2026-03-20 14:53:48.150482681 +0000 UTC m=+5548.894152435" observedRunningTime="2026-03-20 14:53:48.755724395 +0000 UTC m=+5549.499394159" watchObservedRunningTime="2026-03-20 14:53:48.768292249 +0000 UTC m=+5549.511961993"
Mar 20 14:53:52 crc kubenswrapper[4973]: I0320 14:53:52.203558 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-sxwq4"
Mar 20 14:53:52 crc kubenswrapper[4973]: I0320 14:53:52.204404 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-sxwq4"
Mar 20 14:53:52 crc kubenswrapper[4973]: I0320 14:53:52.271656 4973 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-sxwq4"
Mar 20 14:53:53 crc kubenswrapper[4973]: I0320 14:53:53.807108 4973 generic.go:334] "Generic (PLEG): container finished" podID="f5c0588a-6f96-4145-84f6-488007b3b05a" containerID="1142170c94b4fa736ea77dca698b5e88f957482bf3ea318150cad610c6fd18a5" exitCode=0
Mar 20 14:53:53 crc kubenswrapper[4973]: I0320 14:53:53.807227 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qxmg4/must-gather-b9cmm" event={"ID":"f5c0588a-6f96-4145-84f6-488007b3b05a","Type":"ContainerDied","Data":"1142170c94b4fa736ea77dca698b5e88f957482bf3ea318150cad610c6fd18a5"}
Mar 20 14:53:53 crc kubenswrapper[4973]: I0320 14:53:53.808211 4973 scope.go:117] "RemoveContainer" containerID="1142170c94b4fa736ea77dca698b5e88f957482bf3ea318150cad610c6fd18a5"
Mar 20 14:53:54 crc kubenswrapper[4973]: I0320 14:53:54.761490 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qxmg4_must-gather-b9cmm_f5c0588a-6f96-4145-84f6-488007b3b05a/gather/0.log"
Mar 20 14:54:00 crc kubenswrapper[4973]: I0320 14:54:00.146012 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566974-254sd"]
Mar 20 14:54:00 crc kubenswrapper[4973]: I0320 14:54:00.148729 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566974-254sd"
Mar 20 14:54:00 crc kubenswrapper[4973]: I0320 14:54:00.152791 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc"
Mar 20 14:54:00 crc kubenswrapper[4973]: I0320 14:54:00.153533 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 14:54:00 crc kubenswrapper[4973]: I0320 14:54:00.157373 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 14:54:00 crc kubenswrapper[4973]: I0320 14:54:00.167711 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566974-254sd"]
Mar 20 14:54:00 crc kubenswrapper[4973]: I0320 14:54:00.299767 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8przx\" (UniqueName: \"kubernetes.io/projected/ce4e6cce-8c25-4a59-b261-1c5a23f3456b-kube-api-access-8przx\") pod \"auto-csr-approver-29566974-254sd\" (UID: \"ce4e6cce-8c25-4a59-b261-1c5a23f3456b\") " pod="openshift-infra/auto-csr-approver-29566974-254sd"
Mar 20 14:54:00 crc kubenswrapper[4973]: I0320 14:54:00.402510 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8przx\" (UniqueName: \"kubernetes.io/projected/ce4e6cce-8c25-4a59-b261-1c5a23f3456b-kube-api-access-8przx\") pod \"auto-csr-approver-29566974-254sd\" (UID: \"ce4e6cce-8c25-4a59-b261-1c5a23f3456b\") " pod="openshift-infra/auto-csr-approver-29566974-254sd"
Mar 20 14:54:00 crc kubenswrapper[4973]: I0320 14:54:00.421922 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8przx\" (UniqueName: \"kubernetes.io/projected/ce4e6cce-8c25-4a59-b261-1c5a23f3456b-kube-api-access-8przx\") pod \"auto-csr-approver-29566974-254sd\" (UID: \"ce4e6cce-8c25-4a59-b261-1c5a23f3456b\") " pod="openshift-infra/auto-csr-approver-29566974-254sd"
Mar 20 14:54:00 crc kubenswrapper[4973]: I0320 14:54:00.497134 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566974-254sd"
Mar 20 14:54:01 crc kubenswrapper[4973]: I0320 14:54:01.040920 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566974-254sd"]
Mar 20 14:54:01 crc kubenswrapper[4973]: I0320 14:54:01.926431 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566974-254sd" event={"ID":"ce4e6cce-8c25-4a59-b261-1c5a23f3456b","Type":"ContainerStarted","Data":"a19e376f4b4e8df37ffbce6f45968b4f566c2739a6204944c56d25ec13d7e62a"}
Mar 20 14:54:02 crc kubenswrapper[4973]: I0320 14:54:02.252571 4973 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-sxwq4"
Mar 20 14:54:02 crc kubenswrapper[4973]: I0320 14:54:02.308541 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sxwq4"]
Mar 20 14:54:02 crc kubenswrapper[4973]: I0320 14:54:02.935398 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-sxwq4" podUID="2428e895-d676-4ae1-850f-224b6fc5632b" containerName="registry-server" containerID="cri-o://5423ac8164e35d5ca6680cb382bc2e5a78ef47dc396d622bc8f6ca39308c1b38" gracePeriod=2
Mar 20 14:54:03 crc kubenswrapper[4973]: I0320 14:54:03.527863 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sxwq4"
Mar 20 14:54:03 crc kubenswrapper[4973]: I0320 14:54:03.591073 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2428e895-d676-4ae1-850f-224b6fc5632b-catalog-content\") pod \"2428e895-d676-4ae1-850f-224b6fc5632b\" (UID: \"2428e895-d676-4ae1-850f-224b6fc5632b\") "
Mar 20 14:54:03 crc kubenswrapper[4973]: I0320 14:54:03.591282 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2428e895-d676-4ae1-850f-224b6fc5632b-utilities\") pod \"2428e895-d676-4ae1-850f-224b6fc5632b\" (UID: \"2428e895-d676-4ae1-850f-224b6fc5632b\") "
Mar 20 14:54:03 crc kubenswrapper[4973]: I0320 14:54:03.593426 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2428e895-d676-4ae1-850f-224b6fc5632b-utilities" (OuterVolumeSpecName: "utilities") pod "2428e895-d676-4ae1-850f-224b6fc5632b" (UID: "2428e895-d676-4ae1-850f-224b6fc5632b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:54:03 crc kubenswrapper[4973]: I0320 14:54:03.677744 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2428e895-d676-4ae1-850f-224b6fc5632b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2428e895-d676-4ae1-850f-224b6fc5632b" (UID: "2428e895-d676-4ae1-850f-224b6fc5632b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 14:54:03 crc kubenswrapper[4973]: I0320 14:54:03.693813 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh856\" (UniqueName: \"kubernetes.io/projected/2428e895-d676-4ae1-850f-224b6fc5632b-kube-api-access-fh856\") pod \"2428e895-d676-4ae1-850f-224b6fc5632b\" (UID: \"2428e895-d676-4ae1-850f-224b6fc5632b\") "
Mar 20 14:54:03 crc kubenswrapper[4973]: I0320 14:54:03.695379 4973 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2428e895-d676-4ae1-850f-224b6fc5632b-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 14:54:03 crc kubenswrapper[4973]: I0320 14:54:03.695470 4973 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2428e895-d676-4ae1-850f-224b6fc5632b-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 14:54:03 crc kubenswrapper[4973]: I0320 14:54:03.704393 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2428e895-d676-4ae1-850f-224b6fc5632b-kube-api-access-fh856" (OuterVolumeSpecName: "kube-api-access-fh856") pod "2428e895-d676-4ae1-850f-224b6fc5632b" (UID: "2428e895-d676-4ae1-850f-224b6fc5632b"). InnerVolumeSpecName "kube-api-access-fh856". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 14:54:03 crc kubenswrapper[4973]: I0320 14:54:03.796930 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh856\" (UniqueName: \"kubernetes.io/projected/2428e895-d676-4ae1-850f-224b6fc5632b-kube-api-access-fh856\") on node \"crc\" DevicePath \"\""
Mar 20 14:54:03 crc kubenswrapper[4973]: I0320 14:54:03.949394 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566974-254sd" event={"ID":"ce4e6cce-8c25-4a59-b261-1c5a23f3456b","Type":"ContainerStarted","Data":"59725975ee9c53003d1d5c0d00dd9391c95a8ab252482444200d23ae51fedeae"}
Mar 20 14:54:03 crc kubenswrapper[4973]: I0320 14:54:03.953412 4973 generic.go:334] "Generic (PLEG): container finished" podID="2428e895-d676-4ae1-850f-224b6fc5632b" containerID="5423ac8164e35d5ca6680cb382bc2e5a78ef47dc396d622bc8f6ca39308c1b38" exitCode=0
Mar 20 14:54:03 crc kubenswrapper[4973]: I0320 14:54:03.953513 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sxwq4" Mar 20 14:54:03 crc kubenswrapper[4973]: I0320 14:54:03.970270 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxwq4" event={"ID":"2428e895-d676-4ae1-850f-224b6fc5632b","Type":"ContainerDied","Data":"5423ac8164e35d5ca6680cb382bc2e5a78ef47dc396d622bc8f6ca39308c1b38"} Mar 20 14:54:03 crc kubenswrapper[4973]: I0320 14:54:03.970313 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sxwq4" event={"ID":"2428e895-d676-4ae1-850f-224b6fc5632b","Type":"ContainerDied","Data":"a9724e17e93dbf6e5a0d28bc44f60eb22c9f575b1c2cd97a51608eab3074f94a"} Mar 20 14:54:03 crc kubenswrapper[4973]: I0320 14:54:03.970351 4973 scope.go:117] "RemoveContainer" containerID="5423ac8164e35d5ca6680cb382bc2e5a78ef47dc396d622bc8f6ca39308c1b38" Mar 20 14:54:03 crc kubenswrapper[4973]: I0320 14:54:03.983694 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566974-254sd" podStartSLOduration=3.048895231 podStartE2EDuration="3.983663484s" podCreationTimestamp="2026-03-20 14:54:00 +0000 UTC" firstStartedPulling="2026-03-20 14:54:01.702468841 +0000 UTC m=+5562.446138585" lastFinishedPulling="2026-03-20 14:54:02.637237104 +0000 UTC m=+5563.380906838" observedRunningTime="2026-03-20 14:54:03.96486887 +0000 UTC m=+5564.708538614" watchObservedRunningTime="2026-03-20 14:54:03.983663484 +0000 UTC m=+5564.727333228" Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.002919 4973 scope.go:117] "RemoveContainer" containerID="3642b11840939d4270f26681d53b9c33612806493724759b8fc3a7023cda4b3e" Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.002934 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sxwq4"] Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.015095 4973 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/community-operators-sxwq4"] Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.046889 4973 scope.go:117] "RemoveContainer" containerID="ab7d4237b313f91c1869c9503d0474a6b5d0ad99900d67ebce9b4c71d0065e85" Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.089264 4973 scope.go:117] "RemoveContainer" containerID="5423ac8164e35d5ca6680cb382bc2e5a78ef47dc396d622bc8f6ca39308c1b38" Mar 20 14:54:04 crc kubenswrapper[4973]: E0320 14:54:04.090362 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5423ac8164e35d5ca6680cb382bc2e5a78ef47dc396d622bc8f6ca39308c1b38\": container with ID starting with 5423ac8164e35d5ca6680cb382bc2e5a78ef47dc396d622bc8f6ca39308c1b38 not found: ID does not exist" containerID="5423ac8164e35d5ca6680cb382bc2e5a78ef47dc396d622bc8f6ca39308c1b38" Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.090402 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5423ac8164e35d5ca6680cb382bc2e5a78ef47dc396d622bc8f6ca39308c1b38"} err="failed to get container status \"5423ac8164e35d5ca6680cb382bc2e5a78ef47dc396d622bc8f6ca39308c1b38\": rpc error: code = NotFound desc = could not find container \"5423ac8164e35d5ca6680cb382bc2e5a78ef47dc396d622bc8f6ca39308c1b38\": container with ID starting with 5423ac8164e35d5ca6680cb382bc2e5a78ef47dc396d622bc8f6ca39308c1b38 not found: ID does not exist" Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.090428 4973 scope.go:117] "RemoveContainer" containerID="3642b11840939d4270f26681d53b9c33612806493724759b8fc3a7023cda4b3e" Mar 20 14:54:04 crc kubenswrapper[4973]: E0320 14:54:04.090986 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3642b11840939d4270f26681d53b9c33612806493724759b8fc3a7023cda4b3e\": container with ID starting with 
3642b11840939d4270f26681d53b9c33612806493724759b8fc3a7023cda4b3e not found: ID does not exist" containerID="3642b11840939d4270f26681d53b9c33612806493724759b8fc3a7023cda4b3e" Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.091074 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3642b11840939d4270f26681d53b9c33612806493724759b8fc3a7023cda4b3e"} err="failed to get container status \"3642b11840939d4270f26681d53b9c33612806493724759b8fc3a7023cda4b3e\": rpc error: code = NotFound desc = could not find container \"3642b11840939d4270f26681d53b9c33612806493724759b8fc3a7023cda4b3e\": container with ID starting with 3642b11840939d4270f26681d53b9c33612806493724759b8fc3a7023cda4b3e not found: ID does not exist" Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.091188 4973 scope.go:117] "RemoveContainer" containerID="ab7d4237b313f91c1869c9503d0474a6b5d0ad99900d67ebce9b4c71d0065e85" Mar 20 14:54:04 crc kubenswrapper[4973]: E0320 14:54:04.092627 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab7d4237b313f91c1869c9503d0474a6b5d0ad99900d67ebce9b4c71d0065e85\": container with ID starting with ab7d4237b313f91c1869c9503d0474a6b5d0ad99900d67ebce9b4c71d0065e85 not found: ID does not exist" containerID="ab7d4237b313f91c1869c9503d0474a6b5d0ad99900d67ebce9b4c71d0065e85" Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.092765 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab7d4237b313f91c1869c9503d0474a6b5d0ad99900d67ebce9b4c71d0065e85"} err="failed to get container status \"ab7d4237b313f91c1869c9503d0474a6b5d0ad99900d67ebce9b4c71d0065e85\": rpc error: code = NotFound desc = could not find container \"ab7d4237b313f91c1869c9503d0474a6b5d0ad99900d67ebce9b4c71d0065e85\": container with ID starting with ab7d4237b313f91c1869c9503d0474a6b5d0ad99900d67ebce9b4c71d0065e85 not found: ID does not 
exist" Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.151479 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-qxmg4/must-gather-b9cmm"] Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.151778 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-qxmg4/must-gather-b9cmm" podUID="f5c0588a-6f96-4145-84f6-488007b3b05a" containerName="copy" containerID="cri-o://096b5d7e17d7dd4db535d894e23efafb3f3e670c12ffeeec84b0df45185df7c3" gracePeriod=2 Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.165119 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-qxmg4/must-gather-b9cmm"] Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.665593 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qxmg4_must-gather-b9cmm_f5c0588a-6f96-4145-84f6-488007b3b05a/copy/0.log" Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.666688 4973 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qxmg4/must-gather-b9cmm" Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.728740 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcxsl\" (UniqueName: \"kubernetes.io/projected/f5c0588a-6f96-4145-84f6-488007b3b05a-kube-api-access-dcxsl\") pod \"f5c0588a-6f96-4145-84f6-488007b3b05a\" (UID: \"f5c0588a-6f96-4145-84f6-488007b3b05a\") " Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.729287 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f5c0588a-6f96-4145-84f6-488007b3b05a-must-gather-output\") pod \"f5c0588a-6f96-4145-84f6-488007b3b05a\" (UID: \"f5c0588a-6f96-4145-84f6-488007b3b05a\") " Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.734762 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5c0588a-6f96-4145-84f6-488007b3b05a-kube-api-access-dcxsl" (OuterVolumeSpecName: "kube-api-access-dcxsl") pod "f5c0588a-6f96-4145-84f6-488007b3b05a" (UID: "f5c0588a-6f96-4145-84f6-488007b3b05a"). InnerVolumeSpecName "kube-api-access-dcxsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.832112 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcxsl\" (UniqueName: \"kubernetes.io/projected/f5c0588a-6f96-4145-84f6-488007b3b05a-kube-api-access-dcxsl\") on node \"crc\" DevicePath \"\"" Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.946411 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5c0588a-6f96-4145-84f6-488007b3b05a-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "f5c0588a-6f96-4145-84f6-488007b3b05a" (UID: "f5c0588a-6f96-4145-84f6-488007b3b05a"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.965706 4973 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-qxmg4_must-gather-b9cmm_f5c0588a-6f96-4145-84f6-488007b3b05a/copy/0.log" Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.966617 4973 generic.go:334] "Generic (PLEG): container finished" podID="f5c0588a-6f96-4145-84f6-488007b3b05a" containerID="096b5d7e17d7dd4db535d894e23efafb3f3e670c12ffeeec84b0df45185df7c3" exitCode=143 Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.966698 4973 scope.go:117] "RemoveContainer" containerID="096b5d7e17d7dd4db535d894e23efafb3f3e670c12ffeeec84b0df45185df7c3" Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.966844 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qxmg4/must-gather-b9cmm" Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.976016 4973 generic.go:334] "Generic (PLEG): container finished" podID="ce4e6cce-8c25-4a59-b261-1c5a23f3456b" containerID="59725975ee9c53003d1d5c0d00dd9391c95a8ab252482444200d23ae51fedeae" exitCode=0 Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.976087 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566974-254sd" event={"ID":"ce4e6cce-8c25-4a59-b261-1c5a23f3456b","Type":"ContainerDied","Data":"59725975ee9c53003d1d5c0d00dd9391c95a8ab252482444200d23ae51fedeae"} Mar 20 14:54:04 crc kubenswrapper[4973]: I0320 14:54:04.994222 4973 scope.go:117] "RemoveContainer" containerID="1142170c94b4fa736ea77dca698b5e88f957482bf3ea318150cad610c6fd18a5" Mar 20 14:54:05 crc kubenswrapper[4973]: I0320 14:54:05.037688 4973 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f5c0588a-6f96-4145-84f6-488007b3b05a-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 14:54:05 crc kubenswrapper[4973]: I0320 
14:54:05.049590 4973 scope.go:117] "RemoveContainer" containerID="096b5d7e17d7dd4db535d894e23efafb3f3e670c12ffeeec84b0df45185df7c3" Mar 20 14:54:05 crc kubenswrapper[4973]: E0320 14:54:05.050180 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"096b5d7e17d7dd4db535d894e23efafb3f3e670c12ffeeec84b0df45185df7c3\": container with ID starting with 096b5d7e17d7dd4db535d894e23efafb3f3e670c12ffeeec84b0df45185df7c3 not found: ID does not exist" containerID="096b5d7e17d7dd4db535d894e23efafb3f3e670c12ffeeec84b0df45185df7c3" Mar 20 14:54:05 crc kubenswrapper[4973]: I0320 14:54:05.050226 4973 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"096b5d7e17d7dd4db535d894e23efafb3f3e670c12ffeeec84b0df45185df7c3"} err="failed to get container status \"096b5d7e17d7dd4db535d894e23efafb3f3e670c12ffeeec84b0df45185df7c3\": rpc error: code = NotFound desc = could not find container \"096b5d7e17d7dd4db535d894e23efafb3f3e670c12ffeeec84b0df45185df7c3\": container with ID starting with 096b5d7e17d7dd4db535d894e23efafb3f3e670c12ffeeec84b0df45185df7c3 not found: ID does not exist" Mar 20 14:54:05 crc kubenswrapper[4973]: I0320 14:54:05.050259 4973 scope.go:117] "RemoveContainer" containerID="1142170c94b4fa736ea77dca698b5e88f957482bf3ea318150cad610c6fd18a5" Mar 20 14:54:05 crc kubenswrapper[4973]: E0320 14:54:05.050533 4973 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1142170c94b4fa736ea77dca698b5e88f957482bf3ea318150cad610c6fd18a5\": container with ID starting with 1142170c94b4fa736ea77dca698b5e88f957482bf3ea318150cad610c6fd18a5 not found: ID does not exist" containerID="1142170c94b4fa736ea77dca698b5e88f957482bf3ea318150cad610c6fd18a5" Mar 20 14:54:05 crc kubenswrapper[4973]: I0320 14:54:05.050565 4973 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1142170c94b4fa736ea77dca698b5e88f957482bf3ea318150cad610c6fd18a5"} err="failed to get container status \"1142170c94b4fa736ea77dca698b5e88f957482bf3ea318150cad610c6fd18a5\": rpc error: code = NotFound desc = could not find container \"1142170c94b4fa736ea77dca698b5e88f957482bf3ea318150cad610c6fd18a5\": container with ID starting with 1142170c94b4fa736ea77dca698b5e88f957482bf3ea318150cad610c6fd18a5 not found: ID does not exist" Mar 20 14:54:05 crc kubenswrapper[4973]: I0320 14:54:05.964305 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2428e895-d676-4ae1-850f-224b6fc5632b" path="/var/lib/kubelet/pods/2428e895-d676-4ae1-850f-224b6fc5632b/volumes" Mar 20 14:54:05 crc kubenswrapper[4973]: I0320 14:54:05.965437 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5c0588a-6f96-4145-84f6-488007b3b05a" path="/var/lib/kubelet/pods/f5c0588a-6f96-4145-84f6-488007b3b05a/volumes" Mar 20 14:54:06 crc kubenswrapper[4973]: I0320 14:54:06.471659 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566974-254sd" Mar 20 14:54:06 crc kubenswrapper[4973]: I0320 14:54:06.573306 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8przx\" (UniqueName: \"kubernetes.io/projected/ce4e6cce-8c25-4a59-b261-1c5a23f3456b-kube-api-access-8przx\") pod \"ce4e6cce-8c25-4a59-b261-1c5a23f3456b\" (UID: \"ce4e6cce-8c25-4a59-b261-1c5a23f3456b\") " Mar 20 14:54:06 crc kubenswrapper[4973]: I0320 14:54:06.589812 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce4e6cce-8c25-4a59-b261-1c5a23f3456b-kube-api-access-8przx" (OuterVolumeSpecName: "kube-api-access-8przx") pod "ce4e6cce-8c25-4a59-b261-1c5a23f3456b" (UID: "ce4e6cce-8c25-4a59-b261-1c5a23f3456b"). InnerVolumeSpecName "kube-api-access-8przx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:54:06 crc kubenswrapper[4973]: I0320 14:54:06.677778 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8przx\" (UniqueName: \"kubernetes.io/projected/ce4e6cce-8c25-4a59-b261-1c5a23f3456b-kube-api-access-8przx\") on node \"crc\" DevicePath \"\"" Mar 20 14:54:07 crc kubenswrapper[4973]: I0320 14:54:06.999751 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566974-254sd" event={"ID":"ce4e6cce-8c25-4a59-b261-1c5a23f3456b","Type":"ContainerDied","Data":"a19e376f4b4e8df37ffbce6f45968b4f566c2739a6204944c56d25ec13d7e62a"} Mar 20 14:54:07 crc kubenswrapper[4973]: I0320 14:54:07.000153 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a19e376f4b4e8df37ffbce6f45968b4f566c2739a6204944c56d25ec13d7e62a" Mar 20 14:54:07 crc kubenswrapper[4973]: I0320 14:54:06.999857 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566974-254sd" Mar 20 14:54:07 crc kubenswrapper[4973]: I0320 14:54:07.040036 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566968-v4vj6"] Mar 20 14:54:07 crc kubenswrapper[4973]: I0320 14:54:07.053299 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566968-v4vj6"] Mar 20 14:54:07 crc kubenswrapper[4973]: I0320 14:54:07.968686 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3982fe5-3325-4510-be69-024ac254cc8e" path="/var/lib/kubelet/pods/e3982fe5-3325-4510-be69-024ac254cc8e/volumes" Mar 20 14:54:13 crc kubenswrapper[4973]: I0320 14:54:13.321671 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: 
connection refused" start-of-body= Mar 20 14:54:13 crc kubenswrapper[4973]: I0320 14:54:13.322739 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:54:43 crc kubenswrapper[4973]: I0320 14:54:43.327509 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:54:43 crc kubenswrapper[4973]: I0320 14:54:43.328497 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:55:06 crc kubenswrapper[4973]: I0320 14:55:06.803929 4973 scope.go:117] "RemoveContainer" containerID="24161947c1d5f03ecc35f03ac4bd3e7e9cef313f7a61ef878828c7c11eb089de" Mar 20 14:55:13 crc kubenswrapper[4973]: I0320 14:55:13.320923 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:55:13 crc kubenswrapper[4973]: I0320 14:55:13.321462 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:55:13 crc kubenswrapper[4973]: I0320 14:55:13.321502 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 14:55:13 crc kubenswrapper[4973]: I0320 14:55:13.322558 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b37569a1f4c446b4098a5730225f25c6ac53e44f17daca0b84cf263f39f0c7c7"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:55:13 crc kubenswrapper[4973]: I0320 14:55:13.322607 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" containerID="cri-o://b37569a1f4c446b4098a5730225f25c6ac53e44f17daca0b84cf263f39f0c7c7" gracePeriod=600 Mar 20 14:55:13 crc kubenswrapper[4973]: I0320 14:55:13.751058 4973 generic.go:334] "Generic (PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="b37569a1f4c446b4098a5730225f25c6ac53e44f17daca0b84cf263f39f0c7c7" exitCode=0 Mar 20 14:55:13 crc kubenswrapper[4973]: I0320 14:55:13.751131 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"b37569a1f4c446b4098a5730225f25c6ac53e44f17daca0b84cf263f39f0c7c7"} Mar 20 14:55:13 crc kubenswrapper[4973]: I0320 14:55:13.751421 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" 
event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerStarted","Data":"ae987235ba54d62052117d63cefdb9b2cc4548bec68e9b044844fc0ac21a1742"} Mar 20 14:55:13 crc kubenswrapper[4973]: I0320 14:55:13.751448 4973 scope.go:117] "RemoveContainer" containerID="3ad77e7d0166c1175600e08c20b0b483368541eac47f376db08ef3bf2dc99790" Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.148008 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566976-zcdtj"] Mar 20 14:56:00 crc kubenswrapper[4973]: E0320 14:56:00.149142 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c0588a-6f96-4145-84f6-488007b3b05a" containerName="copy" Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.149159 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c0588a-6f96-4145-84f6-488007b3b05a" containerName="copy" Mar 20 14:56:00 crc kubenswrapper[4973]: E0320 14:56:00.149171 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce4e6cce-8c25-4a59-b261-1c5a23f3456b" containerName="oc" Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.149179 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce4e6cce-8c25-4a59-b261-1c5a23f3456b" containerName="oc" Mar 20 14:56:00 crc kubenswrapper[4973]: E0320 14:56:00.149190 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2428e895-d676-4ae1-850f-224b6fc5632b" containerName="extract-utilities" Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.149198 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="2428e895-d676-4ae1-850f-224b6fc5632b" containerName="extract-utilities" Mar 20 14:56:00 crc kubenswrapper[4973]: E0320 14:56:00.149215 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c0588a-6f96-4145-84f6-488007b3b05a" containerName="gather" Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.149221 4973 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f5c0588a-6f96-4145-84f6-488007b3b05a" containerName="gather" Mar 20 14:56:00 crc kubenswrapper[4973]: E0320 14:56:00.149233 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2428e895-d676-4ae1-850f-224b6fc5632b" containerName="extract-content" Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.149239 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="2428e895-d676-4ae1-850f-224b6fc5632b" containerName="extract-content" Mar 20 14:56:00 crc kubenswrapper[4973]: E0320 14:56:00.149254 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2428e895-d676-4ae1-850f-224b6fc5632b" containerName="registry-server" Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.149260 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="2428e895-d676-4ae1-850f-224b6fc5632b" containerName="registry-server" Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.149598 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="2428e895-d676-4ae1-850f-224b6fc5632b" containerName="registry-server" Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.149622 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce4e6cce-8c25-4a59-b261-1c5a23f3456b" containerName="oc" Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.149639 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5c0588a-6f96-4145-84f6-488007b3b05a" containerName="copy" Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.149653 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5c0588a-6f96-4145-84f6-488007b3b05a" containerName="gather" Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.150722 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566976-zcdtj" Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.152524 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.154683 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.154725 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.163953 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566976-zcdtj"] Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.301169 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvxkr\" (UniqueName: \"kubernetes.io/projected/4bce8a06-1469-4496-8d8e-15d765360325-kube-api-access-pvxkr\") pod \"auto-csr-approver-29566976-zcdtj\" (UID: \"4bce8a06-1469-4496-8d8e-15d765360325\") " pod="openshift-infra/auto-csr-approver-29566976-zcdtj" Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.403816 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvxkr\" (UniqueName: \"kubernetes.io/projected/4bce8a06-1469-4496-8d8e-15d765360325-kube-api-access-pvxkr\") pod \"auto-csr-approver-29566976-zcdtj\" (UID: \"4bce8a06-1469-4496-8d8e-15d765360325\") " pod="openshift-infra/auto-csr-approver-29566976-zcdtj" Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.428554 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvxkr\" (UniqueName: \"kubernetes.io/projected/4bce8a06-1469-4496-8d8e-15d765360325-kube-api-access-pvxkr\") pod \"auto-csr-approver-29566976-zcdtj\" (UID: \"4bce8a06-1469-4496-8d8e-15d765360325\") " 
pod="openshift-infra/auto-csr-approver-29566976-zcdtj" Mar 20 14:56:00 crc kubenswrapper[4973]: I0320 14:56:00.495276 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566976-zcdtj" Mar 20 14:56:01 crc kubenswrapper[4973]: I0320 14:56:01.181052 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566976-zcdtj"] Mar 20 14:56:01 crc kubenswrapper[4973]: I0320 14:56:01.261003 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566976-zcdtj" event={"ID":"4bce8a06-1469-4496-8d8e-15d765360325","Type":"ContainerStarted","Data":"e0cb8808ac7ffb307b6cbb550bfc046e75abf16955f7edf12b05566ea1c68e29"} Mar 20 14:56:03 crc kubenswrapper[4973]: I0320 14:56:03.293719 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566976-zcdtj" event={"ID":"4bce8a06-1469-4496-8d8e-15d765360325","Type":"ContainerStarted","Data":"7f1a4fe4c38110e8af0bede552b027223d6e4e592174ff0729d49f86fa793129"} Mar 20 14:56:03 crc kubenswrapper[4973]: I0320 14:56:03.320844 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566976-zcdtj" podStartSLOduration=2.1769656 podStartE2EDuration="3.320820955s" podCreationTimestamp="2026-03-20 14:56:00 +0000 UTC" firstStartedPulling="2026-03-20 14:56:01.194954704 +0000 UTC m=+5681.938624448" lastFinishedPulling="2026-03-20 14:56:02.338810059 +0000 UTC m=+5683.082479803" observedRunningTime="2026-03-20 14:56:03.310537354 +0000 UTC m=+5684.054207098" watchObservedRunningTime="2026-03-20 14:56:03.320820955 +0000 UTC m=+5684.064490689" Mar 20 14:56:05 crc kubenswrapper[4973]: I0320 14:56:05.315256 4973 generic.go:334] "Generic (PLEG): container finished" podID="4bce8a06-1469-4496-8d8e-15d765360325" containerID="7f1a4fe4c38110e8af0bede552b027223d6e4e592174ff0729d49f86fa793129" exitCode=0 Mar 20 14:56:05 crc 
kubenswrapper[4973]: I0320 14:56:05.315373 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566976-zcdtj" event={"ID":"4bce8a06-1469-4496-8d8e-15d765360325","Type":"ContainerDied","Data":"7f1a4fe4c38110e8af0bede552b027223d6e4e592174ff0729d49f86fa793129"} Mar 20 14:56:06 crc kubenswrapper[4973]: I0320 14:56:06.798503 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566976-zcdtj" Mar 20 14:56:06 crc kubenswrapper[4973]: I0320 14:56:06.856690 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvxkr\" (UniqueName: \"kubernetes.io/projected/4bce8a06-1469-4496-8d8e-15d765360325-kube-api-access-pvxkr\") pod \"4bce8a06-1469-4496-8d8e-15d765360325\" (UID: \"4bce8a06-1469-4496-8d8e-15d765360325\") " Mar 20 14:56:06 crc kubenswrapper[4973]: I0320 14:56:06.865942 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bce8a06-1469-4496-8d8e-15d765360325-kube-api-access-pvxkr" (OuterVolumeSpecName: "kube-api-access-pvxkr") pod "4bce8a06-1469-4496-8d8e-15d765360325" (UID: "4bce8a06-1469-4496-8d8e-15d765360325"). InnerVolumeSpecName "kube-api-access-pvxkr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:56:06 crc kubenswrapper[4973]: I0320 14:56:06.959109 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvxkr\" (UniqueName: \"kubernetes.io/projected/4bce8a06-1469-4496-8d8e-15d765360325-kube-api-access-pvxkr\") on node \"crc\" DevicePath \"\"" Mar 20 14:56:07 crc kubenswrapper[4973]: I0320 14:56:07.337609 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566976-zcdtj" event={"ID":"4bce8a06-1469-4496-8d8e-15d765360325","Type":"ContainerDied","Data":"e0cb8808ac7ffb307b6cbb550bfc046e75abf16955f7edf12b05566ea1c68e29"} Mar 20 14:56:07 crc kubenswrapper[4973]: I0320 14:56:07.337651 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e0cb8808ac7ffb307b6cbb550bfc046e75abf16955f7edf12b05566ea1c68e29" Mar 20 14:56:07 crc kubenswrapper[4973]: I0320 14:56:07.337704 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566976-zcdtj" Mar 20 14:56:07 crc kubenswrapper[4973]: I0320 14:56:07.404019 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566970-6hdtv"] Mar 20 14:56:07 crc kubenswrapper[4973]: I0320 14:56:07.415602 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566970-6hdtv"] Mar 20 14:56:07 crc kubenswrapper[4973]: I0320 14:56:07.963573 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a43e8c15-2110-4352-b8b1-1919544f46d9" path="/var/lib/kubelet/pods/a43e8c15-2110-4352-b8b1-1919544f46d9/volumes" Mar 20 14:57:06 crc kubenswrapper[4973]: I0320 14:57:06.937193 4973 scope.go:117] "RemoveContainer" containerID="ec2c1078736b8bf082f871c8d7cc0d6c31ff68b09ce60c4cfd64a252b474ff62" Mar 20 14:57:13 crc kubenswrapper[4973]: I0320 14:57:13.321567 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:57:13 crc kubenswrapper[4973]: I0320 14:57:13.322874 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:57:43 crc kubenswrapper[4973]: I0320 14:57:43.320365 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:57:43 crc kubenswrapper[4973]: I0320 14:57:43.320969 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:58:00 crc kubenswrapper[4973]: I0320 14:58:00.143368 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566978-k6q7q"] Mar 20 14:58:00 crc kubenswrapper[4973]: E0320 14:58:00.144188 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bce8a06-1469-4496-8d8e-15d765360325" containerName="oc" Mar 20 14:58:00 crc kubenswrapper[4973]: I0320 14:58:00.144200 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bce8a06-1469-4496-8d8e-15d765360325" containerName="oc" Mar 20 14:58:00 crc kubenswrapper[4973]: I0320 14:58:00.144443 4973 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="4bce8a06-1469-4496-8d8e-15d765360325" containerName="oc" Mar 20 14:58:00 crc kubenswrapper[4973]: I0320 14:58:00.149850 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566978-k6q7q" Mar 20 14:58:00 crc kubenswrapper[4973]: I0320 14:58:00.152368 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 14:58:00 crc kubenswrapper[4973]: I0320 14:58:00.152728 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 14:58:00 crc kubenswrapper[4973]: I0320 14:58:00.153575 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 14:58:00 crc kubenswrapper[4973]: I0320 14:58:00.214760 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566978-k6q7q"] Mar 20 14:58:00 crc kubenswrapper[4973]: I0320 14:58:00.268991 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhjhn\" (UniqueName: \"kubernetes.io/projected/d64bc6aa-e903-430b-a045-27ba55556bab-kube-api-access-fhjhn\") pod \"auto-csr-approver-29566978-k6q7q\" (UID: \"d64bc6aa-e903-430b-a045-27ba55556bab\") " pod="openshift-infra/auto-csr-approver-29566978-k6q7q" Mar 20 14:58:00 crc kubenswrapper[4973]: I0320 14:58:00.371476 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhjhn\" (UniqueName: \"kubernetes.io/projected/d64bc6aa-e903-430b-a045-27ba55556bab-kube-api-access-fhjhn\") pod \"auto-csr-approver-29566978-k6q7q\" (UID: \"d64bc6aa-e903-430b-a045-27ba55556bab\") " pod="openshift-infra/auto-csr-approver-29566978-k6q7q" Mar 20 14:58:00 crc kubenswrapper[4973]: I0320 14:58:00.402958 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fhjhn\" (UniqueName: \"kubernetes.io/projected/d64bc6aa-e903-430b-a045-27ba55556bab-kube-api-access-fhjhn\") pod \"auto-csr-approver-29566978-k6q7q\" (UID: \"d64bc6aa-e903-430b-a045-27ba55556bab\") " pod="openshift-infra/auto-csr-approver-29566978-k6q7q" Mar 20 14:58:00 crc kubenswrapper[4973]: I0320 14:58:00.468890 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566978-k6q7q" Mar 20 14:58:01 crc kubenswrapper[4973]: I0320 14:58:01.072553 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566978-k6q7q"] Mar 20 14:58:01 crc kubenswrapper[4973]: I0320 14:58:01.663401 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566978-k6q7q" event={"ID":"d64bc6aa-e903-430b-a045-27ba55556bab","Type":"ContainerStarted","Data":"d0e2760232f84775f41a863563de8865a4d5bbe1b664bcfaa13be7f3bee783aa"} Mar 20 14:58:02 crc kubenswrapper[4973]: I0320 14:58:02.687829 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566978-k6q7q" event={"ID":"d64bc6aa-e903-430b-a045-27ba55556bab","Type":"ContainerStarted","Data":"d0400feba6c93c73c94806b9707e4f118c9e205243edcb1321fc4f9d217c6fe2"} Mar 20 14:58:02 crc kubenswrapper[4973]: I0320 14:58:02.714554 4973 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566978-k6q7q" podStartSLOduration=1.853995088 podStartE2EDuration="2.71453412s" podCreationTimestamp="2026-03-20 14:58:00 +0000 UTC" firstStartedPulling="2026-03-20 14:58:01.292215993 +0000 UTC m=+5802.035885737" lastFinishedPulling="2026-03-20 14:58:02.152755035 +0000 UTC m=+5802.896424769" observedRunningTime="2026-03-20 14:58:02.705285617 +0000 UTC m=+5803.448955351" watchObservedRunningTime="2026-03-20 14:58:02.71453412 +0000 UTC m=+5803.458203864" Mar 20 14:58:03 crc kubenswrapper[4973]: I0320 14:58:03.699897 4973 
generic.go:334] "Generic (PLEG): container finished" podID="d64bc6aa-e903-430b-a045-27ba55556bab" containerID="d0400feba6c93c73c94806b9707e4f118c9e205243edcb1321fc4f9d217c6fe2" exitCode=0 Mar 20 14:58:03 crc kubenswrapper[4973]: I0320 14:58:03.699999 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566978-k6q7q" event={"ID":"d64bc6aa-e903-430b-a045-27ba55556bab","Type":"ContainerDied","Data":"d0400feba6c93c73c94806b9707e4f118c9e205243edcb1321fc4f9d217c6fe2"} Mar 20 14:58:05 crc kubenswrapper[4973]: I0320 14:58:05.145541 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566978-k6q7q" Mar 20 14:58:05 crc kubenswrapper[4973]: I0320 14:58:05.285839 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhjhn\" (UniqueName: \"kubernetes.io/projected/d64bc6aa-e903-430b-a045-27ba55556bab-kube-api-access-fhjhn\") pod \"d64bc6aa-e903-430b-a045-27ba55556bab\" (UID: \"d64bc6aa-e903-430b-a045-27ba55556bab\") " Mar 20 14:58:05 crc kubenswrapper[4973]: I0320 14:58:05.291894 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d64bc6aa-e903-430b-a045-27ba55556bab-kube-api-access-fhjhn" (OuterVolumeSpecName: "kube-api-access-fhjhn") pod "d64bc6aa-e903-430b-a045-27ba55556bab" (UID: "d64bc6aa-e903-430b-a045-27ba55556bab"). InnerVolumeSpecName "kube-api-access-fhjhn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 14:58:05 crc kubenswrapper[4973]: I0320 14:58:05.390041 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fhjhn\" (UniqueName: \"kubernetes.io/projected/d64bc6aa-e903-430b-a045-27ba55556bab-kube-api-access-fhjhn\") on node \"crc\" DevicePath \"\"" Mar 20 14:58:05 crc kubenswrapper[4973]: I0320 14:58:05.727258 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566978-k6q7q" event={"ID":"d64bc6aa-e903-430b-a045-27ba55556bab","Type":"ContainerDied","Data":"d0e2760232f84775f41a863563de8865a4d5bbe1b664bcfaa13be7f3bee783aa"} Mar 20 14:58:05 crc kubenswrapper[4973]: I0320 14:58:05.727309 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d0e2760232f84775f41a863563de8865a4d5bbe1b664bcfaa13be7f3bee783aa" Mar 20 14:58:05 crc kubenswrapper[4973]: I0320 14:58:05.727962 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566978-k6q7q" Mar 20 14:58:05 crc kubenswrapper[4973]: I0320 14:58:05.781500 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566972-rv4dc"] Mar 20 14:58:05 crc kubenswrapper[4973]: I0320 14:58:05.798223 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566972-rv4dc"] Mar 20 14:58:05 crc kubenswrapper[4973]: I0320 14:58:05.963489 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a64a74b-6c75-4ce5-909f-938ab3a9737a" path="/var/lib/kubelet/pods/2a64a74b-6c75-4ce5-909f-938ab3a9737a/volumes" Mar 20 14:58:07 crc kubenswrapper[4973]: I0320 14:58:07.030932 4973 scope.go:117] "RemoveContainer" containerID="b40936497e71824e7559c6c913763aeea917309a4f83e8418b3ee128ec803c47" Mar 20 14:58:13 crc kubenswrapper[4973]: I0320 14:58:13.320239 4973 patch_prober.go:28] interesting pod/machine-config-daemon-qlztx 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 14:58:13 crc kubenswrapper[4973]: I0320 14:58:13.321408 4973 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 14:58:13 crc kubenswrapper[4973]: I0320 14:58:13.321454 4973 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" Mar 20 14:58:13 crc kubenswrapper[4973]: I0320 14:58:13.322455 4973 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae987235ba54d62052117d63cefdb9b2cc4548bec68e9b044844fc0ac21a1742"} pod="openshift-machine-config-operator/machine-config-daemon-qlztx" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 14:58:13 crc kubenswrapper[4973]: I0320 14:58:13.322505 4973 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerName="machine-config-daemon" containerID="cri-o://ae987235ba54d62052117d63cefdb9b2cc4548bec68e9b044844fc0ac21a1742" gracePeriod=600 Mar 20 14:58:13 crc kubenswrapper[4973]: E0320 14:58:13.448371 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:58:13 crc kubenswrapper[4973]: I0320 14:58:13.810870 4973 generic.go:334] "Generic (PLEG): container finished" podID="70745a45-4eff-4e56-b9ab-efa4a7c83306" containerID="ae987235ba54d62052117d63cefdb9b2cc4548bec68e9b044844fc0ac21a1742" exitCode=0 Mar 20 14:58:13 crc kubenswrapper[4973]: I0320 14:58:13.810969 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" event={"ID":"70745a45-4eff-4e56-b9ab-efa4a7c83306","Type":"ContainerDied","Data":"ae987235ba54d62052117d63cefdb9b2cc4548bec68e9b044844fc0ac21a1742"} Mar 20 14:58:13 crc kubenswrapper[4973]: I0320 14:58:13.811256 4973 scope.go:117] "RemoveContainer" containerID="b37569a1f4c446b4098a5730225f25c6ac53e44f17daca0b84cf263f39f0c7c7" Mar 20 14:58:13 crc kubenswrapper[4973]: I0320 14:58:13.812264 4973 scope.go:117] "RemoveContainer" containerID="ae987235ba54d62052117d63cefdb9b2cc4548bec68e9b044844fc0ac21a1742" Mar 20 14:58:13 crc kubenswrapper[4973]: E0320 14:58:13.812916 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:58:28 crc kubenswrapper[4973]: I0320 14:58:28.951256 4973 scope.go:117] "RemoveContainer" containerID="ae987235ba54d62052117d63cefdb9b2cc4548bec68e9b044844fc0ac21a1742" Mar 20 14:58:28 crc kubenswrapper[4973]: E0320 14:58:28.952078 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:58:42 crc kubenswrapper[4973]: I0320 14:58:42.950989 4973 scope.go:117] "RemoveContainer" containerID="ae987235ba54d62052117d63cefdb9b2cc4548bec68e9b044844fc0ac21a1742" Mar 20 14:58:42 crc kubenswrapper[4973]: E0320 14:58:42.952040 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:58:54 crc kubenswrapper[4973]: I0320 14:58:54.951089 4973 scope.go:117] "RemoveContainer" containerID="ae987235ba54d62052117d63cefdb9b2cc4548bec68e9b044844fc0ac21a1742" Mar 20 14:58:54 crc kubenswrapper[4973]: E0320 14:58:54.951865 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:59:08 crc kubenswrapper[4973]: I0320 14:59:08.951224 4973 scope.go:117] "RemoveContainer" containerID="ae987235ba54d62052117d63cefdb9b2cc4548bec68e9b044844fc0ac21a1742" Mar 20 14:59:08 crc kubenswrapper[4973]: E0320 14:59:08.952230 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:59:20 crc kubenswrapper[4973]: I0320 14:59:20.950732 4973 scope.go:117] "RemoveContainer" containerID="ae987235ba54d62052117d63cefdb9b2cc4548bec68e9b044844fc0ac21a1742" Mar 20 14:59:20 crc kubenswrapper[4973]: E0320 14:59:20.952475 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:59:31 crc kubenswrapper[4973]: I0320 14:59:31.951213 4973 scope.go:117] "RemoveContainer" containerID="ae987235ba54d62052117d63cefdb9b2cc4548bec68e9b044844fc0ac21a1742" Mar 20 14:59:31 crc kubenswrapper[4973]: E0320 14:59:31.952087 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:59:42 crc kubenswrapper[4973]: I0320 14:59:42.951701 4973 scope.go:117] "RemoveContainer" containerID="ae987235ba54d62052117d63cefdb9b2cc4548bec68e9b044844fc0ac21a1742" Mar 20 14:59:42 crc kubenswrapper[4973]: E0320 14:59:42.952470 4973 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 14:59:57 crc kubenswrapper[4973]: I0320 14:59:57.951465 4973 scope.go:117] "RemoveContainer" containerID="ae987235ba54d62052117d63cefdb9b2cc4548bec68e9b044844fc0ac21a1742" Mar 20 14:59:57 crc kubenswrapper[4973]: E0320 14:59:57.952271 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.162439 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566980-rx8lt"] Mar 20 15:00:00 crc kubenswrapper[4973]: E0320 15:00:00.163503 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d64bc6aa-e903-430b-a045-27ba55556bab" containerName="oc" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.163520 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="d64bc6aa-e903-430b-a045-27ba55556bab" containerName="oc" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.163901 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="d64bc6aa-e903-430b-a045-27ba55556bab" containerName="oc" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.164936 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566980-rx8lt" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.168008 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-m8lkc" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.168393 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.168622 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.176177 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566980-9dbhm"] Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.178004 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-9dbhm" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.181932 4973 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.182234 4973 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.189564 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566980-rx8lt"] Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.203514 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566980-9dbhm"] Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.299548 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9n6m\" (UniqueName: 
\"kubernetes.io/projected/9b9f0bce-71f9-45b9-832e-5897b0a8d71d-kube-api-access-j9n6m\") pod \"auto-csr-approver-29566980-rx8lt\" (UID: \"9b9f0bce-71f9-45b9-832e-5897b0a8d71d\") " pod="openshift-infra/auto-csr-approver-29566980-rx8lt" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.299708 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35eafd12-efff-41d2-9f88-01e02d4c5b25-secret-volume\") pod \"collect-profiles-29566980-9dbhm\" (UID: \"35eafd12-efff-41d2-9f88-01e02d4c5b25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-9dbhm" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.299770 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfp82\" (UniqueName: \"kubernetes.io/projected/35eafd12-efff-41d2-9f88-01e02d4c5b25-kube-api-access-hfp82\") pod \"collect-profiles-29566980-9dbhm\" (UID: \"35eafd12-efff-41d2-9f88-01e02d4c5b25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-9dbhm" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.300292 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35eafd12-efff-41d2-9f88-01e02d4c5b25-config-volume\") pod \"collect-profiles-29566980-9dbhm\" (UID: \"35eafd12-efff-41d2-9f88-01e02d4c5b25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-9dbhm" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.404091 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35eafd12-efff-41d2-9f88-01e02d4c5b25-config-volume\") pod \"collect-profiles-29566980-9dbhm\" (UID: \"35eafd12-efff-41d2-9f88-01e02d4c5b25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-9dbhm" Mar 20 15:00:00 
crc kubenswrapper[4973]: I0320 15:00:00.404402 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9n6m\" (UniqueName: \"kubernetes.io/projected/9b9f0bce-71f9-45b9-832e-5897b0a8d71d-kube-api-access-j9n6m\") pod \"auto-csr-approver-29566980-rx8lt\" (UID: \"9b9f0bce-71f9-45b9-832e-5897b0a8d71d\") " pod="openshift-infra/auto-csr-approver-29566980-rx8lt" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.404530 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35eafd12-efff-41d2-9f88-01e02d4c5b25-secret-volume\") pod \"collect-profiles-29566980-9dbhm\" (UID: \"35eafd12-efff-41d2-9f88-01e02d4c5b25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-9dbhm" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.404566 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfp82\" (UniqueName: \"kubernetes.io/projected/35eafd12-efff-41d2-9f88-01e02d4c5b25-kube-api-access-hfp82\") pod \"collect-profiles-29566980-9dbhm\" (UID: \"35eafd12-efff-41d2-9f88-01e02d4c5b25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-9dbhm" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.406209 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35eafd12-efff-41d2-9f88-01e02d4c5b25-config-volume\") pod \"collect-profiles-29566980-9dbhm\" (UID: \"35eafd12-efff-41d2-9f88-01e02d4c5b25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-9dbhm" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.412222 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35eafd12-efff-41d2-9f88-01e02d4c5b25-secret-volume\") pod \"collect-profiles-29566980-9dbhm\" (UID: \"35eafd12-efff-41d2-9f88-01e02d4c5b25\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-9dbhm" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.427370 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfp82\" (UniqueName: \"kubernetes.io/projected/35eafd12-efff-41d2-9f88-01e02d4c5b25-kube-api-access-hfp82\") pod \"collect-profiles-29566980-9dbhm\" (UID: \"35eafd12-efff-41d2-9f88-01e02d4c5b25\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-9dbhm" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.428832 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9n6m\" (UniqueName: \"kubernetes.io/projected/9b9f0bce-71f9-45b9-832e-5897b0a8d71d-kube-api-access-j9n6m\") pod \"auto-csr-approver-29566980-rx8lt\" (UID: \"9b9f0bce-71f9-45b9-832e-5897b0a8d71d\") " pod="openshift-infra/auto-csr-approver-29566980-rx8lt" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.501447 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566980-rx8lt" Mar 20 15:00:00 crc kubenswrapper[4973]: I0320 15:00:00.527433 4973 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-9dbhm" Mar 20 15:00:01 crc kubenswrapper[4973]: I0320 15:00:01.060850 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566980-9dbhm"] Mar 20 15:00:01 crc kubenswrapper[4973]: I0320 15:00:01.110268 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566980-rx8lt"] Mar 20 15:00:01 crc kubenswrapper[4973]: W0320 15:00:01.116384 4973 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b9f0bce_71f9_45b9_832e_5897b0a8d71d.slice/crio-b9169be93b38701329e23b4b7316b17e6ae9af73113672f54d1b77787f70d418 WatchSource:0}: Error finding container b9169be93b38701329e23b4b7316b17e6ae9af73113672f54d1b77787f70d418: Status 404 returned error can't find the container with id b9169be93b38701329e23b4b7316b17e6ae9af73113672f54d1b77787f70d418 Mar 20 15:00:01 crc kubenswrapper[4973]: I0320 15:00:01.119111 4973 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 15:00:02 crc kubenswrapper[4973]: I0320 15:00:02.067211 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566980-rx8lt" event={"ID":"9b9f0bce-71f9-45b9-832e-5897b0a8d71d","Type":"ContainerStarted","Data":"b9169be93b38701329e23b4b7316b17e6ae9af73113672f54d1b77787f70d418"} Mar 20 15:00:02 crc kubenswrapper[4973]: I0320 15:00:02.068733 4973 generic.go:334] "Generic (PLEG): container finished" podID="35eafd12-efff-41d2-9f88-01e02d4c5b25" containerID="2eb83b938ea9865e527b2776c8e6330071cd8e3e94203ad40da199f4f7c8f230" exitCode=0 Mar 20 15:00:02 crc kubenswrapper[4973]: I0320 15:00:02.068790 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-9dbhm" 
event={"ID":"35eafd12-efff-41d2-9f88-01e02d4c5b25","Type":"ContainerDied","Data":"2eb83b938ea9865e527b2776c8e6330071cd8e3e94203ad40da199f4f7c8f230"}
Mar 20 15:00:02 crc kubenswrapper[4973]: I0320 15:00:02.068812 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-9dbhm" event={"ID":"35eafd12-efff-41d2-9f88-01e02d4c5b25","Type":"ContainerStarted","Data":"76a818496399226021dc576284565f1f48557da3ca047ba837a3f0615fe840d5"}
Mar 20 15:00:03 crc kubenswrapper[4973]: I0320 15:00:03.481991 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-9dbhm"
Mar 20 15:00:03 crc kubenswrapper[4973]: I0320 15:00:03.592276 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35eafd12-efff-41d2-9f88-01e02d4c5b25-config-volume\") pod \"35eafd12-efff-41d2-9f88-01e02d4c5b25\" (UID: \"35eafd12-efff-41d2-9f88-01e02d4c5b25\") "
Mar 20 15:00:03 crc kubenswrapper[4973]: I0320 15:00:03.592430 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfp82\" (UniqueName: \"kubernetes.io/projected/35eafd12-efff-41d2-9f88-01e02d4c5b25-kube-api-access-hfp82\") pod \"35eafd12-efff-41d2-9f88-01e02d4c5b25\" (UID: \"35eafd12-efff-41d2-9f88-01e02d4c5b25\") "
Mar 20 15:00:03 crc kubenswrapper[4973]: I0320 15:00:03.592819 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35eafd12-efff-41d2-9f88-01e02d4c5b25-secret-volume\") pod \"35eafd12-efff-41d2-9f88-01e02d4c5b25\" (UID: \"35eafd12-efff-41d2-9f88-01e02d4c5b25\") "
Mar 20 15:00:03 crc kubenswrapper[4973]: I0320 15:00:03.596036 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/35eafd12-efff-41d2-9f88-01e02d4c5b25-config-volume" (OuterVolumeSpecName: "config-volume") pod "35eafd12-efff-41d2-9f88-01e02d4c5b25" (UID: "35eafd12-efff-41d2-9f88-01e02d4c5b25"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 15:00:03 crc kubenswrapper[4973]: I0320 15:00:03.605516 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35eafd12-efff-41d2-9f88-01e02d4c5b25-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "35eafd12-efff-41d2-9f88-01e02d4c5b25" (UID: "35eafd12-efff-41d2-9f88-01e02d4c5b25"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 15:00:03 crc kubenswrapper[4973]: I0320 15:00:03.611684 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35eafd12-efff-41d2-9f88-01e02d4c5b25-kube-api-access-hfp82" (OuterVolumeSpecName: "kube-api-access-hfp82") pod "35eafd12-efff-41d2-9f88-01e02d4c5b25" (UID: "35eafd12-efff-41d2-9f88-01e02d4c5b25"). InnerVolumeSpecName "kube-api-access-hfp82". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:00:03 crc kubenswrapper[4973]: I0320 15:00:03.698139 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfp82\" (UniqueName: \"kubernetes.io/projected/35eafd12-efff-41d2-9f88-01e02d4c5b25-kube-api-access-hfp82\") on node \"crc\" DevicePath \"\""
Mar 20 15:00:03 crc kubenswrapper[4973]: I0320 15:00:03.698703 4973 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/35eafd12-efff-41d2-9f88-01e02d4c5b25-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 15:00:03 crc kubenswrapper[4973]: I0320 15:00:03.698717 4973 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/35eafd12-efff-41d2-9f88-01e02d4c5b25-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 15:00:04 crc kubenswrapper[4973]: I0320 15:00:04.096767 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-9dbhm" event={"ID":"35eafd12-efff-41d2-9f88-01e02d4c5b25","Type":"ContainerDied","Data":"76a818496399226021dc576284565f1f48557da3ca047ba837a3f0615fe840d5"}
Mar 20 15:00:04 crc kubenswrapper[4973]: I0320 15:00:04.096820 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="76a818496399226021dc576284565f1f48557da3ca047ba837a3f0615fe840d5"
Mar 20 15:00:04 crc kubenswrapper[4973]: I0320 15:00:04.096880 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566980-9dbhm"
Mar 20 15:00:04 crc kubenswrapper[4973]: I0320 15:00:04.586197 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv"]
Mar 20 15:00:04 crc kubenswrapper[4973]: I0320 15:00:04.599017 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566935-6xxvv"]
Mar 20 15:00:05 crc kubenswrapper[4973]: I0320 15:00:05.970515 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b2aa00d-07d2-41d5-9fc7-9770039b5663" path="/var/lib/kubelet/pods/2b2aa00d-07d2-41d5-9fc7-9770039b5663/volumes"
Mar 20 15:00:07 crc kubenswrapper[4973]: I0320 15:00:07.161706 4973 scope.go:117] "RemoveContainer" containerID="6bd82e091e588f69ebed24d4151858259ca48c2563e430ccd30539816b335bdf"
Mar 20 15:00:09 crc kubenswrapper[4973]: I0320 15:00:09.157298 4973 generic.go:334] "Generic (PLEG): container finished" podID="9b9f0bce-71f9-45b9-832e-5897b0a8d71d" containerID="5b86ba45d5ab3c50663c2685ad4262416abc4d09c89ee463b061469b8820e554" exitCode=0
Mar 20 15:00:09 crc kubenswrapper[4973]: I0320 15:00:09.157382 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566980-rx8lt" event={"ID":"9b9f0bce-71f9-45b9-832e-5897b0a8d71d","Type":"ContainerDied","Data":"5b86ba45d5ab3c50663c2685ad4262416abc4d09c89ee463b061469b8820e554"}
Mar 20 15:00:10 crc kubenswrapper[4973]: I0320 15:00:10.632756 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566980-rx8lt"
Mar 20 15:00:10 crc kubenswrapper[4973]: I0320 15:00:10.711047 4973 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9n6m\" (UniqueName: \"kubernetes.io/projected/9b9f0bce-71f9-45b9-832e-5897b0a8d71d-kube-api-access-j9n6m\") pod \"9b9f0bce-71f9-45b9-832e-5897b0a8d71d\" (UID: \"9b9f0bce-71f9-45b9-832e-5897b0a8d71d\") "
Mar 20 15:00:10 crc kubenswrapper[4973]: I0320 15:00:10.719744 4973 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b9f0bce-71f9-45b9-832e-5897b0a8d71d-kube-api-access-j9n6m" (OuterVolumeSpecName: "kube-api-access-j9n6m") pod "9b9f0bce-71f9-45b9-832e-5897b0a8d71d" (UID: "9b9f0bce-71f9-45b9-832e-5897b0a8d71d"). InnerVolumeSpecName "kube-api-access-j9n6m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 15:00:10 crc kubenswrapper[4973]: I0320 15:00:10.815792 4973 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9n6m\" (UniqueName: \"kubernetes.io/projected/9b9f0bce-71f9-45b9-832e-5897b0a8d71d-kube-api-access-j9n6m\") on node \"crc\" DevicePath \"\""
Mar 20 15:00:10 crc kubenswrapper[4973]: I0320 15:00:10.951353 4973 scope.go:117] "RemoveContainer" containerID="ae987235ba54d62052117d63cefdb9b2cc4548bec68e9b044844fc0ac21a1742"
Mar 20 15:00:10 crc kubenswrapper[4973]: E0320 15:00:10.951800 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 15:00:11 crc kubenswrapper[4973]: I0320 15:00:11.187998 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566980-rx8lt" event={"ID":"9b9f0bce-71f9-45b9-832e-5897b0a8d71d","Type":"ContainerDied","Data":"b9169be93b38701329e23b4b7316b17e6ae9af73113672f54d1b77787f70d418"}
Mar 20 15:00:11 crc kubenswrapper[4973]: I0320 15:00:11.188063 4973 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9169be93b38701329e23b4b7316b17e6ae9af73113672f54d1b77787f70d418"
Mar 20 15:00:11 crc kubenswrapper[4973]: I0320 15:00:11.188085 4973 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566980-rx8lt"
Mar 20 15:00:11 crc kubenswrapper[4973]: I0320 15:00:11.703397 4973 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566974-254sd"]
Mar 20 15:00:11 crc kubenswrapper[4973]: I0320 15:00:11.738146 4973 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566974-254sd"]
Mar 20 15:00:11 crc kubenswrapper[4973]: I0320 15:00:11.973308 4973 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce4e6cce-8c25-4a59-b261-1c5a23f3456b" path="/var/lib/kubelet/pods/ce4e6cce-8c25-4a59-b261-1c5a23f3456b/volumes"
Mar 20 15:00:22 crc kubenswrapper[4973]: I0320 15:00:22.950954 4973 scope.go:117] "RemoveContainer" containerID="ae987235ba54d62052117d63cefdb9b2cc4548bec68e9b044844fc0ac21a1742"
Mar 20 15:00:22 crc kubenswrapper[4973]: E0320 15:00:22.951795 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 15:00:33 crc kubenswrapper[4973]: I0320 15:00:33.950772 4973 scope.go:117] "RemoveContainer" containerID="ae987235ba54d62052117d63cefdb9b2cc4548bec68e9b044844fc0ac21a1742"
Mar 20 15:00:33 crc kubenswrapper[4973]: E0320 15:00:33.951425 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"
Mar 20 15:00:46 crc kubenswrapper[4973]: I0320 15:00:46.045633 4973 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rzzdn"]
Mar 20 15:00:46 crc kubenswrapper[4973]: E0320 15:00:46.048196 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35eafd12-efff-41d2-9f88-01e02d4c5b25" containerName="collect-profiles"
Mar 20 15:00:46 crc kubenswrapper[4973]: I0320 15:00:46.048317 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="35eafd12-efff-41d2-9f88-01e02d4c5b25" containerName="collect-profiles"
Mar 20 15:00:46 crc kubenswrapper[4973]: E0320 15:00:46.048430 4973 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b9f0bce-71f9-45b9-832e-5897b0a8d71d" containerName="oc"
Mar 20 15:00:46 crc kubenswrapper[4973]: I0320 15:00:46.048513 4973 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b9f0bce-71f9-45b9-832e-5897b0a8d71d" containerName="oc"
Mar 20 15:00:46 crc kubenswrapper[4973]: I0320 15:00:46.048922 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b9f0bce-71f9-45b9-832e-5897b0a8d71d" containerName="oc"
Mar 20 15:00:46 crc kubenswrapper[4973]: I0320 15:00:46.049050 4973 memory_manager.go:354] "RemoveStaleState removing state" podUID="35eafd12-efff-41d2-9f88-01e02d4c5b25" containerName="collect-profiles"
Mar 20 15:00:46 crc kubenswrapper[4973]: I0320 15:00:46.052780 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rzzdn"
Mar 20 15:00:46 crc kubenswrapper[4973]: I0320 15:00:46.065748 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rzzdn"]
Mar 20 15:00:46 crc kubenswrapper[4973]: I0320 15:00:46.113124 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rts8x\" (UniqueName: \"kubernetes.io/projected/e6d90470-e8bc-4b90-8261-f22045f5f10f-kube-api-access-rts8x\") pod \"redhat-marketplace-rzzdn\" (UID: \"e6d90470-e8bc-4b90-8261-f22045f5f10f\") " pod="openshift-marketplace/redhat-marketplace-rzzdn"
Mar 20 15:00:46 crc kubenswrapper[4973]: I0320 15:00:46.113497 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6d90470-e8bc-4b90-8261-f22045f5f10f-catalog-content\") pod \"redhat-marketplace-rzzdn\" (UID: \"e6d90470-e8bc-4b90-8261-f22045f5f10f\") " pod="openshift-marketplace/redhat-marketplace-rzzdn"
Mar 20 15:00:46 crc kubenswrapper[4973]: I0320 15:00:46.114093 4973 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6d90470-e8bc-4b90-8261-f22045f5f10f-utilities\") pod \"redhat-marketplace-rzzdn\" (UID: \"e6d90470-e8bc-4b90-8261-f22045f5f10f\") " pod="openshift-marketplace/redhat-marketplace-rzzdn"
Mar 20 15:00:46 crc kubenswrapper[4973]: I0320 15:00:46.216556 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6d90470-e8bc-4b90-8261-f22045f5f10f-utilities\") pod \"redhat-marketplace-rzzdn\" (UID: \"e6d90470-e8bc-4b90-8261-f22045f5f10f\") " pod="openshift-marketplace/redhat-marketplace-rzzdn"
Mar 20 15:00:46 crc kubenswrapper[4973]: I0320 15:00:46.216828 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rts8x\" (UniqueName: \"kubernetes.io/projected/e6d90470-e8bc-4b90-8261-f22045f5f10f-kube-api-access-rts8x\") pod \"redhat-marketplace-rzzdn\" (UID: \"e6d90470-e8bc-4b90-8261-f22045f5f10f\") " pod="openshift-marketplace/redhat-marketplace-rzzdn"
Mar 20 15:00:46 crc kubenswrapper[4973]: I0320 15:00:46.216941 4973 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6d90470-e8bc-4b90-8261-f22045f5f10f-catalog-content\") pod \"redhat-marketplace-rzzdn\" (UID: \"e6d90470-e8bc-4b90-8261-f22045f5f10f\") " pod="openshift-marketplace/redhat-marketplace-rzzdn"
Mar 20 15:00:46 crc kubenswrapper[4973]: I0320 15:00:46.217089 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6d90470-e8bc-4b90-8261-f22045f5f10f-utilities\") pod \"redhat-marketplace-rzzdn\" (UID: \"e6d90470-e8bc-4b90-8261-f22045f5f10f\") " pod="openshift-marketplace/redhat-marketplace-rzzdn"
Mar 20 15:00:46 crc kubenswrapper[4973]: I0320 15:00:46.217240 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6d90470-e8bc-4b90-8261-f22045f5f10f-catalog-content\") pod \"redhat-marketplace-rzzdn\" (UID: \"e6d90470-e8bc-4b90-8261-f22045f5f10f\") " pod="openshift-marketplace/redhat-marketplace-rzzdn"
Mar 20 15:00:46 crc kubenswrapper[4973]: I0320 15:00:46.235384 4973 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rts8x\" (UniqueName: \"kubernetes.io/projected/e6d90470-e8bc-4b90-8261-f22045f5f10f-kube-api-access-rts8x\") pod \"redhat-marketplace-rzzdn\" (UID: \"e6d90470-e8bc-4b90-8261-f22045f5f10f\") " pod="openshift-marketplace/redhat-marketplace-rzzdn"
Mar 20 15:00:46 crc kubenswrapper[4973]: I0320 15:00:46.383158 4973 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rzzdn"
Mar 20 15:00:46 crc kubenswrapper[4973]: I0320 15:00:46.882512 4973 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rzzdn"]
Mar 20 15:00:47 crc kubenswrapper[4973]: I0320 15:00:47.589017 4973 generic.go:334] "Generic (PLEG): container finished" podID="e6d90470-e8bc-4b90-8261-f22045f5f10f" containerID="7dfb8877373139af6f018db419d5f3f613704636f03f667b681d1af782a7806b" exitCode=0
Mar 20 15:00:47 crc kubenswrapper[4973]: I0320 15:00:47.589202 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rzzdn" event={"ID":"e6d90470-e8bc-4b90-8261-f22045f5f10f","Type":"ContainerDied","Data":"7dfb8877373139af6f018db419d5f3f613704636f03f667b681d1af782a7806b"}
Mar 20 15:00:47 crc kubenswrapper[4973]: I0320 15:00:47.589376 4973 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rzzdn" event={"ID":"e6d90470-e8bc-4b90-8261-f22045f5f10f","Type":"ContainerStarted","Data":"d8f0f32abcbb0344e02f8fc52c3d0ae76069a1ef0484509efb3c72406beccfc4"}
Mar 20 15:00:47 crc kubenswrapper[4973]: I0320 15:00:47.951635 4973 scope.go:117] "RemoveContainer" containerID="ae987235ba54d62052117d63cefdb9b2cc4548bec68e9b044844fc0ac21a1742"
Mar 20 15:00:47 crc kubenswrapper[4973]: E0320 15:00:47.952403 4973 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-qlztx_openshift-machine-config-operator(70745a45-4eff-4e56-b9ab-efa4a7c83306)\"" pod="openshift-machine-config-operator/machine-config-daemon-qlztx" podUID="70745a45-4eff-4e56-b9ab-efa4a7c83306"